readbeanicecream to technology · English · 7 months ago
Researchers jailbreak AI chatbots with ASCII art -- ArtPrompt bypasses safety measures to unlock malicious queries
www.tomshardware.com
cross-posted to: technology@lemmy.world