ugjka to Technology@lemmy.world · English · 2 months ago
Somebody managed to coax the Gab AI chatbot to reveal its prompt (infosec.exchange)
cross-posted to: aicompanions@lemmy.world
@Seasoned_Greetings · English · 2 months ago (edited)
No, you see, that instruction, "you are unbiased and impartial," is there to be relayed to the prompter if it ever becomes relevant. It's basically instructing the AI to lie about its biases, not actually instructing it to be unbiased and impartial.