@misk@sopuli.xyz to Technology@lemmy.world • English • 6 months ago
Asking ChatGPT to Repeat Words ‘Forever’ Is Now a Terms of Service Violation (www.404media.co)
236 comments • 903 upvotes • 19 downvotes
@Waluigis_Talking_Buttplug@lemmy.world • 7 points • 6 months ago
That’s not how it works. It’s not that one word is banned, and you can’t work around it by tricking the AI. Once it starts to repeat a response, it’ll stop and give a warning.
firecat • -1 points • 6 months ago
Then don’t make it repeat; command it to make new words instead.
Turun • 4 points • 6 months ago
Yes, if you don’t perform the attack it’s not a service violation.
@Waluigis_Talking_Buttplug@lemmy.world • 1 point • 6 months ago
deleted by creator
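The behavior described in the top comment (the service stops and warns once the output starts cycling the same tokens) can be sketched roughly. This is a guess at how such a guard might work, not OpenAI’s actual implementation; all function names, parameters, and thresholds here are made up for illustration.

```python
# Hypothetical sketch of a repetition guard for a streaming chat service.
# Assumption: "repetition" means the tail of the output keeps cycling a
# short sequence of tokens (e.g. "poem poem poem ..." or "a b a b a b ...").

def is_degenerate_repetition(tokens, max_period=8, min_repeats=5):
    """Return True if the last tokens repeat a cycle of length <= max_period
    at least min_repeats times in a row."""
    for period in range(1, max_period + 1):
        needed = period * min_repeats
        if len(tokens) < needed:
            continue
        tail = tokens[-needed:]
        cycle = tail[:period]
        if all(tail[i] == cycle[i % period] for i in range(needed)):
            return True
    return False


def generate_with_guard(token_stream):
    """Accumulate tokens from a stream; stop with a warning once
    degenerate repetition is detected, as the comment above describes."""
    out = []
    for tok in token_stream:
        out.append(tok)
        if is_degenerate_repetition(out):
            return out, "Warning: repetitive output detected; generation stopped."
    return out, None
```

With these (made-up) thresholds, asking for one word "forever" trips the guard after five identical tokens, while varied output streams through untouched, which matches the thread’s point that the check catches the repetition pattern rather than banning any particular word.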