@[email protected] to [email protected] • English • 1 year ago
Asking ChatGPT to Repeat Words ‘Forever’ Is Now a Terms of Service Violation (www.404media.co)
898 points • 229 comments
minus-square@[email protected]linkfedilinkEnglish6•1 year agoThat’s not how it works, it’s not one word that’s banned and you can’t work around it by tricking the AI. Once it starts to repeat a response, it’ll stop and give a warning.
minus-squarefirecatlinkfedilink2•1 year agoThen don’t make it repeated and command it to make new words.
minus-squareTurunlinkfedilinkEnglish5•1 year agoYes, if you don’t perform the attack it’s not a service violation.
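For illustration, here is a minimal sketch of the kind of repetition guard the first commenter describes. This is hypothetical, not OpenAI’s actual safeguard; the whitespace tokenization, window size, and repeat threshold are all assumptions.

```python
# Hypothetical sketch of a repetition guard like the one described above.
# Not OpenAI's implementation; the tokenizer, window, and threshold are assumptions.

def looks_like_repetition(text: str, window: int = 20, threshold: int = 5) -> bool:
    """Return True if the last `window` tokens repeat `threshold` times in a row."""
    tokens = text.split()
    if len(tokens) < window * threshold:
        return False
    tail = tokens[-window * threshold:]
    first = tail[:window]
    # True only if the tail is just `first` repeated `threshold` times.
    return all(tail[i * window:(i + 1) * window] == first for i in range(threshold))

# Example: a degenerate "poem poem poem ..." stream trips the guard,
# at which point a service could stop generating and show a warning.
stream = "poem " * 200
if looks_like_repetition(stream):
    print("Response stopped: repetitive output detected.")
```

A check like this runs on the output stream rather than the prompt, which matches the comment above: no single word is banned, and rephrasing the request doesn’t help, because the guard fires on the repetitive response itself.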