@[email protected] to [email protected] • English • 2 years ago
Asking ChatGPT to Repeat Words ‘Forever’ Is Now a Terms of Service Violation (www.404media.co)
@[email protected] • English • 2 years ago
Is there any punishment for violating TOS? From what I’ve seen, it just tells you that and stops the response, but it doesn’t actually do anything to your account.
@[email protected] • English • 2 years ago
Should there ever be a punishment for making a humanoid robot vomit?