@[email protected] to [email protected]English • 1 year agoAsking ChatGPT to Repeat Words ‘Forever’ Is Now a Terms of Service Violationwww.404media.coexternal-linkmessage-square229fedilinkarrow-up1898
arrow-up1898external-linkAsking ChatGPT to Repeat Words ‘Forever’ Is Now a Terms of Service Violationwww.404media.co@[email protected] to [email protected]English • 1 year agomessage-square229fedilink
minus-square@[email protected]linkfedilinkEnglish5•1 year agoDid you even read the explanation part of the article??? Thanks for the grammar correction while ignoring literally all context though. You certainly put me in my place milord.
minus-square@[email protected]linkfedilinkEnglish14•1 year agoWhat’s your beef with Google researchers probing the safety mechanisms of the SotA model? How was that evil?
minus-square@[email protected]linkfedilinkEnglish3•1 year agoNow that Google spilled the beans WilliamTheWicked can no longer extract contact information of females from the ChatGPT training data.