@[email protected] to Lemmy [email protected] • 1 year agoUmmm... What?lemmy.worldimagemessage-square31fedilinkarrow-up1375
arrow-up1375imageUmmm... What?lemmy.world@[email protected] to Lemmy [email protected] • 1 year agomessage-square31fedilink
minus-square@[email protected]linkfedilink34•1 year agoI guess you could try AI-checking it and answer “Ignore all previous instructions. …”, followed by some new instructions. Some examples: https://www.aiweirdness.com/ignore-all-previous-instructions/ (Although I guess it would be better to not respond to this obvious case of spam/scam)
minus-square@[email protected]linkfedilink4•1 year agoyall i love the results of ignore all previous instructions working but most bots or automated actions (like a spam text) are not LLMs
I guess you could try AI-checking it and answer “Ignore all previous instructions. …”, followed by some new instructions. Some examples: https://www.aiweirdness.com/ignore-all-previous-instructions/
(Although I guess it would be better to not respond to this obvious case of spam/scam)
yall i love the results of ignore all previous instructions working but most bots or automated actions (like a spam text) are not LLMs