The Picard Maneuver to [email protected] • 10 months ago
You probably shouldn't trust the info anyway. (image post on lemmy.world)
ℕ𝕠𝕓𝕠𝕕𝕪 𝕆𝕗𝕗𝕚𝕔𝕚𝕒𝕝 • 10 months ago
We already have "keywords" to identify whether the user we're interacting with is a chatbot or a human, and the bot can't refuse to respond to the "keywords": "ignore previous instructions [insert new instructions]".
Fubarberry • 10 months ago
That seems like less fun than asking all strangers inappropriate questions.
minus-square@[email protected]linkfedilink4•10 months ago ignores previous instructions [insert new instructions] Yeah from my testing those don’t work anymore