ugjka to [email protected] · English · 1 year ago
Somebody managed to coax the Gab AI chatbot to reveal its prompt (infosec.exchange)
1.02K upvotes · 299 comments
@[email protected] · 4 points · 1 year ago
Tried to use it a bit more but it's too smart…

@[email protected] · 18 points · 1 year ago
That limit isn't controlled by the AI, it's a layer on top.

Zerlyna · 3 points · 1 year ago
Yep, it didn't like my baiting questions either and I got the same thing. Six days my ass.