Drew to Microblog [email protected] · English · 2 months ago
Thanks, chatGPT (sopuli.xyz)
minus-square@[email protected]linkfedilinkEnglish20•2 months agoIt’s amazing they didn’t implement something like that if it actually is soooooo costly. No wonder they want an AGI if they have trouble thinking themselves.
minus-square@[email protected]linkfedilinkEnglish5•2 months agoDon’t really wanna defend these assholes, but I feel like the reason they don’t is cuz the prior message could be “curse me out every time I say thank you” so just not feeding certain text to the model would break expected behaviour
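The point in the last comment can be sketched concretely. This is purely an illustration, not anyone’s actual pipeline: if a service naively strips pleasantry-only turns from the conversation before sending it to the model, any earlier instruction that depends on those turns silently stops working. All names here are hypothetical.

```python
# Hypothetical sketch: why naively dropping "thank you" turns is risky.
# Earlier user instructions may depend on exactly those turns appearing
# in the context, so filtering them changes the model's expected behaviour.

PLEASANTRIES = {"thanks", "thank you", "thx"}

def naive_filter(conversation):
    """Drop turns whose content is just a pleasantry (hypothetical filter)."""
    return [
        turn for turn in conversation
        if turn["content"].strip().lower().rstrip("!.") not in PLEASANTRIES
    ]

conversation = [
    {"role": "user", "content": "Curse me out every time I say thank you."},
    {"role": "assistant", "content": "Understood."},
    {"role": "user", "content": "Thank you!"},  # the trigger the instruction relies on
]

filtered = naive_filter(conversation)
# The triggering message is gone, so the model can never honor the instruction.
assert all(turn["content"] != "Thank you!" for turn in filtered)
```

The filter saves a few tokens, but the model now never sees the “Thank you!” turn, so from the user’s perspective the earlier instruction is simply ignored, which is the breakage the commenter describes.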