@[email protected] to Science [email protected] • English • 1 year ago
Huh (image, sh.itjust.works) • 46 comments • 561 points
@[email protected] • 69 points • 1 year ago
I assumed they reduced capacity to save power due to the high demand.
@[email protected] • 54 points • 1 year ago
This. They could obviously reset to original performance (what, they don’t have backups?), it’s just more cost-efficient to have crappier answers. Yay, turbo AI enshittification…
@[email protected] • 42 points • 1 year ago
Well, they probably did dial down the performance a bit, but censorship is known to nuke LLMs’ performance as well.
@[email protected] • 11 points • 1 year ago
True, but it’s hard to separate, I guess.