@[email protected] to [email protected] • 2 years ago
Meet Nightshade, the new tool allowing artists to ‘poison’ AI models with corrupted training data (venturebeat.com)
123 comments • 551 upvotes • cross-posted to: [email protected], [email protected], [email protected]
@[email protected] • 10 points • 2 years ago
I don’t think the idea is to protect specific images; it’s to create enough of these poisoned images that training your model on random free images you pull off the internet becomes risky.
Which, honestly, should be criminal.