@[email protected] to [email protected] • 2 years agoMeet Nightshade, the new tool allowing artists to ‘poison’ AI models with corrupted training dataventurebeat.comexternal-linkmessage-square123fedilinkarrow-up1551cross-posted to: [email protected][email protected][email protected]
minus-square@[email protected]linkfedilink3•2 years agoI understand where you are coming, but most AI models are trained without the consent of those who’s work is being used. Same with Github Copilot, it’s training violated the licensing terms of various software licenses.
minus-square@[email protected]linkfedilink1•2 years agoThen the response to that is laws not vigilantism
minus-square@[email protected]linkfedilink1•2 years agoI agree, but those laws need to be enforced and there is no one doing it.