🚀 “Nightshade”: A Tool to Protect Digital Art from AI Training
posted 24 Oct 2023
Researchers at the University of Chicago have unveiled a novel tool that empowers artists to “poison” their digital art, thwarting developers from using their work to train AI systems.
Named after the poisonous family of plants, Nightshade works by altering digital images in ways that introduce inaccuracies into the datasets used for AI training.
It essentially misleads AI systems by modifying pixels in a manner that can make them interpret a cat as a dog, or vice versa, as reported by MIT Technology Review.
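To illustrate the general idea of a data-poisoning attack (this is a toy sketch, not Nightshade's actual algorithm; the function name and parameters here are invented for illustration), a poisoned training pair perturbs pixel values only slightly, so the image still looks unchanged to a human, while carrying a mismatched label:

```python
# Toy illustration of training-data poisoning (not Nightshade's method):
# a near-invisible pixel perturbation is paired with the wrong label, so a
# model trained on many such pairs learns a skewed association.
import numpy as np

def poison_sample(image, target_label, epsilon=0.03, seed=0):
    """Return a slightly perturbed copy of `image` paired with `target_label`.

    `epsilon` bounds the per-pixel change, keeping the poisoned image
    visually near-identical to the original.
    """
    rng = np.random.default_rng(seed)
    noise = rng.uniform(-epsilon, epsilon, size=image.shape)
    poisoned = np.clip(image + noise, 0.0, 1.0)
    return poisoned, target_label

# Stand-in for a cat photo, normalized to [0, 1]
image = np.full((8, 8, 3), 0.5)
poisoned, label = poison_sample(image, target_label="dog")

print(float(np.abs(poisoned - image).max()))  # change stays within epsilon
print(label)                                  # labeled "dog" despite cat pixels
```

A real attack like Nightshade is far more sophisticated, optimizing perturbations so they survive model training pipelines, but the core trade-off is the same: changes small enough to be invisible to people, yet consistent enough across many samples to corrupt what the model learns.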
“I’m just really grateful that we have a tool that can help return the power back to the artists for their own work,” one artist said of the tool.