MIT Technology Review: This new data poisoning tool lets artists fight back against generative AI

“The tool, called Nightshade, is intended as a way to fight back against AI companies that use artists’ work to train their models without the creator’s permission. Using it to ‘poison’ this training data could damage future iterations of image-generating AI models, such as DALL-E, Midjourney, and Stable Diffusion, by rendering some of their outputs useless—dogs become cats, cars become cows, and so forth.”
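The excerpt describes the effect rather than the mechanism. For a rough intuition only, here is a deliberately crude sketch in plain Python/NumPy of the general idea behind training-data poisoning. This is not Nightshade's actual algorithm (which applies near-invisible perturbations to images); the 2-D "feature space", cluster centres, and toy centroid "model" below are all illustrative assumptions.

```python
# Conceptual sketch of training-data poisoning (NOT Nightshade's method):
# samples captioned "dog" carry cat-like features, so a toy model trained
# on the mix learns a "dog" representation that lands in the cat cluster.
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical 2-D feature space: each concept is a cluster centre.
CONCEPTS = {"dog": np.array([1.0, 0.0]), "cat": np.array([0.0, 1.0])}

def make_samples(caption, concept, n):
    """Return (caption, feature) pairs with features near `concept`."""
    return [(caption, CONCEPTS[concept] + 0.05 * rng.standard_normal(2))
            for _ in range(n)]

# Clean data: captions match the features they describe.
data = make_samples("dog", "dog", 100) + make_samples("cat", "cat", 100)

# Poisoned data: captioned "dog", but the features are cat-like.
data += make_samples("dog", "cat", 150)

def train(pairs):
    """Toy 'model': the mean feature vector seen for each caption."""
    feats = {}
    for caption, vec in pairs:
        feats.setdefault(caption, []).append(vec)
    return {c: np.mean(v, axis=0) for c, v in feats.items()}

def nearest_concept(vec):
    return min(CONCEPTS, key=lambda c: np.linalg.norm(vec - CONCEPTS[c]))

model = train(data)
# With enough poison, the learned "dog" centroid sits nearer to "cat",
# so a prompt for "dog" would produce cat-like output.
print(model["dog"], "->", nearest_concept(model["dog"]))
```

In this toy setup the poison has to outnumber the clean samples to flip the centroid; the point of a tool like Nightshade, per the article, is to corrupt concepts far more efficiently than that, with perturbations humans do not notice.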
