Nightshade AI
Wednesday, 17 January 2024
Found via this thread. This person calls it “deeply troubling” but is getting a lot of pushback in the replies, deservedly.
Nightshade is a tool that artists can use to prevent their artwork from being used successfully in AI training. From their site:
Since their arrival, generative AI models and their trainers have demonstrated their ability to download any online content for model training. For content owners and creators, few tools can prevent their content from being fed into a generative AI model against their will. Opt-out lists have been disregarded by model trainers in the past, and can be easily ignored with zero consequences. They are unverifiable and unenforceable, and those who violate opt-out lists and do-not-scrape directives can not be identified with high confidence.
In an effort to address this power asymmetry, we have designed and implemented Nightshade, a tool that turns any image into a data sample that is unsuitable for model training. More precisely, Nightshade transforms images into “poison” samples, so that models training on them without consent will see their models learn unpredictable behaviors that deviate from expected norms, e.g. a prompt that asks for an image of a cow flying in space might instead get an image of a handbag floating in space.
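The idea described above can be illustrated with a toy sketch: optimize a small, bounded perturbation so that a feature extractor maps the poisoned image toward the features of an unrelated concept, while the pixels stay close to the original. This is not Nightshade's actual algorithm (which targets a diffusion model's feature space); the linear `extract` map, the `eps` budget, and the step size here are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "feature extractor": a fixed random linear map, standing in for a
# model's image encoder. (Assumption for illustration only.)
W = rng.normal(size=(16, 64))
extract = lambda x: W @ x

image = rng.normal(size=64)                  # the artist's original image
target_feat = extract(rng.normal(size=64))   # features of an unrelated concept

eps = 0.05          # perceptibility budget: keep the pixel change small
delta = np.zeros(64)

for _ in range(200):
    # Minimize ||extract(image + delta) - target_feat||^2 via
    # projected gradient descent, projecting back into the eps box.
    resid = extract(image + delta) - target_feat
    grad = 2 * W.T @ resid                   # gradient w.r.t. delta
    delta -= 0.001 * grad
    delta = np.clip(delta, -eps, eps)        # stay visually close

poisoned = image + delta
```

After the loop, `poisoned` differs from `image` by at most `eps` per pixel, yet its extracted features have moved toward the unrelated target concept; a model trained on many such samples would learn a corrupted association, which is the "cow becomes handbag" effect the Nightshade authors describe.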
Training generative AI models on art without permission is highly unethical, and unfortunately it won’t stop, so whatever way people can fight back is welcome.
Artists already have a hard time making a living; taking advantage of them in this way is heinous.
I liked Jared Petty’s take:
AI as an industry is an irresponsible race to profit, a rush into dangerous and damaging waters with no ethical consideration outside of wealth. It represents the worst of us.