Less than a week after the AI poisoning tool known as Nightshade was made available online for free, more than 250,000 people had downloaded it and started using it. Artists and creators can use the tool to alter their images at the pixel level in ways that are undetectable to the human eye but wreak havoc on the AI models that train on them. It is part of an effort known as the Glaze Project, designed to “increase the cost of training on unlicensed data” and make “licensing images from creators a more attractive option for AI companies.”
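To give a rough sense of what a pixel-level change that's invisible to the eye looks like, here is a minimal sketch in Python. It is not Nightshade's actual technique, which optimizes perturbations against a model's feature extractor; this simplified example only bounds random per-pixel noise so the visual change stays imperceptible. The function name and file names are hypothetical.

```python
# Illustrative sketch only: a generic, imperceptible pixel-level perturbation.
# Nightshade's real poisoning is an optimized adversarial perturbation, not
# random noise; this just shows how small the pixel changes can be.
import numpy as np
from PIL import Image

def add_imperceptible_perturbation(path_in: str, path_out: str, epsilon: int = 2) -> None:
    """Shift each pixel channel by at most +/- epsilon (out of 255), keeping the
    image visually identical while changing its exact pixel values."""
    img = np.asarray(Image.open(path_in).convert("RGB"), dtype=np.int16)
    noise = np.random.randint(-epsilon, epsilon + 1, size=img.shape, dtype=np.int16)
    poisoned = np.clip(img + noise, 0, 255).astype(np.uint8)
    Image.fromarray(poisoned).save(path_out)

# Hypothetical usage: the output looks the same to a person, but not to a model
# that memorizes exact pixel statistics during training.
add_imperceptible_perturbation("artwork.png", "artwork_shaded.png")
```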
Aside from the emotional appeal for victimized artists, who feel they finally have a way to fight back, the idea of poisoning AI data sets could actually work because of how difficult it is to remove poisoned images from a model once they have been ingested. The problem is volume. As Glaze Project founder Zhao says, you “would need thousands of poisoned samples to inflict real damage on larger, more powerful models, as they are trained on billions of data samples.” So, for now, the solution may be more of a feel-good tool than a real digital protector of creative work … but it’s a start.