The Non-Obvious Insights Blog

Dedicated To Helping Readers
Be More Interesting
Since 2004.


Is Poisoning AI The Best Way for Artists to Fight Back?

Less than a week after the AI poisoning tool known as Nightshade was made available online for free, more than 250,000 people had downloaded and started to use it. Artists and creators can use the tool to tag their images at the pixel level with changes that are undetectable to the human eye but wreak havoc on AI learning models. It is part of an effort known as the Glaze Project, designed to “increase the cost of training on unlicensed data and making licensing images from creators a more attractive option for AI companies.”

Aside from the emotional appeal for victimized artists who feel they finally have a way to fight back, the idea of poisoning AI data sets could actually work because of how difficult it is to remove poisoned images from a model once they are ingested. The problem is volume. As Glaze Project founder Ben Zhao says, you “would need thousands of poisoned samples to inflict real damage on larger, more powerful models, as they are trained on billions of data samples.” So, for now, the solution may be more of a feel-good tool than a real digital protector of creative work … but it’s a start. 
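To make the core idea concrete, here is a toy sketch of an "imperceptible pixel-level change." It only adds tiny bounded random noise to pixel values; Nightshade's actual technique computes optimized, model-targeted perturbations, so the `poison_pixels` function below is a hypothetical illustration, not the tool's real algorithm.

```python
import random

def poison_pixels(pixels, strength=2, seed=0):
    """Nudge each 0-255 pixel value by at most `strength`.

    Toy illustration only: a change of +/-2 out of 255 is invisible
    to the human eye, which is the property poisoning tools rely on.
    Real tools compute targeted perturbations, not random noise.
    """
    rng = random.Random(seed)
    return [max(0, min(255, p + rng.randint(-strength, strength)))
            for p in pixels]

# A flat 4x4 grayscale patch, all mid-gray.
original = [128] * 16
poisoned = poison_pixels(original)

# The largest per-pixel change stays within the imperceptible bound.
max_change = max(abs(a - b) for a, b in zip(original, poisoned))
```

The point of the sketch is the constraint, not the noise: whatever signal a poisoning tool embeds must stay inside a tiny per-pixel budget so the image looks unchanged to people while still skewing what a model learns from it.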

The Non-Obvious Insights Newsletter

Skip the obvious and anticipate the future with our weekly newsletter. Join over 25,000 subscribers and start receiving the stories (and insights) you’ve been missing.



#1 WSJ & USA Today Bestselling Author

Rohit is the author of 9 books on trends, the future of business, building a more human brand through storytelling, and creating a more diverse and inclusive world.

Contact

Have a Question or Inquiry?

Just fill out this form, and we’ll get back to you within 24 hours!
