Artists have a poisonous new weapon to fight artificial intelligence stealing their work. A new tool called Nightshade corrupts data ingested by AI, disrupting any new images generated by AI tools like Midjourney, Stable Diffusion and DALL-E.
Here, we explain what Nightshade is, the problems it’s trying to solve, the devious minds behind it — and when it will become available.
What is Nightshade?
Nightshade works by altering the pixels of an artist’s work to disrupt the output of any AI which scrapes the poisoned art. Changes are invisible to the naked eye, so the artist can post their pictures online to show off their work to other humans. But when scraped by AI, the altered pixels confuse machine learning algorithms.
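Nightshade's real perturbations come from a carefully optimised attack described in the team's paper, but the core idea — a per-pixel change small enough to be invisible to humans yet large enough to alter the data a scraper ingests — can be sketched in a few lines. This toy example uses random noise as a stand-in for the optimised perturbation; it is purely illustrative, not Nightshade's actual algorithm.

```python
import random

def perturb(pixels, epsilon=2, seed=0):
    """Apply a tiny, bounded change to each pixel value.

    A toy stand-in for a real poisoning attack, which would optimise
    the noise to mislead a model rather than sample it randomly.
    """
    rng = random.Random(seed)
    poisoned = []
    for value in pixels:
        noise = rng.randint(-epsilon, epsilon)
        # Keep values in the valid 0-255 range for 8-bit images
        poisoned.append(min(255, max(0, value + noise)))
    return poisoned

# A stand-in 8x8 greyscale "artwork", flattened to a list of pixels
art = [128] * 64
poisoned = perturb(art)

# Each pixel moves by at most epsilon -- far too little for the eye
# to notice -- but the data a scraper ingests is no longer the original.
max_change = max(abs(p - a) for p, a in zip(poisoned, art))
print(max_change)
```

The point of the bound (`epsilon`) is that the poisoned image remains visually identical for human viewers, which is why an artist can still post it online as a portfolio piece.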
Who created Nightshade?
A team at the University of Chicago led by Ben Zhao revealed Nightshade. The team showed off its testing results in a paper that proposes Nightshade as a defence for artists against AI scraping.
The Nightshade team also created Glaze, a tool that allows artists to mask their art style and confuse AI.
“Power asymmetry between AI companies and content owners is ridiculous,” the Glaze team tweeted when announcing Nightshade. “The only thing you can do to avoid being sucked into a model is 1) opt-out lists, and 2) do-not-scrape directives… None of these mechanisms are enforceable, or even verifiable.”
Why create a poison pixel at all?
Artists have been some of the most vocal critics of the current crop of generative artificial intelligence systems, whether that’s text-based systems like OpenAI’s ChatGPT or AI that spits out images and video based on prompts given by the user.
And the results are amazing. Just read our guides to getting the best AI art from Midjourney and the best AI art from Adobe Firefly in Photoshop.
However, these systems can only produce new pictures because they’ve been “trained” on artwork or photography fed into the system. All without asking the artists, photographers and creators who made that original work, and definitely without paying them.
The wind is changing. Companies using AI-generated imagery may be opening themselves up to costly copyright disputes, as artists, writers and creators take legal action against companies like OpenAI, Meta and Stability AI for scraping copyrighted material.
Poison pixels are another option for artists to avoid their work or their style being replicated by AI without consent or compensation.
How do poison pixels work?
The aim here is simple. When an AI trained on poisoned images is prompted to generate a picture, the result looks nothing like the real thing.
Take a model like Midjourney. It can instantly generate a realistic picture of a seashell, for example. But in the Nightshade team's tests, after 100 poisoned images were fed into the training data, the model's attempts started to look a little off.
According to that testing, the more poisoned images you feed in, the greater the effect. With 250 or 500 poisoned images introduced into the dataset, an AI prompted to produce a picture of a seashell could only spit out pixelated mush.
The adulterated data spreads through the model, so related terms are affected too. For example, an AI infected with altered dog images would also struggle when prompted to generate an image of a puppy or husky.
What kind of art does Nightshade work on?
Nightshade works on anything from real objects to art styles. That makes it harder for an AI user to create an image in the style of a specific artist, for example.
What do artists think about Nightshade?
As part of our research for this article, we asked artists what they thought about Nightshade. "As creators, Nightshade is one of the strongest tools in our arsenal," said digital illustrator Paloma McClain.
“It empowers artists and all image-makers to reclaim control over their own work. Nightshade grants us protection over our artistic voices while imposing consequences for those who attempt to scrape our work without permission.”
When will Nightshade be available?
Nightshade is yet to be released publicly. However, the University of Chicago team is considering building it into Glaze.
UPDATE: 2 November. We updated this article with a comment from digital illustrator Paloma McClain.