The Nightshade tool will make it easier for artists to stop generative AI from stealing their work by embedding a built-in ‘poison’ that corrupts any model trained on the images. (Source: Adobe Stock)

Nightshade Program Adds Poison Pill to Art to Stop Generators from Plagiarizing

According to a story on gizmodo.com, a new program called Nightshade, designed to deter art theft online, may help artists prevent generative AI engines from plagiarizing their work.

“Ben Zhao, a professor of computer science at the University of Chicago and an outspoken critic of AI’s data scraping practices, told MIT Technology Review he and his team’s new tool, dubbed ‘Nightshade,’ does what it says on the tin—poisoning any model that uses images to train AI. So far, artists’ only option to combat AI companies was to sue them, or hope developers abide by an artist’s own opt-out requests.”

Zhao led the team that created Glaze, a tool that can apply a kind of “style cloak” to mask artists’ images. Nightshade is expected to be integrated into Glaze. The program lets users introduce manipulations at the pixel level that distort how AI image generators like Stable Diffusion and SDXL interpret the images they scrape.
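The Gizmodo story doesn’t spell out Nightshade’s algorithm, but the idea behind “manipulation at the pixel level” can be sketched in a few lines of code. The toy Python snippet below merely adds a faint, nearly invisible offset to every pixel before an artwork is shared online; Nightshade’s real perturbations are instead carefully optimized to mislead model training, and the file names and strength value here are purely hypothetical.

```python
# Purely illustrative sketch of pixel-level manipulation -- NOT Nightshade's
# actual method. It adds a faint pseudo-random offset to each pixel so the
# image changes in ways a human viewer is unlikely to notice.
import numpy as np
from PIL import Image

def perturb_pixels(path_in: str, path_out: str, strength: float = 2.0) -> None:
    """Add a small per-pixel offset (barely visible) and save the result."""
    img = np.asarray(Image.open(path_in).convert("RGB"), dtype=np.float32)
    noise = np.random.default_rng(0).uniform(-strength, strength, size=img.shape)
    perturbed = np.clip(img + noise, 0, 255).astype(np.uint8)
    Image.fromarray(perturbed).save(path_out)

perturb_pixels("artwork.png", "artwork_cloaked.png")  # hypothetical file names
```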

“After the team introduced poisoned data samples into a version of SDXL, the model would start to interpret a prompt for ‘car’ as ‘cow’ instead. A dog was interpreted as a cat, while a hat was turned into a cake. Similarly, different styles came out all wonky. Prompts for a ‘cartoon’ offered art reminiscent of the 19th-century impressionists.

“It also worked to defend individual artists. If you ask SDXL to create a painting in the style of renowned Sci-Fi and fantasy artist Michael Whelan, the poisoned model creates something far less akin to their work.”

While efforts to watermark and “immunize” images have been underway for the past few years, the Nightshade tool appears to be the most promising way to keep generative art engines from profiting when they scrape new art. Unfortunately, art already scraped by engines like DALL-E 3 and SDXL won’t be affected, since those images exist “unpoisoned” in the AI generators’ training data.

read more at gizmodo.com