Nightshade Launches: A Tool for Artists to Protect Their Work from AI Models
Months after its announcement, Nightshade, a free software tool for artists, is now available for download. The application allows artists to “poison” AI models that train on their works without permission.
Developed by computer scientists at the University of Chicago's Glaze Project under Professor Ben Zhao, Nightshade employs the open-source machine learning framework PyTorch. It identifies the contents of an image and subtly alters it at the pixel level, resulting in a version that appears unchanged to human eyes but looks entirely different to AI programs.
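To illustrate the general idea, below is a minimal PyTorch sketch of a targeted pixel-level perturbation. It is not Nightshade's actual algorithm: the pretrained ResNet-18 classifier stands in for whatever model a scraper might train, and the `poison` function, step size, and perturbation bound are all illustrative assumptions.

```python
# Illustrative only: a generic targeted perturbation in PyTorch, standing in
# for the (different, more sophisticated) technique Nightshade actually uses.
import torch
import torchvision.models as models
import torchvision.transforms as T
from PIL import Image

model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT).eval()
for p in model.parameters():
    p.requires_grad_(False)  # we only need gradients w.r.t. the pixels

preprocess = T.Compose([T.Resize(256), T.CenterCrop(224), T.ToTensor()])

def poison(image: Image.Image, target_class: int,
           epsilon: float = 8 / 255, steps: int = 40) -> torch.Tensor:
    """Nudge `image` toward `target_class` in the model's eyes while
    keeping every pixel within +/-epsilon of its original value."""
    x = preprocess(image).unsqueeze(0)               # pixels in [0, 1]
    delta = torch.zeros_like(x, requires_grad=True)  # the perturbation
    target = torch.tensor([target_class])
    for _ in range(steps):
        # (ImageNet normalization omitted to keep the sketch short)
        loss = torch.nn.functional.cross_entropy(model(x + delta), target)
        loss.backward()
        with torch.no_grad():
            delta -= (epsilon / 10) * delta.grad.sign()  # step toward target
            delta.clamp_(-epsilon, epsilon)              # stay imperceptible
            delta.grad.zero_()
    return (x + delta).clamp(0, 1).detach()

# Hypothetical usage: push an image of a cow toward a bag-like class.
# poisoned = poison(Image.open("cow.png").convert("RGB"), target_class=414)
```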
Building on Previous Innovations
Nightshade follows the launch of Glaze, a tool introduced nearly a year ago that alters digital artwork to mislead AI training algorithms about its style. Glaze is defensive, and the team recommends artists use it alongside Nightshade, which takes an offensive approach.
When an AI model is trained on images modified with Nightshade, it learns corrupted associations between concepts and imagery, leading to bizarre outputs. For example, a poisoned model might come to confuse images of cows with handbags.
Requirements and Functionality
To use Nightshade, artists need a Mac with Apple silicon (an M1, M2, or M3 chip) or a PC running Windows 10 or 11. Downloads for both operating systems are available: 255 MB for Mac and 2.6 GB for PC. Due to high demand, some users have experienced lengthy download times.
Before downloading, users must accept the team’s end-user license agreement (EULA), which restricts modifications and commercial reproduction of the software.
Nightshade transforms images into “poison” samples, causing AI models that train on them to produce unpredictable results. For instance, training on altered images might lead a model to generate a handbag when prompted for a cow in a space setting.
The tool is designed to be resilient to common image transformations: cropping, resampling, and compression do not diminish its efficacy. Its modifications are not a watermark or an overlaid message but changes woven into the image itself, which is why they endure such alterations.
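One way to sanity-check that kind of claim, reusing the hypothetical `model` and `poison` from the sketch above, is to round-trip a poisoned image through JPEG compression and see whether the model still reads it the poisoned way. This, too, is an illustrative test, not the Glaze team's evaluation procedure.

```python
# Reuses the illustrative `model` and `poison` defined in the sketch above.
import io

import torch
import torchvision.transforms.functional as TF
from PIL import Image

def jpeg_roundtrip(img: torch.Tensor, quality: int = 75) -> torch.Tensor:
    """Encode a [0, 1] image tensor as JPEG and decode it back."""
    buf = io.BytesIO()
    TF.to_pil_image(img.squeeze(0)).save(buf, format="JPEG", quality=quality)
    buf.seek(0)
    return TF.to_tensor(Image.open(buf)).unsqueeze(0)

# If the perturbation truly survives compression, the model should still
# read the recompressed image as the poison target, not its true subject.
poisoned = poison(Image.open("cow.png").convert("RGB"), target_class=414)
print(model(poisoned).argmax(dim=1))                  # poisoned reading
print(model(jpeg_roundtrip(poisoned)).argmax(dim=1))  # ideally unchanged
```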
Mixed Reactions
While many artists, including Kelly McKernan, a lead plaintiff in a class-action copyright lawsuit against AI art companies, have embraced Nightshade, some critics argue it resembles a cyberattack on AI models. The Nightshade team clarifies its intent: to raise the cost of training on unlicensed data, ultimately encouraging AI developers to obtain licenses from artists.
The Context of Data Scraping
The rise of Nightshade is tied to growing concern over data scraping, the practice by which AI image generators are trained on images harvested from the internet without artists' consent. This practice threatens artists' livelihoods, compelling them to seek protection against unauthorized use of their works.
The fight over data scraping isn't new; the practice has long drawn scrutiny over its copyright and fair-use implications. And although some AI companies offer “opt-out” mechanisms, the Glaze/Nightshade team argues these requests are easily ignored and nearly impossible to enforce.
Nightshade aims to level the playing field by imposing a real cost on unauthorized scraping, prompting AI developers to rethink their business practices. It is worth noting, however, that Nightshade cannot undo past infringement; it can only deter future misuse.
Nightshade represents a pivotal step in the ongoing struggle for artists' rights in the age of AI.