AI Poisoning Tool Nightshade Surpasses 250,000 Downloads in Just 5 Days: 'Exceeding Our Wildest Expectations'

Nightshade: A Revolutionary Tool for Artists

Nightshade, a newly released, free downloadable tool developed by researchers at the University of Chicago, lets artists protect their artwork from being used to train AI models without permission. In the five days following its release, it was downloaded 250,000 times.

Ben Zhao, the project lead and computer science professor, shared in an email, “Nightshade hit 250K downloads in 5 days since release. I anticipated a high level of enthusiasm, but this response exceeded our expectations.”

This strong debut signals significant interest among artists in safeguarding their work. The Bureau of Labor Statistics counts more than 2.67 million artists in the U.S. alone, and Zhao believes the tool’s user base likely extends well beyond the country.

“We have not performed geolocation lookups for these downloads,” Zhao noted. “Social media reactions suggest that downloads are coming from all corners of the globe.”

How Nightshade Works and Its Popularity

Nightshade works by “poisoning” generative AI image models through alterations made to artwork before it is shared online. This pixel-level manipulation, which the team calls “shading,” makes an image register as something entirely different to machine learning algorithms: a purse, for example, may read to a model as a cow. Once trained on a few shaded examples, an AI model begins generating incorrect imagery in response to user prompts.
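As a rough, purely conceptual illustration of what a pixel-level perturbation looks like in practice, the Python sketch below adds a small, bounded amount of noise to an image file. It is not Nightshade’s actual method, which relies on targeted optimization against generative models rather than random noise, and the file names and function shown here are hypothetical.

```python
# Toy illustration only: a small, bounded random perturbation applied to an image.
# The real Nightshade tool uses a far more sophisticated, targeted optimization;
# nothing below comes from the actual Nightshade codebase.
import numpy as np
from PIL import Image


def shade_image(path_in: str, path_out: str, epsilon: int = 8, seed: int = 0) -> None:
    """Add a perturbation of at most +/- epsilon per channel to each pixel."""
    rng = np.random.default_rng(seed)
    img = np.asarray(Image.open(path_in).convert("RGB"), dtype=np.int16)

    # Bounded noise keeps the change subtle to human eyes while still
    # altering the raw pixel values a model would train on.
    noise = rng.integers(-epsilon, epsilon + 1, size=img.shape, dtype=np.int16)
    shaded = np.clip(img + noise, 0, 255).astype(np.uint8)

    Image.fromarray(shaded).save(path_out)


if __name__ == "__main__":
    # Hypothetical file names for demonstration.
    shade_image("artwork.png", "artwork_shaded.png")
```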

On the Nightshade project page, Zhao and his team—Shawn Shan, Wenxin Ding, Josephine Passananti, and Heather Zheng—explain their mission: to “increase the cost of training on unlicensed data, making licensing images from creators a viable alternative.”

Demand for Nightshade was so heavy after its Jan. 18, 2024, release that the University of Chicago’s servers struggled to keep up, prompting the team to add mirror links for downloads.

In addition, the team’s earlier tool, Glaze, which uses subtle pixel alterations to prevent AI models from learning an artist’s unique style, has been downloaded 2.2 million times since its April 2023 release.

Future Developments from The Glaze Project

Operating under the umbrella of The Glaze Project, Zhao and his colleagues have announced plans to develop a tool that merges the defensive capabilities of Glaze with the offensive features of Nightshade. This combined tool is expected to take at least a month to release, as careful testing is essential.

“We have many tasks to complete,” Zhao stated. “We need to ensure the combined version is thoroughly tested to avoid surprises. I expect it to take at least a month, possibly longer, to complete comprehensive testing.”

The Glaze Project researchers recommend that artists apply Glaze before Nightshade, safeguarding their unique style while also disrupting AI model training. Artists have embraced this two-step approach despite the extra effort it involves.

“We advised users that we have not fully tested how the two tools work together, and that they should wait rather than use Nightshade alone,” Zhao explained. “The artist community responded, ‘We will use Nightshade and Glaze in two steps, even if it takes more time and visibly alters the art.’”

An open-source version of Nightshade may also be on the way. “We will likely release an open-source version in the future; however, it requires additional time to produce,” Zhao said.

Zhao also said the team has yet to hear from the companies behind AI image-generating technologies such as OpenAI (DALL-E 3), Midjourney, and Stability AI (Stable Diffusion), despite the widespread use of these tools in content creation.
