
AI poisoning tool Nightshade received 250,000 downloads in 5 days: ‘beyond anything we imagined’

[Image: smoke emanating from a PC tower and monitor in a drab office. Credit: VentureBeat made with OpenAI DALL-E 3 via ChatGPT Plus]



Nightshade, a free downloadable tool created by computer science researchers at the University of Chicago to help artists disrupt AI models that scrape and train on their artworks without consent, was downloaded 250,000 times in the first five days after its release.

“Nightshade hit 250K downloads in 5 days since release,” wrote the leader of the project, Ben Zhao, a professor of computer science, in an email to VentureBeat, later adding, “I expected it to be extremely high enthusiasm. But I still underestimated it…The response is simply beyond anything we imagined.”

It’s a strong start for the free tool and shows a robust appetite among some artists to protect their work from being used to train AI without consent. According to the Bureau of Labor Statistics, there are more than 2.67 million artists in the U.S. alone, but Zhao told VentureBeat that Nightshade’s user base likely extends well beyond them.

“We have not done geolocation lookups for these downloads,” Zhao wrote. “Based on reactions on social media, the downloads come from all over the globe.”


Nightshade seeks to “poison” generative AI image models by altering artworks posted to the web, or “shading” them on a pixel level, so that they appear to a machine learning (ML) algorithm to contain entirely different content — a purse instead of a cow, let’s say. Trained on a few “shaded” images scraped from the web, an AI algorithm can begin to generate erroneous imagery from what a user prompts or asks.
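Nightshade’s actual optimization targets the feature spaces of diffusion models and is far more sophisticated, but the general idea described above can be illustrated with a toy sketch: apply a small, bounded perturbation to an image’s pixels so that a feature extractor “sees” it as a different concept. Everything here is hypothetical for illustration — the linear `extractor`, the function names, and the budget values are not part of the real tool.

```python
import numpy as np

def shade(image, target_feat, extractor, budget=0.05, steps=200, lr=0.01):
    """Toy 'shading': nudge image pixels so the extractor's features move
    toward target_feat, while an L-infinity budget keeps the change subtle."""
    delta = np.zeros_like(image)
    for _ in range(steps):
        feat = extractor @ (image + delta).ravel()
        # gradient of the squared feature distance with respect to the pixels
        grad = 2.0 * extractor.T @ (feat - target_feat)
        delta -= lr * grad.reshape(image.shape)
        delta = np.clip(delta, -budget, budget)  # keep perturbation imperceptible
    return np.clip(image + delta, 0.0, 1.0)

rng = np.random.default_rng(0)
img = rng.random((8, 8))                          # stand-in for an artwork ("cow")
extractor = rng.standard_normal((16, 64)) / 8.0   # toy linear feature extractor
target = extractor @ rng.random(64)               # features of another concept ("purse")

shaded = shade(img, target, extractor)
before = np.linalg.norm(extractor @ img.ravel() - target)
after = np.linalg.norm(extractor @ shaded.ravel() - target)
```

After optimization, `after` is smaller than `before`: to this (made-up) extractor, the shaded image looks more like the target concept, even though no pixel moved by more than the budget. A real model trained on enough such poisoned images would begin associating the wrong features with the prompt.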

On the Nightshade project page, Zhao and his colleagues — Shawn Shan, Wenxin Ding, Josephine Passananti, and Heather Zheng — stated they developed and released the tool to “increase the cost of training on unlicensed data, such that licensing images from their creators becomes a viable alternative.”

Shortly after Nightshade’s release on Jan. 18, 2024, the demand for concurrent downloads was so overwhelming to the University of Chicago’s web servers that the creators had to add mirror links where people could download copies from another location on the cloud.

Meanwhile, the team’s earlier tool — Glaze, which works to prevent AI models from learning an artist’s signature “style” by subtly altering pixels so they appear to be something else to machine learning algorithms — has received 2.2 million downloads since it was released in April 2023, according to Zhao.

What’s next for the Glaze/Nightshade team?

Operating under their umbrella name, The Glaze Project, Zhao and his fellow researchers had previously stated their intention to release a tool combining both Glaze (defensive) and Nightshade (offensive).

As for when this is coming, it will be at least a month away.

“We simply have a lot of to dos on our list right now,” Zhao wrote. “The combined version must be carefully tested to ensure we don’t have surprises later. So I imagine at least a month, maybe more, for us to get comprehensive tests done.”

At the same time, The Glaze Project researchers have advocated that artists use Glaze first, then Nightshade, to protect their style while disrupting AI model training, and they have been heartened to see artists doing just that, even though using two separate programs is more cumbersome.

“We warned people that we have not done full tests to understand how it works together with Glaze and that folks should wait before releasing any images with only Nightshade,” Zhao explained. “The artist community’s response was to say, ‘We will just Nightshade and Glaze in two steps, even though it takes more time and has more visible impact on the art.'”

An open-source version of Nightshade may be in the cards as well. “We will likely do an open source version at some point,” Zhao stated. “Just more time required to put out different versions.”

The project leader noted that he and his colleagues had not heard, nor did they expect to hear, directly from the makers of AI image-generating models, such as OpenAI (DALL-E 3), Midjourney, Stability AI (Stable Diffusion) and others. VentureBeat uses some of these tools and others to create article imagery and other content.