CredBit.com
New data poisoning tool would punish AI for scraping art without permission

October 23, 2023 · 2 Mins Read

Researchers at the University of Chicago have developed a tool that gives artists the ability to “poison” their digital art in order to stop developers from training artificial intelligence (AI) systems on their work. 

Named “Nightshade” after the family of plants, some of which are known for their poisonous berries, the tool modifies images in such a way that their inclusion contaminates the data sets used to train AI with incorrect information.

According to a report from MIT Technology Review, Nightshade changes the pixels of a digital image in order to trick an AI system into misinterpreting it. As an example, Technology Review mentions convincing the AI that an image of a cat is a dog, and vice versa.

In doing so, the AI’s ability to generate accurate and coherent outputs would theoretically be damaged. Using the above example, if a user requested an image of a “cat” from the tainted AI, they might instead get a dog labeled as a cat, or an amalgamation of all the “cats” in the AI’s training set, including images of dogs that had been modified by the Nightshade tool.
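The mechanism described above can be illustrated with a toy numeric sketch. This is not Nightshade’s actual algorithm; the feature vectors, cluster centers, and the trivial “generative model” below are all invented for illustration. The idea is only that training examples labeled “cat” whose features secretly resemble dogs drag the model’s notion of “cat” toward dog space:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical 2-D "feature embeddings" standing in for images.
CAT_CENTER = np.array([0.0, 0.0])
DOG_CENTER = np.array([4.0, 4.0])

clean_cats = rng.normal(CAT_CENTER, 0.3, size=(100, 2))

def generate_cat(training_cats):
    """Toy 'generative model': outputs the mean of everything labeled cat."""
    return training_cats.mean(axis=0)

def looks_like(x):
    """Report which class center a generated output is nearest to."""
    d_cat = np.linalg.norm(x - CAT_CENTER)
    d_dog = np.linalg.norm(x - DOG_CENTER)
    return "cat" if d_cat < d_dog else "dog"

# Trained only on clean data, a requested "cat" looks like a cat.
assert looks_like(generate_cat(clean_cats)) == "cat"

# Nightshade-style poison (conceptual): images still *labeled* "cat",
# but perturbed so their features resemble dogs.
poison = rng.normal(DOG_CENTER, 0.3, size=(300, 2))
tainted = np.vstack([clean_cats, poison])

print(looks_like(generate_cat(tainted)))  # prints "dog"
```

With enough poisoned samples, the mean of the “cat” training set lands closer to the dog cluster, so asking the tainted model for a cat produces a dog-like output, which is the failure mode the researchers describe, compressed into a few lines of arithmetic.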

Related: Universal Music Group enters partnership to protect artists’ rights against AI violations

One expert who viewed the work, Vitaly Shmatikov, a professor at Cornell University, opined that researchers “don’t yet know of robust defenses against these attacks” — the implication being that even robust models such as OpenAI’s ChatGPT could be at risk.

The research team behind Nightshade is led by Ben Zhao, a professor at the University of Chicago. The new tool is actually an expansion of their existing artist protection software called Glaze. In their previous work, they designed a method by which an artist could obfuscate, or “glaze,” the style of their artwork.

A charcoal portrait, for example, could be glazed so that it appears to an AI system as modern art.

Examples of non-glazed and glazed AI art imitations. Image source: Shan et al., 2023.

Per Technology Review, Nightshade will ultimately be implemented into Glaze, which is currently available free for web use or download.