"It's a watershed moment": New tool 'poisons' your art to protect it from AI
Nightshade is designed as an "offensive" tool.
It's impossible to write about AI without also mentioning the accompanying issues surrounding copyright and ethics. With generative models such as Midjourney coming under fire for allegedly being trained on artists' work without their consent, the tech is proving wildly divisive. But if you're worried about your own work informing what AI might spit out in future, help may be at hand.
A free tool from The Glaze Project is designed to help artists "poison" AI models by adding tiny pixel-level changes to their artwork. These changes can be imperceptible to the human eye, but are apparently enough to throw off a generative AI model.
Nightshade is designed to turn "any image into a data sample that is unsuitable for model training. More precisely, Nightshade transforms images into 'poison' samples, so that models training on them without consent will see their models learn unpredictable behaviours that deviate from expected norms, e.g. a prompt that asks for an image of a cow flying in space might instead get an image of a handbag floating in space."
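Nightshade's own poisoning relies on carefully optimised perturbations aimed at how models interpret images, and the project doesn't spell out that process here. Purely to illustrate what a "tiny pixel-level change" means in practice, here is a minimal, hypothetical Python sketch that nudges every pixel by a small, bounded amount; the filenames and the simple random noise are assumptions for the example and are not Nightshade's actual technique.

```python
# Hypothetical illustration only: add a small, bounded random perturbation to
# an image's pixels. This is NOT Nightshade's method; it simply shows the
# idea of pixel-level changes that are hard to spot by eye.
import numpy as np
from PIL import Image

def perturb(path_in: str, path_out: str, strength: int = 2) -> None:
    """Add noise in the range [-strength, strength] to every RGB pixel."""
    img = np.asarray(Image.open(path_in).convert("RGB")).astype(np.int16)
    noise = np.random.randint(-strength, strength + 1, size=img.shape)
    out = np.clip(img + noise, 0, 255).astype(np.uint8)
    Image.fromarray(out).save(path_out)

if __name__ == "__main__":
    # Assumed example filenames
    perturb("artwork.png", "artwork_perturbed.png")
```

At a strength of one or two levels per colour channel, the output looks essentially identical to the original to a human viewer, which is the property the Nightshade team describes, though their perturbations are targeted rather than random.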
Whereas the project's previous tool, Glaze, is designed as a defence against style mimicry, Nightshade is an "offensive" tool specifically designed to impede the effectiveness of the AI models themselves. And it's already proving a hit online. "Nightshade is the first *truly* GOOD news since the start of the rise of generative AI," one X user comments. "For the first time, this means consequences."
🌱☠️ After years of writing about respect for artists, seeing a tool like this is beyond remarkable to me; it's a watershed moment. https://t.co/t4h4I10DpK (January 20, 2024)
But the Nightshade team emphasise that the tool still has limitations. "Changes made by Nightshade are more visible on art with flat colours and smooth backgrounds. Because Nightshade is about disrupting models, lower levels of intensity/poison do not have negative consequences for the image owner. Thus we have included a low intensity setting for those interested in prioritizing the visual quality of the original image."
The Nightshade user guide provides step-by-step instructions for downloading Nightshade on Windows or Mac (users of the latter will need an Apple silicon 'M' chip machine). For more on why AI is causing controversy in the art community, here's how the tech is impacting designers.
Daniel John is Design Editor at Creative Bloq. He reports on the worlds of design, branding and lifestyle tech, and has covered several industry events including Milan Design Week, OFFF Barcelona and Adobe Max in Los Angeles.