A novel approach to the contentious copyright issues surrounding generative AI…
The new tool, called Nightshade, messes up training data in ways that could cause serious damage to image-generating AI models.
This is getting interesting. The days of unbounded frontier AI models training on scraped data are numbered. Expect a new ‘premium content’ economy to emerge over the coming years, where frontier model developers will have to start paying for quality data to train on. #ai #machinelearning #artificialintelligence #chatgpt #openai #huggingface #anthropic #deepmind
Content Strategy & Creative Development ⇛ AI Education ⇛ Emergent Technologies x Storytelling / xA&E xVice xAnonymous Content
The art strikes back. Love seeing a tug of war emerging over the DNA of creativity. “A new tool lets artists add invisible changes to the pixels in their art before they upload it online so that if it’s scraped into an AI training set, it can cause the resulting model to break in chaotic and unpredictable ways. The tool, called Nightshade, is intended as a way to fight back against AI companies that use artists’ work to train their models without the creator’s permission. Using it to “poison” this training data could damage future iterations of image-generating AI models, such as DALL-E, Midjourney, and Stable Diffusion, by rendering some of their outputs useless—dogs become cats, cars become cows, and so forth.” #generativeAI #ai #creators
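For readers wondering what “invisible changes to the pixels” might look like in practice, here is a minimal toy sketch. It is not Nightshade’s actual algorithm (which reportedly crafts targeted, optimized perturbations rather than random noise); it only illustrates the general idea of nudging every pixel by an amount too small for a viewer to notice. The function and file names are hypothetical.

```python
# Toy illustration only -- NOT the Nightshade algorithm. Random noise like this
# would not actually poison a model; it just shows what an imperceptible,
# pixel-level edit to an artwork looks like in code.
import numpy as np
from PIL import Image

def add_imperceptible_noise(path_in: str, path_out: str, strength: int = 2) -> None:
    """Shift every RGB channel of every pixel by a random amount in [-strength, +strength]."""
    img = np.asarray(Image.open(path_in).convert("RGB"), dtype=np.int16)
    noise = np.random.randint(-strength, strength + 1, size=img.shape, dtype=np.int16)
    poisoned = np.clip(img + noise, 0, 255).astype(np.uint8)
    Image.fromarray(poisoned).save(path_out)

# Hypothetical usage:
# add_imperceptible_noise("artwork.png", "artwork_shaded.png")
```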
Passionate product leader | Envisioning the digital future | Customers ALWAYS come first, are you listening?
As the AI landscape continues to shift, the lack of governance and accountability will keep prompting solutions like this one. It is an interesting approach for creators to fight against their work being used without consent or compensation. Tech companies will find ways to avoid ingesting poisoned data, and the cycle will continue. Interesting to watch it play out as a bystander.
And so it begins. Awesome intention, and the tool is just as awesome. Note, however, that it is #opensource, so anyone is free to tinker with it. How will it look and behave if taken beyond art? #ai #modeltrust #data
Dad - Lucky Husband - Commercial Building Automation Specialist - AI Solutions Founder - Developer - Writer
Traditional Digital Art is Dead: We have to accept that the skill and artistry of traditionally creating digital works is ultimately moot. The process is now irrelevant and there’s nothing anyone can do. It’s frustrating to see everyone miss the context here, though. When Disney transitioned from hand-drawn art to CGI, we didn’t stop appreciating hand-drawn art... we also began appreciating QUALITY CGI. They aren’t mutually exclusive.

Digital artists will adapt their work, or they won’t. Those that do will strive for beauty in a new medium, and those that don’t will continue to create beauty in the old.

What we MUST talk about isn’t finding a “poison pill” to stop AI… we need to take advantage of distributed ledgers and new methodologies to attribute ownership over all created works. If it’s possible to design tools that inject data into LMMs, then it’s also possible to inject and verify metadata, as sketched below.
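The “inject and verify metadata” idea can be sketched as a simple provenance record: hash the image bytes and keep the digest alongside ownership information that can be re-checked later. This is a minimal, hypothetical illustration (the function names and fields are invented here); a real system might anchor such records on a distributed ledger, which is omitted.

```python
# Minimal provenance-record sketch, assuming an off-chain store of records.
# Hashing proves the file is unchanged; it does not by itself prove authorship.
import hashlib

def make_provenance_record(image_path: str, creator: str) -> dict:
    """Create a record binding the image's SHA-256 digest to a creator name."""
    with open(image_path, "rb") as f:
        digest = hashlib.sha256(f.read()).hexdigest()
    return {"sha256": digest, "creator": creator}

def verify_provenance(image_path: str, record: dict) -> bool:
    """Check that the file on disk still matches the recorded digest."""
    with open(image_path, "rb") as f:
        return hashlib.sha256(f.read()).hexdigest() == record["sha256"]

# Hypothetical usage:
# record = make_provenance_record("artwork.png", "Jane Artist")
# print(record, verify_provenance("artwork.png", record))
```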
The new tool, Nightshade, introduces a disruptive element by manipulating training data, potentially causing significant harm to image-generating AI models. This kind of tool could be used to explore vulnerabilities in AI systems or highlight the importance of robust data handling and security measures. As AI technology continues to advance, ensuring the safety and integrity of AI models and their training data becomes increasingly crucial.
Throwing Shade on AI. This is not surprising. The same technology concept used to watermark AI-generated content (like SynthID) is now being used to poison ☠️ AI models (Nightshade) that scrape human content. However, most human work is derivative. The ending of Star Wars is John Williams echoing, almost note for note, the “Mars” movement of Holst’s The Planets, and the Imperial March draws on Chopin and Holst. Even comedians have their work “borrowed” or parodied; Weird Al is a great example. “Fair Use!”… I hear it being chanted: “Fair Use!” A few companies have already stated they will protect this fair use. As someone who is an artist and musician, I can understand why Nightshade might appeal to creators. I will keep an open mind and look to incorporate different techniques and technologies. Curious to hear your thoughts. Post below. #artificialintelligence #artinai #aiandcreativity #arttech
1 step forward, 3 steps back?
This is an interesting use case of data poisoning. By default, I think of data poisoning as an offensive, harmful technique. Using it as a check and balance on generative AI is important, especially if both gen AI and the techniques to quell it develop in parallel.