
The genAI backlash: Should creative types poison the well?

opinion
Oct 24, 2023 · 4 mins
Apple, Artificial Intelligence, Generative AI

Apple’s traditional core market of creative users is probably among those most likely to be impacted by AI, so should they poison the well? The people behind Nightshade think so.

Apple’s traditional core users are probably among those most likely to be impacted by generative artificial intelligence (genAI) as these tools become better at creating images, movies, stories, and more. That kind of automation strikes at the heart of the creative markets, so pushback is almost inevitable.

But do creative types really want to put up roadblocks in front of the genAI steamroller?

Why wouldn’t there be a backlash?

Since ChatGPT and generative AI hit the scene to become this year’s fastest-growing tech, we’ve already seen instances in which people’s creative work has been exploited by these tools. We’ve seen copyright claims made, information leaked, and a shift, as new players enter the market, toward training AI using creative assets that are not protected by copyright.

With its own roots in the creative markets, Adobe has made a big thing about this, and trains the AI used to augment human creative processes using assets it has rights to use.

Garbage out with a little Nightshade

The right to profit from the work you do is fundamental to the creative industries. Sure, you can share it if you like, and yes, there remains a big divide between the “everything should be free” and “artists deserve to make a living” sides of the debate.

It’s a discussion as old as copyright itself, and the balance between access and exploitation is one that regularly gets reset. One relatively recent example: file sharing and Napster almost killed the music business, and iTunes saved the industry.

Now, it seems to be time for a fresh reset, and it looks as if technology may be emerging to help creatives protect their work against exploitation by genAI (or, to be more exact, the companies that own the tools). Nightshade is a tool that poisons AI training data, tricking models into drawing false conclusions. It works by weaving invisible pixel-level changes into creative works; any AI that ingests those works as training data picks up the poison.

Those poison pixels then work to undermine the learning process, causing the model to generate useless output. MIT Technology Review explains that the results can mean dogs become cats and cars become cows.

Another way to understand this is that it means that perfectly legitimate creative assets carry a secret payload of garbage designed to cause any genAI tools using the image as training data to deliver garbage results. “Garbage in, garbage out,” as they say.
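Nightshade’s actual algorithm is far more sophisticated — it crafts perturbations designed to push a model toward specific wrong concepts — but the core idea, that pixel values can be altered imperceptibly so the data a model trains on no longer matches what a human sees, can be sketched roughly as follows. Everything here (the function name, the simple checkerboard offset) is illustrative only, not Nightshade’s real method:

```python
def add_perturbation(pixels, strength=1):
    """Return a copy of a grayscale pixel grid with a tiny checkerboard
    offset: +strength on 'even' pixels, -strength on 'odd' ones.
    The change is invisible to a viewer, but every raw value a model
    would ingest as training data has been altered."""
    return [
        [max(0, min(255, value + (strength if (r + c) % 2 == 0 else -strength)))
         for c, value in enumerate(row)]
        for r, row in enumerate(pixels)
    ]

# A toy 4x4 grayscale "image": every pixel mid-grey (128)
original = [[128] * 4 for _ in range(4)]
poisoned = add_perturbation(original)

print(poisoned[0])  # first row now alternates 129, 127, 129, 127
```

To a human eye a one-step brightness change is invisible, yet the numbers a scraper collects are no longer the original work — which is the mismatch a poisoning tool exploits.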

Empowering the creative industries

The report explains that the inventors of Nightshade hope to give creative people more power to protect their work. “We propose the use of Nightshade and similar tools as a last defense for content creators against web scrapers that ignore opt-out/do-not-crawl directives and discuss possible implications for model trainers and content creators,” the developers write in their white paper detailing the research.

Will we see more tools of this kind? Almost certainly, given that content creators and publishers everywhere are already seeking to restore balance for their businesses against the impact of unconstrained, unregulated use of genAI.  

The legal and ethical considerations around authorship, copyright, ownership and authenticity are now recognized as critical to the successful use of genAI, notes Gartner analyst Issa Kerramans.

The business case for protectionism

Creative professionals and publishers know — as we all do — that once all the world’s information becomes data, the only winners will be those who own the AI tools, with the value of content eviscerated for the profit of the few. That’s a real concern for individual creatives, but it also concerns any large creative publisher. Newspapers, for example, are pushing to get paid when their articles are used to train these models.

In other words, there are both moral and financial motives at play, which means genAI firms must eventually expect to pay for the creative assets they use.

It’s also easy to argue that putting some kind of brake on untrammelled use of creative assets by AI training systems is essential to preserve what little remains of diverse creative expression.

Putting humans first

Returning to the original point: it is interesting that Apple’s traditional core user base stands to be so deeply impacted by these new technologies. Perhaps one contribution its AI development teams can make to that congregation might be to develop a system-level approach to the protection of creative assets. Think of it as Nightshade on a systemic basis, applied to content as it is saved.

That might be a popular feature for the rest of us…

Please follow me on Mastodon, or join me in the AppleHolic’s bar & grill and Apple Discussions groups on MeWe.