
AI-Generated Child Sexual Abuse Material Is Not a ‘Victimless Crime’

The ability to produce a limitless number of images, powered by datasets containing millions of photographs of real people, including children, and real CSAM, perpetuates abuse in a way that was previously impossible.
Photo by Lukas Hellebrand on Unsplash

Last week, the FBI announced it had arrested a man who allegedly used AI to generate child sexual abuse imagery. It’s a novel case because it’s one of the first in which the FBI has brought charges against someone for using AI to create child sexual abuse material (CSAM). Steven Anderegg is accused of using AI to create “thousands of realistic images of prepubescent minors,” taking requests from others for more images, and sending those images to a child through Instagram. He’s also accused of abusing his son.

Whenever news about AI-generated CSAM breaks, there’s a small but persistent contingent of people who wonder: Are these images really that bad, given that they aren’t “real” photographs of “real” children? Is it not a “victimless crime”? Could such images help pedophiles avoid contacting actual minors?
