Deepfake Creators Are Revictimizing GirlsDoPorn Sex Trafficking Survivors

The most notorious deepfake sexual abuse website is hosting altered videos originally published as part of the GirlsDoPorn operation. Experts say this new low is only the beginning.

This article contains descriptions of sex trafficking and abuse. Discretion is advised.

For years, nonconsensual deepfake pornography has been used to harass, silence, shame, and abuse women. Celebrities and influencers have had their faces implanted into existing adult videos; men have used the technology to place “friends” into explicit videos; and boys have allegedly created “nude” images of their female classmates. Amid this ever-growing harassment and abuse, deepfake creators have now, arguably, hit a new low: using videos of sex trafficking victims as the basis of nonconsensual videos.

Over the past two months, an account on the largest deepfake sexual abuse website has posted 12 celebrity videos based on footage from GirlsDoPorn, a now-defunct operation whose creators, the US Department of Justice says, conspired to commit sex trafficking through “force, fraud, and coercion,” tricking five women—and allegedly hundreds more—into making sex videos that were subsequently posted online.

The dozen videos—which ran up to 21 minutes long and racked up tens of thousands of views before they were taken down following WIRED’s inquiry—used footage originally posted to the GirlsDoPorn website and had celebrity faces added using artificial intelligence. It appears that a startup’s face-swapping tool may have been abused to transform the videos into deepfakes, according to watermarks on the footage.

As deepfake technology has become increasingly capable of creating realistic imagery and easier to use, hundreds of websites and apps designed to create or host deepfake sexual abuse have appeared. Laws to limit the use of these tools and protect those targeted are lagging, even as the malicious use of AI has evolved to not just create new victims but to revictimize survivors.

From 2012 to 2019, the DOJ says, the creators of GirlsDoPorn worked by recruiting young women, using ads posted on Craigslist, for what they claimed were clothed modeling photo shoots. When the women responded, they were told the ads were for pornographic videos, and they were pressured into taking part, according to various lawsuits and survivor testimonies.

The individuals behind the scheme, according to the DOJ, told the women the videos would only be sold on DVDs outside of the United States and that the footage would not be posted to the internet. Instead, they posted short video clips to online platforms, such as Pornhub, and full-length versions to their GirlsDoPorn website. The videos have circulated online ever since.

Multiple legal proceedings against the creators of GirlsDoPorn, people affiliated with the website, and Pornhub's parent company are ongoing or have concluded; criminal charges were issued against several GirlsDoPorn organizers in October 2019.

Since then, 22 survivors have been awarded nearly $13 million in damages, and 402 victims were given copyright ownership of videos featuring them, making it easier to scrub them from the web. At the end of 2023, Pornhub’s parent company, Aylo Holdings, agreed to pay damages to women impacted by the sharing of the GirlsDoPorn videos.

Among those charged, Ruben Andre Garcia, a GirlsDoPorn producer and recruiter, was sentenced to 20 years in prison; Matthew Isaac Wolfe, who admitted to having a “wide range of responsibilities” at GirlsDoPorn, according to the DOJ, was sentenced to 14 years; cameraman Theodore Wilfred Gyi was sentenced to four years; and GirlsDoPorn bookkeeper Valorie Moser pleaded guilty to one charge of conspiracy to commit sex trafficking and is awaiting sentencing. Finally, in March this year, the alleged GirlsDoPorn mastermind, Michael Pratt, was extradited from Spain to the US to face charges linked to the operation. He has pleaded not guilty. In total, those involved in GirlsDoPorn have been ordered to pay more than $35 million in restitution.

Brian Holm, a managing attorney at the Holm Law Group and a longtime civil attorney for GirlsDoPorn survivors, confirmed that the videos posted to the deepfake sexual abuse website were originally from GirlsDoPorn. These include, Holm says, survivors who have been involved in the legal cases against GirlsDoPorn or against Pornhub.

“It’s a real double whammy for trafficking victims to see their videos used like this,” says Holm, adding that the videos are the tip of the iceberg. “From what I’ve seen on that site, I think there's 10 times the amount you sent me that I’ve seen on there.”

The 12 videos posted by the account seen by WIRED have received up to 15,000 views each, and several have a “girlsdoporn.com” watermark on the footage. The account that posted the videos has a version of “GirlsDoPorn” as its username and included the sex trafficking site in the titles of the videos.

WIRED is not naming the deepfake abuse website, due to its role in spreading abusive content, nor the celebrities featured in the videos. The website is the largest of its kind, hosting tens of thousands of videos and receiving millions of visitors. In April, the website blocked visitors from the UK after lawmakers in the country announced plans to make it a criminal offense to create nonconsensual explicit deepfakes.

“These creators of sexually explicit deepfakes have no regard whatsoever for the women and girls who are victims of sex trafficking and now being further abused through this deepfake sexual abuse,” says Clare McGlynn, a professor of law at Durham University, who works to counter image-based abuse.

“This website is actively choosing to share recordings of actual sexual assaults,” McGlynn says. “These are heinous acts, deliberately and knowingly causing life-shattering and life-threatening harms. The drive for profit, for fueling the trade in nonconsensual porn, knows no bounds. This shows a contempt for the rights of women and girls.”

Neither the account posting the deepfake GirlsDoPorn videos nor the site’s anonymous administrators replied to questions from WIRED.

At the end of March, another user on the website asked whether the GirlsDoPorn footage was allowed, saying it made them “feel sick.” They suggested some people may not know the history of GirlsDoPorn but pointed out: “This … one user clearly does tho, with branding themselves with a rape website.” A moderator replied saying they were not going to remove the videos but said if there was a “list” of videos confirmed to include sex trafficking victims they would notify the website’s administrators to take them down.

As well as the GirlsDoPorn watermark, the 12 videos hosted on the deepfake website also have a watermark of US company Akool, which offers generative AI services, including a “face-swapping” tool, to marketing professionals.

Jiajun Lu, the founder and CEO of Akool, says the company’s terms of service prohibit copyright violations and sexual and violent content, and that it frequently bans accounts. “They are strictly forbidden on our platform, and we have multiple approaches to prevent it from happening, and more solutions are coming up soon,” he says. “We changed our watermarks a few months ago, and these videos might not come from our website.”

The CEO says the company is working on new safety tools as it increases the security of its website: It is considering using face recognition to verify the identity of people trying to use its face-swapping tool, and in the meantime the team is building more prominent warning notices about prohibited use cases.

Since 2016, Charles DeBarber, director of operations at technology firm Phoenix AI, has been undertaking the arduous task of finding GirlsDoPorn content posted online and trying to remove it. This work—which includes building tools to automate online searches and using face recognition—has helped remove around 60 GirlsDoPorn survivors’ footage from the web, DeBarber says.

“Not only does it open wounds of survivors already, but it's making new survivors and making new victims,” DeBarber, who has worked with GirlsDoPorn survivors for years, says of the deepfakes. The deepfake abuse website, DeBarber says, doesn’t take anything down when requests are made to it, and it is hosted by a company registered in the Seychelles.

Since deepfake sexual abuse videos appeared more than half a decade ago, those targeted by them have struggled to get them removed from the web, and US lawmakers have been slow to regulate or criminalize the abusive behavior. In many instances, including in wider cases of image-based abuse, people trying to remove abusive content from the web have had to resort to complaints made under the Digital Millennium Copyright Act. According to DeBarber, it’s not enough.

“Our laws are written in a way that protects intellectual property but not survivors, and it’s very demeaning for a lot of our survivors,” he says. “These are people that were treated like a commodity, and now we have to use intellectual property law to get it down.”