Deepfake Laws Forget Sex Workers

There would be no deepfakes without sex workers—and specifically, without a disregard for their labor, boundaries, and bodily autonomy.
In the seven years since I first wrote about deepfakes, before there was even a word for the AI-powered face-swapping technology, people have finally started to realize that the primary use of the technology has always been sexually explicit deepfakes meant to harass, blackmail, threaten, or simply disregard women’s consent—not spreading disinformation or endangering democracy. But now, when deepfakes capture national attention, it’s typically because a big-name celebrity has been the target (most recently, Taylor Swift). And even when it’s a lesser-known person whose face is transposed onto a nude or sexualized body, the narrative centers on that person as the sole victim.
But there are at least two people in every deepfake: the one being impersonated, whose face is being used, and the one whose face has been erased entirely, plastered over by an algorithm, leaving their body exposed. The latter is almost always a porn worker, someone who makes their living with that body and carefully chooses who to share it with in their work.
“It seems obvious, but it's something I rarely see pointed out when it comes to deepfakes,” adult performer Siri Dahl told me. “And it's something that I think any law around that should also be aware of, and hopefully addresses as well... I don't see that actually being pulled out so much in the conversation, that [nonconsensual deepfakes] are also erasing the work of a sex worker. It's like this odd combination of piracy and erasure, and also then violating the rights of a different celebrity who's not in the porn industry.”
If efforts against image-based sexual abuse, including the computer-generated kind, don’t ally and collaborate with sex worker rights organizations, they do the entire movement a disservice. “As someone who works in direct victims services, my top concern is survivors—I’ve spoken with thousands of survivors of sex- and tech-facilitated abuse in my career, and many of them are sex workers or have done some form of sex work at some point in their life,” Norma Buster, chief of staff at C.A. Goldberg, PLLC, a victims’ rights law firm, told me. “I really think our movement would be so much more powerful if we worked together to find solutions—because sex workers don’t want their content shared without their consent, and they don’t want to become victims of deepfakes either.”
Laws against malicious deepfakes have, so far, been up to individual states and their legislators to draft, pass, and enforce. Eleven states have legislation against deepfakes, and there are currently no federal laws that specifically target them. But recently, there’s been a push for laws against malicious deepfakes at the federal level.
After the Swift deepfakes went viral, there were calls for Congress to take action. A bipartisan group of legislators introduced the No Artificial Intelligence Fake Replicas And Unauthorized Duplications (No AI FRAUD) Act on January 10, which would establish “a federal framework to protect Americans’ individual right to their likeness and voice against AI-generated fakes and forgeries,” according to a press release from Rep. María Elvira Salazar.
The Disrupt Explicit Forged Images and Non-Consensual Edits (DEFIANCE) Act, introduced by Rep. Alexandria Ocasio-Cortez in March, proposes a federal civil right of action for victims of “digital forgery,” meaning someone’s likeness created using “software, machine learning, artificial intelligence, or any other computer-generated or technological means to falsely appear to be authentic,” according to the bill’s text.
In May 2023, Congressman Joe Morelle introduced the “Preventing Deepfakes of Intimate Images Act,” which would make sharing “intimate digital depictions” without consent a criminal offense and would also add a private right of action for victims to seek relief. The bill is meant to serve “as a powerful deterrent” against making deepfakes in the first place, according to a press release from Morelle’s office.
And last month, the White House issued a call for the private sector—including payment processors, mobile app stores and developers, cloud providers, and search engines, according to the announcement—to “help prevent and remedy the harms of image-based sexual abuse.”
💡
Do you know anything else about deepfake moderation and legislation, and how it impacts sex online? I would love to hear from you. Using a non-work device, you can message me securely on Signal at sam.404. Otherwise, send me an email at sam@404media.co.
Each of these legislative and policy choices has implications for all of us, including online sex workers who use the internet as their primary source of revenue, and whose work is being stolen and abused through malicious or careless nonconsensual deepfakes.
“The aim is to not have people deepfaked—that’s not something I want happening to anybody,” Alison Boden, executive director at the adult industry advocacy group the Free Speech Coalition, told me. “And also, the problem just tends to be that people write bills first and then ask questions.”
Boden, who spends a lot of time tracking and analyzing policy that affects the adult industry, said she’s not “freaked out” by the deepfake legislation currently being proposed, but is keeping a watchful eye on the landscape, and on how policies could be directly problematic and also have secondary effects as part of “the soup of anti-sex rhetoric.”
Some lawmaking efforts have briefly raised alarm bells, however. When the anti-pornography movement turns its eye toward something, it’s a real threat, and lately there are signs that this is happening with deepfakes. Ocasio-Cortez, who has defended sex workers’ rights in the past, quoted Dawn Hawkins, CEO of the National Center on Sexual Exploitation, in her press release announcing the DEFIANCE Act. Ocasio-Cortez’s chief of staff later said that quoting a group in a press release does not amount to endorsing or partnering with it, but NCOSE isn’t just an unusual inclusion in a progressive lawmaker’s press release; it’s part of a conservative censorship machine, one so influential on both public and private governance that its ability to turn moral panic into policy has been studied and documented.
Groups like NCOSE generally regard all problems as platform problems, and loudly demand ham-fisted solutions to complex issues: shutting down Pornhub entirely, purging Reddit and Twitter of all sexual content, and eradicating the adult industry altogether. In any other industry, on any topic other than sex, these would be obvious cases of throwing the baby out with the bathwater: Porn is massively popular and lucrative for these platforms, millions of people earn a livelihood from being able to advertise or sell content on them, and millions more are happily consuming that content every day.
“I do think that sometimes the conversation gets muddled when anti-porn organizations start to get involved in the fight. And that’s why it’s important to actually read through a law before deciding whether or not to support it—not just articles about it, and not just a list of endorsements,” Buster said. If a law is promoted as helping victims of image-based sexual abuse (IBSA) or deepfakes, she said, but its only focus is creating regulations for the adult industry, she questions whether sex worker advocacy groups were consulted in its creation.
“But I do think it’s why those of us working in the image-based abuse space need to be intentional about not letting this become an anti-porn movement and highlighting the efforts of those who have been working to fight for all victims without throwing one group under the bus,” Buster said.
The effects on sex workers go beyond the theft of their likenesses and into impersonation and unfair moderation practices. Sex workers already struggle with catfishing accounts on social media, where someone pretends to be them to scam people out of their money, and Dahl told me that deepfakes make this problem worse—compounded by platforms’ inability, or lack of desire, to moderate it. Scammers target sex workers and more “safe for work” influencers alike. Dahl has had multiple accounts of her own banned on Instagram, for example, while thousands of catfishing replicas of her remain active.
“If it becomes the new thing for people to use AI generated images to do this kind of thing, that's going to make it so much harder on us to try to report accounts that are impersonating sex workers,” Dahl said. “Because it won't be as obvious if someone's using AI generated imagery of me to pretend to be me, when those images look good.”
There is a large kernel of truth to the points anti-sex organizations make when it comes to abuse imagery: platforms are grievously failing victims of this form of abuse and harassment across the board. But this is the game for these conservative anti-pornography groups: find a universally concerning problem that tech is exacerbating, plus a vacuum that education, social services, or corporate accountability have failed to fill, and use that toehold to bend the private sector to their bidding by whipping up public opinion and legislative support.
But the adult industry has been dealing with the issue of nonconsensually shared material for a very long time, just as it has with age verification, keeping minors away from adult material, harm reduction, and public health.
“It may be the case that the way to go is right to publicity, or right to your likeness, where no one can [make a deepfake of you] unless you give them permission to. And that would hopefully solve problems, but it doesn’t solve the problem of ‘oh, we can’t tell the difference,’” Boden said. “And so I think I’m more of a fan of like, let’s implement a notice and takedown scheme as opposed to let’s throw people in jail.”
For example, she said, adult platforms are required by banks to have a mechanism in place so that people who were depicted in content they didn’t consent to can have it taken down. Facebook and other mainstream social platforms, where a lot of nonconsensual content is shared, aren’t subject to the same kind of scrutiny from banks. “So I would want to see more options for people who don’t want this content up, that are actually taken seriously by the platforms, so that we don’t need to be in a place where we just cannot do any of this because it’s all a problem.”
Sex workers, always on the cutting edge, are using deepfakes and AI-chatbot “clones” of themselves, consensually. As writer and adult performer Jessica Stoya recently wrote in her piece about how sex workers are using AI technology to further their work: “Sex workers, due to the constant practice of marketing ourselves, may be better suited than most to create personal artificial intelligences. Creators of AI clones must ask themselves, ‘Who am I? Who do I want to present? What little compartment of mine do I want to sell?’ This is something adult creators were doing long before the internet took off.”

Performer Riley Reid told me earlier this year, in a conversation about her co-founding an AI chat companion company called Clona.ai, that she felt fortunate to have the chance to potentially extend her career through creating a digital personality clone of herself that people could talk to—and that she could have creative control over, and earn from. “I may not become extinct, because of AI,” she said.
Kate Ruane, an attorney and director of the Center for Democracy and Technology’s Free Expression Project, told me that when we define deepfakes broadly—as video or audio that presents a real person doing or saying something that didn’t happen—there are serious risks, but also legitimate uses, like the ones Stoya and Reid have explored.
“When I look at the debate, I don’t see enough precision in identifying risk mitigation or punishing of harm, and in particular, non-constitutionally protected harm,” Ruane said. “And that, I fear, will lead to legislation that either has a tremendous chilling effect on legitimate uses of the technology or is so broadly written as to be easily identified as unconstitutional and therefore winds up never going into effect, not actually mitigating any of the risks or addressing any of the harms.”
Mainstream social platforms are historically easily pressured by anti-porn groups, and almost always prefer to limit sex or kick it out entirely rather than take a nuanced approach to moderation and safety. They’re bad at telling the difference between AI-generated images and real photos or videos, and meanwhile they’re letting AI engagement bait run amok, seemingly as a feature, not a bug. Relying on them to tackle the problem of malicious deepfakes is a dangerous path.
“If you do not have perfect technology to identify whatever it is we're calling a deepfake, you are going to get a lot of guessing being done by the social media companies, and you're going to get disproportionate amounts of censorship,” especially for marginalized groups, Ruane said. “For a social media company, it is not rational for them to open themselves up to that risk, right? It's simply not. And so my concern is that any video with any amount of editing, which is like every single TikTok video, is then banned for distribution on those social media sites.”
Buster said that her firm has been able to advocate for financial accountability for clients who are victims of IBSA in general, under the Violence Against Women Act. “We can foresee similar paths to justice for victims of deepfakes, who often tell of experiencing similar harms to other survivors of IBSA, with something like the DEFIANCE Act,” she said. “And it would mean sex workers could pursue civil accountability when their work is recorded and pirated without their consent to create deepfakes.”
Trying to stop deepfakes with small fixes, moderation tweaks, and even the legal system is only a Band-Aid on a deeper problem: how people view sex, the labor and rights of sex workers, and the stigma and shame around sex in general. “If we aren’t moving to inoculate the population against it by improving media literacy, we’re never going to get to a point where we’ve addressed the problem meaningfully, even with all of the new laws that you would want to create,” Ruane said. But that’s a much harder problem than writing a bill and a press release.