Sabrina Javellana was a rising star in local politics — until deepfakes derailed her life.
Coralie Kraft covers culture for The Times and other outlets. Over the course of 10 months, she spoke with 33 sources, including 11 victims, about A.I.-generated pornography.
July 31, 2024
Most mornings, before walking into City Hall in Hallandale Beach, Fla., a small city north of Miami, Sabrina Javellana would sit in the parking lot and monitor her Twitter and Instagram accounts. After winning a seat on the Hallandale Beach city commission in 2018, at age 21, she became one of the youngest elected officials in Florida’s history. Her progressive political positions had sometimes earned her enemies: After proposing a name change for a state thoroughfare called Dixie Highway in late 2019, she regularly received vitriolic and violent threats on social media; her condemnation of police brutality and calls for criminal-justice reform prompted aggressive rhetoric from members of local law enforcement. Disturbing messages were nothing new to her.
The morning of Feb. 5, 2021, though, she noticed an unusual one. “Hi, just wanted to let you know that somebody is sharing pictures of you online and discussing you in quite a grotesque manner,” it began. “He claims that he’s one of your ‘guy friends.’”
Javellana froze. Who could have sent this message? She asked for evidence, and the sender responded with pixelated screenshots of a forum thread that included photos of her. There were comments that mentioned her political career. Had her work drawn these people’s ire? Eventually, with a friend’s help, she found a set of archived pages from the notorious forum site 4chan. Most of the images were pulled from her social media and annotated with obscene, misogynistic remarks: “not thicc enough”; “I would breed her”; “no sane person would date such a stupid creature.” But one image further down the thread stopped her short. She was standing in front of a full-length mirror with her head tilted to the side, smiling playfully. She had posted an almost identical selfie, in which she wore a brown crew-neck top and matching skirt, to her Instagram account back in 2015. “It was the exact same picture,” Javellana said of the doctored image. “But I wasn’t wearing any clothes.”
There were several more. These were deepfakes: A.I.-generated images that manipulate a person’s likeness, fusing it with others to create a false picture or video, sometimes pornographic, in a way that looks authentic. Although fake explicit material has existed for decades thanks to image-editing software, deepfakes stand out for their striking believability. Even Javellana was shaken by their apparent authenticity.
“I didn’t know that this was something that happened to everyday people,” Javellana told me when I visited her earlier this year in Florida. She wondered if anyone else had seen the photos or the abusive comments online. Several of the threads even implied that people on the forum knew her. “I live in Broward County,” one comment read. “She just graduated from FIU.” Other users threatened sexual violence. In the days that followed, Javellana became increasingly fearful and paranoid. She stopped walking alone at night and started triple-checking that her doors and windows were locked before she slept. In an effort to protect her personal life, she made her Instagram private and removed photographs of herself in a bathing suit.
Discovering the images changed how Javellana operated professionally. Attending press events was part of her job, but now she felt anxious every time someone lifted their camera. She worried that public images of her would be turned into pornography, so she covered as much of her body as she could, favoring high-cut blouses and blazers. She knew she wasn’t acting rationally — people could create new deepfakes regardless of how much skin she showed in the real world — but changing her style made her feel a sense of control. If the deepfakes went viral, no one could look at how she dressed and think that she had invited this harassment.
Although she confided in a few friends in the days and weeks after discovering the images, she mostly kept the experience to herself. She lived with her mother and brother, but she couldn’t bring herself to tell them. They were a Filipino Catholic family who rarely talked about sex. Could she raise the images with them? Would they — or anyone — believe that they weren’t real? Besides, Javellana, who saw herself as her family’s protector and provider, did not want to burden them. When she came home after work, she would escape to her balcony to smoke weed and listen to music. When she cried, she muffled her sobs so no one would hear.
“I felt like I didn’t have a choice in what happened to me or what happened to my body,” Javellana said. “I didn’t have any control over the one thing I’m in every day.”
The only thing that made Javellana feel a measure of agency was seeking clarity on what was happening to her. At night she would sit alone in her bedroom, combing the internet for information about “intimate-image abuse,” or the non-consensual sharing of sexual images, the most well known form of which is “revenge porn.” In the 1990s and aughts, as access to camera phones and the internet became widespread, it became easy for a person to share intimate photos of another person without their consent on sites like Pornhub, Reddit and 4chan. Explicit videos followed, with people uploading hundreds of thousands of hours of non-consensual content online. Deepfakes added a new element: an individual could find themselves appearing in explicit content without ever engaging in sexual activity.
The technology first gained national attention when it was used to create misleading and false political content, such as a doctored video that Donald Trump shared on Twitter in 2019, which featured footage of Nancy Pelosi apparently slurring her words. In recent years, though, the bulk of deepfake material online has become pornographic. Though information was scant, Javellana found articles warning about the increasing threat of deepfakes. Artificial intelligence was advancing so rapidly that anyone with a computer or smartphone could access a number of apps to easily create images using individuals’ likenesses. One study cited by the Department of Homeland Security noted that, at one point, more than 90 percent of deepfake videos featured non-consensual, sexually explicit images, overwhelmingly of women. In 2021, though, the vast majority of states had no laws banning the distribution of deepfake pornography. As a result, there was very little guidance on what to do if you found that someone had created pornographic content featuring your likeness.
The day she discovered the images, Javellana contacted a member of the local police department, who referred her to the Florida Department of Law Enforcement’s cybercrime division. On the phone with the department’s legal adviser, Javellana laid out the situation and emailed a link to the 4chan images. While she waited to hear back, her search for help took her to Carrie Goldberg, a lawyer who had garnered media attention as an advocate for victims’ rights and sexual privacy after successfully litigating several high-profile revenge-porn cases. Goldberg’s firm received its first deepfake case in 2019, when an A-list celebrity sought help with porn that used her likeness. While celebrities are still targeted — there are deepfake porn videos featuring thousands of female celebrities — advancements in technology have allowed abusers to target people outside the public eye. One widely accessible A.I. app processed more than 600,000 photos of ordinary citizens in the first 15 days after its launch in 2023.
Norma Buster, Goldberg’s chief of staff, conducted a case evaluation with Javellana over the phone in February 2021. After the firm’s lawyers evaluated her story, Buster explained that Javellana’s situation had few satisfying legal solutions. The firm could send Digital Millennium Copyright Act (DMCA) notices to the sites hosting the fake material, arguing that they violated Javellana’s copyright. There was no guarantee that they would comply, though; internet forums have traditionally been uncooperative about removing content. In addition, it wasn’t clear that Javellana could claim a copyright violation, as A.I. had significantly modified the original image. Although 47 states had passed laws against intimate-image abuse, those laws didn’t always apply to deepfake cases. In short, Buster told her, there wasn’t much they could do.
When the cybercrime division called Javellana to their offices in April 2021, their news was as demoralizing as Goldberg’s. Special agents explained that there were no federal laws against creating or disseminating non-consensual explicit deepfakes. Florida didn’t have a state law preventing the creation of the material, either, so their hands were tied: no crime had been committed, and law enforcement could not investigate further.
As Javellana registered that the police wouldn’t be able to help her remove the images, she began to panic. She had worked hard to create a career in politics for herself and earn the respect of her older colleagues. Now she felt a surge of dread as she imagined people at City Hall scrutinizing the pictures. And what about her family, her friends, her neighbors? Even if she convinced everyone that the images were fake, would the people in her life ever look at her the same way? Would she ever have professional prospects again? Shortly before discovering the images, she had decided to sign up for that spring’s state teaching-certification exam in the hope of getting a job at one of her old schools. Now she imagined herself explaining to future employers — or members of the school board — that someone had created fake explicit images without her consent, and that the images were openly accessible on the internet. Why would anyone hire her and risk damaging the school’s reputation? But if she didn’t disclose the existence of the images and someone stumbled upon them online, she would almost certainly lose her job.
Sitting there in the cybercrime division’s office, she tried not to cry while explaining her fears to two agents. They listened, then said they could help her write an affidavit explaining that the photographs weren’t real. They pulled out papers and spread what looked like images of her naked body across a nearby table. They told her to sign each page. Staring at the images, Javellana saw her own face looking back at her, and felt self-conscious about the agents inspecting the material. The printouts felt tangible in a way that the online posts had not; she imagined a stranger printing out these images and looking at “her.” Disgusted and scared, she broke down and wept while signing her name. Later that day, driving home, she was so distressed by the experience that she rear-ended another car.
Almost a year passed, and as Javellana’s attempts to protect herself foundered, her anger calcified into numbness. If there was nothing she could do to get the images off the internet, she at least wanted to erase them from her memory. She distanced herself from friends, who she feared would not understand her situation. Nobody at work knew about the deepfakes — she dreaded being known among her colleagues as “the girl with the nude photos.” She decided against taking the teaching-certification exam.
One morning in January 2022, Javellana was reading the news on her phone when she found an article about a bill introduced by the Democratic Florida state senator Lauren Book that would criminalize non-consensual deepfakes in the state. Senator Book had had her own experience with this technology. In November 2021, she was sitting at her kitchen table after dropping her children off at school when a text message from an unknown number arrived: “I have a proposition for you.” Two nude photographs of Senator Book followed; the perpetrator threatened to release them unless she gave him oral sex and $5,000 in gift cards. Subsequent research revealed numerous deepfake images of Senator Book online. Weeks later, she started drafting legislation that would become known as Senate Bill 1798 — one of the first attempts in the country to address pornographic deepfakes online.
Javellana took a screenshot of the article, highlighted the language about deepfakes and posted it to Twitter, along with a summary of her own experience. “I was traumatized last year when I learned ‘deepfake’ images of me were being made,” she typed. It was the first time she publicly acknowledged her ordeal. “I brought it to F.D.L.E. but they were unable to investigate the source as Florida law did not address this issue. Thank you @LeaderBookFL for your work on this, and I’m so sorry that this happened to you.”
Senator Book replied immediately, and a couple of days later the two women discussed the bill over the phone. It was a rare moment of connection amid what had been a frighteningly solitary experience. They began communicating regularly. “We were healing and going through this thing together,” Javellana told me. She admired the senator for going public with her story. More than that, after a year of feeling cowed, speaking with another woman about the experience of being deepfaked felt cathartic. She had found an ally.
Senator Book asked Javellana if she would consider testifying in support of the bill at a committee hearing. She hoped that the bill would pass, and stressed to Javellana that personal statements were a powerful way to pressure legislators. Javellana had attended dozens of committee hearings and knew the influence that testimony could have on the legislative process. On Feb. 8, 2022, she testified on behalf of S.B. 1798 in Tallahassee, Fla. Knowing she would have only a few minutes to convince the senators, Javellana had collected her points in the notes app on her phone, which she held while describing her fake nudes to political leaders and constituents. Grief overwhelmed her as she recounted the course of events; twice she had to stop, resting a hand on her chest for comfort. “I didn’t want to cry in public, but it was so hard to talk about,” Javellana recalled.
Her testimony lasted three minutes. She had barely recovered when Republican Senator Ileana Garcia, the committee chair, commented. “It’d never dawned on me how bad this situation is,” Garcia said before pausing. “But sometimes it’s caused by us. In our journey for validation, we expose too much of ourselves.” Javellana was stunned and humiliated. Moments after her testimony, a state senator was blaming her for the synthetic images — exactly the kind of judgment that she had been fearing for the last year. After Senator Garcia finished her statement, Senator Book spoke up. “I didn’t put my images out there,” she stressed. “I didn’t parade them on social media. They were stolen from me, my family. I’ll never get them back.” Book looked at Javellana. “She will never get them back.”
In a 2022 committee hearing, Senator Lauren Book of Florida discussed a bill that would criminalize deepfakes in the state. Javellana gave emotional testimony in support of that bill, S.B. 1798, which now provides at least some protection against deepfakes in Florida.
The bill passed in committee, 8-0. In the days after, Javellana spoke with the news media and called out 4chan users specifically as the architects behind her ordeal. That kicked up a new storm of images featuring Javellana and Senator Book. Shortly after, someone on the internet shared Javellana’s personal phone number and address. Comments directed at her described sexual assault, and one user threatened to drive to her house. Then came the nadir: Harassers exposed her mother’s name and phone number and texted her the explicit images. Javellana was horrified. Her worst fears were becoming reality. The more she talked about her experience, the more harassment she endured.
After the bill passed unanimously in Florida’s House and Senate, Gov. Ron DeSantis signed S.B. 1798 into law on June 24, 2022. At the time, four other states — Virginia, Hawaii, California and Texas — had enacted laws against the dissemination of deepfakes. Floridians would be protected against deepfake abuse — ostensibly, at least. Now that circulating such images was explicitly criminalized, the F.D.L.E. and local law enforcement could investigate victims’ claims. Those who circulate deepfakes could be charged, fined and possibly jailed. Victims could now pursue civil suits as well. Legislative solutions like this are an important step toward preventing nightmares like Javellana’s; they offer victims possible financial compensation and a sense of justice, while signaling that there will be consequences for creating non-consensual sexual imagery.
But for Javellana, the Florida bill has not been a panacea. One evening this past February, she wondered if people were still posting and commenting on images of her. To her dismay, she discovered entirely new threads and images posted in October 2023, a year after the law went into effect, and almost three years after she discovered the first deepfakes. She had hoped that her aggressors would lose interest once she left the city commission, but the images were spreading onto new sites like Discord, the popular social-messaging platform. She spent hours sifting through the new threads on her phone, scrolling late into the night. Some of the deepfakes looked like images from her personal Instagram account, which has just over 1,000 followers and has been private since 2021. She wondered if some of the deepfakes were the work of someone she knew.
Javellana ultimately decided against taking legal action. The law was not retroactive, meaning she could only pursue lawsuits for images distributed after its enactment. The sheer volume of material was daunting, and she would need to file a lawsuit for each individual post; the cost of hiring a lawyer to track down each harasser was prohibitive. If the users had posted the deepfakes from outside Florida, the new law wouldn’t even apply. Emotionally overwhelmed, Javellana decided to do the only thing she could: Leave the deepfakes in the past.
For the past year and a half, Javellana has worked as an aide to the Democratic mayor of Broward County. She decided not to run for re-election to the city commission (even though she felt confident that she would win). She likes learning about a different side of government, and she feels safer out of the public eye. Sometimes she wonders what would have happened if she had remained on the city commission; a second term could have positioned her to eventually run for higher office. But she felt that stepping down was her only option. It was hard enough being a young, vocal Democrat in a majority Republican state; the online harassment had pushed her over the edge. She dealt with chronic anxiety and often drove home from meetings sobbing.
A Department of Homeland Security report notes that “anyone with a social media profile is fair game to be faked,” and the F.B.I. has advised caution when posting personal photos or videos online. For most people, especially those like Javellana with public-facing careers, this guidance is unrealistic. Javellana felt she had no choice but to maintain a professional Instagram account with highlights from her time in office, even after she knew her images were being stolen and manipulated. This calculus — weighing professional advancement against potential harassment — is now a necessity for many women.
In January, Congress introduced a federal bill called the Disrupt Explicit Forged Images and Non-Consensual Edits Act of 2024, or the DEFIANCE Act, which recently passed in the Senate. If enacted, the bill would allow victims to claim $150,000 in damages and file temporary restraining orders against their harassers. But experts warn that curbing deepfakes would require tech companies to police their platforms for abusive content and remove applications that could facilitate harassment. Under Section 230 of the U.S. Communications Decency Act, which absolves platforms of responsibility for content posted by their users, they face no financial repercussions and thus have little incentive to remove that material. Goldberg, the attorney, emphasized that altering Section 230 to allow victims to sue would make platforms wary of hosting the content.
Amid all this, Javellana has continued to discover new images across three different platforms. As of April 2024, she had found at least 40 threads on 4chan’s archive, each containing multiple images posted by different users. “At this point, they’re going to keep popping up,” Javellana told me, her voice weary. A successful lawsuit might offer her some financial compensation but would not protect her against future material. If she fought to remove a few specific images, new content would appear in their place.
“It just never ends,” she said, her voice catching. “I just have to accept it.”
Haruka Sakaguchi is a photographer in New York known for her documentary work focusing on cultural identity and intergenerational trauma. She is currently working on a project about the incarceration of Japanese Americans during World War II.