Declaring war on deepfake porn


In October, sexually explicit photographs of a number of female students surfaced at Westfield High School in New Jersey.

A month later, a teenage boy at a high school near Seattle reportedly circulated pornographic images of classmates.

Then in December, two Miami high school students were suspended for distributing nude images of fellow students.

In each of these cases, the images were created without consent. And they were fake.

Once limited to images and videos of public figures and celebrities (Taylor Swift last week became the latest prominent victim), synthetic non-consensual intimate imagery—also known as synthetic NCII or more commonly as deepfake porn—now is spreading far and wide, victimizing school students and other ordinary Americans. The proliferation is turbocharged by the increased availability of sophisticated artificial intelligence tools.

What has not changed: Overwhelmingly, the victims are female.

Laws so far have been no match for deepfake porn, but lawmakers are starting to fight back. Last fall, New York became one of fewer than a dozen states to enact legislation making it illegal to create and disseminate AI-generated explicit images without consent.

In Washington, Rep. Joe Morelle this month announced his introduction of the Preventing Deepfakes of Intimate Images Act.

“Try to imagine the horror of receiving intimate images looking exactly like you—or your daughter, or your wife, or your sister—and you can’t prove it’s not,” said Morelle. “Deepfake pornography is sexual exploitation, it’s abusive, and I’m astounded it is not already a federal crime.”

This is Morelle’s third attempt since December 2022 to win passage of the legislation, and perhaps his best shot. The bill now has 21 co-sponsors in addition to Morelle and is bipartisan, with Republicans Rep. Tom Kean, R-N.J., and Michael Lawler, R-N.Y., on board. In addition, it comes after a House Oversight subcommittee hearing in the fall on deepfake advances, which underscored the risks posed by the technology.

But the bill’s chances may have improved most due to Francesca Mani’s willingness to speak out. Mani, 15, is one of the New Jersey high school students targeted in October. She has launched an informational website, appeared in national media, and joined with Morelle on Jan. 16 to voice support for his legislation.

“What happened to me and my classmates was not cool, and there’s no way I’m just going to shrug and let it slide,” said Mani, who was accompanied by her mother, Dorota. “I’m here, standing up and shouting for change, fighting for laws so no one else has to feel as lost and powerless as I did on October 20th.”

Mary Anne Franks, president of the Cyber Civil Rights Initiative at George Washington Law School, is another supporter of Morelle’s bill.

“So-called ‘deepfake’ technology has made it possible for people to create and distribute customized, realistic, sexually explicit imagery of anyone without their consent—celebrities, politicians, work colleagues, ex-girlfriends, next-door neighbors, children,” she said. “Like other forms of image-based sexual abuse, deepfake porn disproportionately affects women and girls and causes irreparable harm.”

Deepfake technology emerged only a little more than five years ago, when a Reddit user posted an algorithm leveraging existing AI to generate realistic fake videos. At first, the technology was available only in isolated corners of the internet. But soon, apps like DeepNude—which shut down in 2019 after uproar over use of the AI tool to create nude images of women—began to pop up online.

As deepfake researchers Sophie Nightingale and Hany Farid have written, increasingly powerful AI has “democratized access to previously exclusive Hollywood-grade, special effects technology.” These tools, they add, are “capable of creating faces that are indistinguishable from—and more trustworthy than—real faces.”

Graphika, a New York City-based social media research firm, says generation of deepfakes has become “an automated and scaled online business that leverages a myriad of resources to monetize and market its services.”

In a recent report, Graphika said a group of nearly three dozen synthetic NCII providers had more than 24 million unique visitors to their websites in September. Powered by AI models that enable them to “easily and cheaply create photorealistic NCII at scale,” these providers of “nudify” or undressing services now “operate as a fully-fledged online industry, leveraging many of the same marketing tactics and monetization tools as established e-commerce companies.”

Morelle points to research showing that 96 percent of all deepfakes are pornographic, and they almost exclusively target women. The source of these statistics is a 2019 report by Sensity, a firm (formerly known as Deeptrace) devoted to deepfake content and detection. Sensity’s work is also cited in a Department of Homeland Security paper on the rising threat of deepfakes.

Some deepfake images—such as the photo of Pope Francis in a puffy coat that went viral last spring—are more whimsical than malicious. But the Graphika report says the potential harm from fake images and videos includes targeted harassment campaigns, sextortion, and the generation of child sexual abuse material.

With enactment of the federal Violence Against Women Act Reauthorization Act in March 2022, victims of non-consensual disclosure of intimate images could seek civil penalties in federal court. But the protections, Morelle notes, do not extend to deepfakes.

Morelle’s legislation, an amendment to the VAWA, would prohibit the non-consensual disclosure of digitally altered intimate images and make the sharing of these images a criminal offense. An individual’s consent to creation of the image would not suffice as consent for the sharing or disclosure of the image. In addition, the measure gives victims the right to take private legal action to seek relief and offers protections to preserve a plaintiff’s anonymity in civil cases.

The financial penalty for someone found guilty of violating the law could be a fine of up to $150,000 or actual damages sustained. In a criminal action, the penalty could be a fine and up to two years in prison (or 10 years in certain cases affecting government proceedings or administration of an election).

Another bill on Capitol Hill, the DEEPFAKES Accountability Act introduced by Rep. Yvette Clarke of Brooklyn, takes a different approach: It would require those who create deepfakes to watermark their content and make it a crime to fail to identify malicious deepfakes, including sexually explicit deepfakes. (Kean has authored similar legislation, the AI Labeling Act of 2023.)

The New York law, signed by Gov. Kathy Hochul four months ago, was authored by state Sen. Michelle Hinchey, a Saugerties Democrat who represents the 41st Senate District. Those found guilty would face up to a year in jail and a $1,000 fine, and as with Morelle’s bill, victims could pursue civil action against perpetrators.

Like the internet as a whole, deepfake porn—and the sites that offer tools to create it—can be found worldwide. In September, AI was used to create naked images of young girls in Spain.

Morelle’s legal strategy focuses on going after those who create and distribute fake intimate images, not the platforms.

“My legislation will finally make this dangerous practice illegal and hold perpetrators accountable,” he says.

Paul Ericson is Rochester Beacon executive editor. The Beacon welcomes comments and letters from readers who adhere to our comment policy including use of their full, real name. Submissions to the Letters page should be sent to [email protected]
