Lawmakers seek to force tech companies to remove deepfake pornography
A bipartisan group of senators led by Sen. Ted Cruz, R-Texas, introduced a bill in Congress Wednesday that would require technology and social media platforms to remove deepfake pornography within 48 hours of a victim's request.
The initiative seeks to provide a more robust legal framework for combating the growing threat of digitally manipulated abusive content.
The TAKE IT DOWN Act would impose criminal penalties on individuals who publish deepfake pornography and hold tech companies accountable if they fail to remove such content within 48 hours of a victim's request. Individuals found guilty of creating and distributing deepfake pornography could face up to two years in prison for images of adults and up to three years for images involving children.
A company's failure to diligently comply could be classified as an unfair or deceptive practice under the Federal Trade Commission Act.
Besides Cruz, the bill's sponsors include Sens. Amy Klobuchar, D-Minn., Cynthia Lummis, R-Wyo., Richard Blumenthal, D-Conn., Shelley Moore Capito, R-W.Va., Jacky Rosen, D-Nev., Ted Budd, R-N.C., Laphonza Butler, D-Calif., Todd Young, R-Ind., Joe Manchin, I-W.Va., John Hickenlooper, D-Colo., Bill Cassidy, R-La., and Martin Heinrich, D-N.M.
“In recent years, we’ve witnessed a stunning increase in exploitative sexual material online, largely due to bad actors taking advantage of newer technologies like generative artificial intelligence," Cruz said in a statement.
"Many women and girls are forever harmed by these crimes, having to live with being victimized again and again. ... While some states provide legal remedies for victims of non-consensual intimate imagery, states would be further supported by a uniform federal statute that aids in removing and prosecuting the publication of non-consensual intimate images nationwide."
Cruz believes the bill would put the "responsibility on websites to have in place procedures to remove these images."
Praising the bill, Melissa Henson, vice president of the Parents Television and Media Council, said in a statement there is an urgent need for this legislation.
"Children and teens are becoming the targets of deepfake pornography. Right now, tech and social media platforms aren't prioritizing taking down this kind of content, leaving families nowhere to turn for justice," she said.
Deepfake content uses artificial intelligence to create or alter images and video so convincingly that they can be difficult to detect as fake.
"It is appalling that children are being subjected to this kind of abuse, which is magnified as it spreads on the internet. The tech industry must confront this and the 'Take it Down Act' will ensure that tech has accountability for acting to remove deepfake pornography," Henson added.
Echoing the concerns, other voices across the political and technological spectrum have pointed out the inadequacies in the existing mechanisms to handle such abuses.
The encrypted messaging platform Telegram was named to the National Center on Sexual Exploitation's annual Dirty Dozen List of mainstream facilitators of sexual exploitation, cited for lax content moderation policies on image-based sexual abuse, including deepfake pornography and the nonconsensual distribution of explicit material.
The issue has personally touched politicians such as Rep. Alexandria Ocasio-Cortez, D-N.Y., who revealed her own disturbing encounter with deepfake pornography.
While discussing legislation with aides, she came across an AI-generated image depicting her in a sexually explicit act.
"There's a shock to seeing images of yourself that someone could think are real," Ocasio-Cortez told Rolling Stone. "As a survivor of physical sexual assault, it adds a level of dysregulation. It resurfaces trauma."
Ocasio-Cortez is spearheading the House version of the Disrupt Explicit Forged Images and Non-Consensual Edits (DEFIANCE) Act of 2024, which aims to make it easier for victims of nonconsensual AI-generated pornography to pursue legal action.
The harm is more real than some people believe, she said at the time, adding that it profoundly affects not only the victims but also those who view and engage with the content.
Breeze Liu, a survivor of online image abuse, shared her harrowing experience with The New York Times in April.
After discovering a video of herself on Pornhub, Liu felt an overwhelming sense of despair, contemplating suicide due to the shame and violation she felt. It was one of the most devastating moments of her entire life, Liu recounted. Despite the removal of the original video, deepfakes of her continued to circulate, worsening her trauma.
Liu's struggle reflects the larger issue of tech companies' complicity in the proliferation of such content.
Nicholas Kristof, a columnist for The Times, criticized the role of search engines like Google and Bing in facilitating access to deepfake content. These companies effectively profit from the victimization, Kristof argued, suggesting that more stringent regulations and amendments to Section 230 of the Communications Decency Act could hold tech companies more accountable.
The TAKE IT DOWN Act and the DEFIANCE Act seek to establish a legal framework that penalizes the creation and distribution of such content and empowers victims to seek justice.