Bill Targeting Deepfake Porn Unanimously Passes Senate

Legislation introduces federal criminal penalties for distributing non-consensual intimate imagery.

Photo of Sen. Ted Cruz, R-Texas, by Gage Skidmore.

WASHINGTON, Dec. 4, 2024 – A bipartisan bill to combat the spread of AI-generated, or deepfake, revenge pornography online unanimously passed the Senate Tuesday.

The TAKE IT DOWN Act, co-authored by Sens. Ted Cruz, R-Texas, and Amy Klobuchar, D-Minn., would make it unlawful to knowingly publish non-consensual intimate imagery, including deepfake imagery, that depicts nude and explicit content in interstate commerce.

Websites would be required to implement procedures to remove non-consensual intimate imagery within 48 hours of a victim's request and take reasonable steps to prevent the spread of such content. The legislation would also empower victims to seek justice without requiring costly or retraumatizing civil actions.

“For young victims and their parents, these deepfakes are a matter requiring urgent attention and protection in law,” Cruz said in a statement. “I will continue to work with my colleagues in Washington to move this common-sense bipartisan legislation quickly through the House and to the President’s desk so it can be signed into law.”

The legislation comes as victims like 14-year-old Elliston Berry of Aledo, Texas, continue to grapple with the emotional and social fallout of AI-manipulated images. In October 2023, Berry discovered that a fellow high school student used AI to create fake explicit photos of her and eight other girls, superimposing their faces onto nude images sourced through a smartphone app. Despite warrants and pleas to platforms like Snapchat, her mother, Anna McAdams, said their attempts to remove the images were met with silence – until Sen. Cruz intervened.

Within an hour, the images were taken down. "It should not take a sitting senator getting on the phone to take these down," Cruz said.

The bill’s passage in the Senate comes just after 26 House members, led by Reps. Debbie Dingell, D-Mich., and August Pfluger, R-Texas, on Nov. 26 urged seven tech companies to take stronger measures against the proliferation of deepfake pornography on their platforms.

The TAKE IT DOWN Act garnered endorsements from nearly 90 organizations, including major tech companies like Google, Microsoft, Meta, and TikTok, alongside victim advocacy groups such as the Rape, Abuse & Incest National Network, known as RAINN, and the National Center for Missing and Exploited Children. Law enforcement groups, including the National Fraternal Order of Police, have also supported the legislation.

“We must provide victims of online abuse with the legal protections they need when intimate images are shared without their consent, especially now that deepfakes are creating horrifying new opportunities for abuse,” Klobuchar said in a statement.

Most states have laws addressing deepfake pornography, but inconsistent enforcement leaves victims vulnerable. The TAKE IT DOWN Act would make knowingly publishing non-consensual intimate imagery a federal crime, carrying criminal penalties that include potential prison time.
