A federal bill that would allow victims of nonconsensual sexually explicit deepfakes to sue people who create, share and receive them has unanimously passed the Senate and now moves to the House for a vote.
The Disrupt Explicit Forged Images and Non-Consensual Edits (Defiance) Act of 2024, introduced by Senate Judiciary Chair Dick Durbin, D-Ill., and Sen. Lindsey Graham, R-S.C., would create a federal civil remedy for identifiable victims of deepfake sexual abuse.
Rep. Alexandria Ocasio-Cortez, D-N.Y., who is sponsoring the legislation in the House, called the act the first federal protection for “survivors of nonconsensual deepfake pornography.”
“Over 90% of all deepfake videos made are nonconsensual sexually explicit images, and women are the targets 9 times out of 10,” Ocasio-Cortez said in a statement after the Senate passed the bill on Thursday. She has previously spoken out about being targeted with such deepfakes herself.
The House version of the bill is still being considered in committee.
The term deepfake typically refers to digitally manipulated media that falsely depicts someone saying or doing something. Deepfakes often take the form of sexually explicit photos and videos spread online, frequently merging a victim’s face with a body in a pornographic video. Generative artificial intelligence models can also create audio, videos and images that are entirely fake but look and sound realistic.
The Defiance Act defines a nonconsensual sexually explicit deepfake as a “visual depiction created through the use of software, machine learning, artificial intelligence, or any other computer-generated or technological means to falsely appear to be authentic” that depicts the victim “in the nude, or engaged in sexually-explicit conduct or sexual scenarios.”
The civil remedy would be enforceable against people who create or possess the deepfake with intent to distribute it, as well as people who distribute or receive it while knowing, or recklessly disregarding, that the victim did not consent to it. The statute of limitations for the remedy is 10 years.
The production of nonconsensual sexually explicit deepfakes has skyrocketed since 2023, at first targeting the likenesses of female public figures like influencers, politicians and celebrities. Cases have also sprung up at middle and high schools around the world, with teen girls frequently being victimized by their male classmates. While the trend has overwhelmingly targeted women and girls, men have also been depicted in deepfakes.
After sexually suggestive AI-generated images of singer Taylor Swift went viral on X, formerly Twitter, in a high-profile incident, lawmakers introduced state and federal legislation to combat the issue. Durbin has been particularly vocal on the issue, sending a letter in June to Sundar Pichai, CEO of Google’s parent company Alphabet, asking for details on how the search engine planned to address the proliferation of deepfakes in its search results.
“Current laws don’t apply to deepfakes, leaving women and girls who suffer from this image-based sexual abuse without a legal remedy,” Durbin posted on X after the Defiance Act passed the Senate. “It’s time to give victims their day in court and the tools they need to fight back.”