Senate Republican Blocks Durbin's Attempt to Tackle Nonconsensual, Sexually-Explicit Deepfakes

Sen. Cynthia Lummis (R-WY) objected to passage of the bipartisan bill, which would give power to victims and hold those responsible to account

WASHINGTON – U.S. Senate Majority Whip Dick Durbin (D-IL), Chair of the Senate Judiciary Committee, requested unanimous consent (UC) today on the Senate floor to pass his bipartisan Disrupt Explicit Forged Images and Non-Consensual Edits Act of 2024 (DEFIANCE Act), legislation that would hold accountable those responsible for the proliferation of nonconsensual, sexually-explicit “deepfake” images and videos.

Despite broad support for the legislation, U.S. Senator Cynthia Lummis (R-WY) objected.

Key Quotes from Durbin:

“The spread of these deplorable deepfakes is like a fire burning out of control.”

“What used to take extraordinary technological expertise and a lot of time can now be done at the push of a button. Countless apps can swap someone’s face onto another person’s body, or can digitally remove someone’s clothing. These apps are often advertised as harmless entertainment. But when explicit images are produced and shared without the consent of the person depicted, the harm is very real.”

“The negative consequences to the victims can be profound. Victims may silence themselves, withdrawing from online spaces and public discourse as a protective measure. They may endure threats to their employment, education, or reputation, or suffer additional criminal activity such as extortion and stalking. Some experience depression, anxiety, and a fear of being in public. And in the worst-case scenario, victims are driven to suicide.”

“I have been proud to partner with New York Congresswoman Alexandria Ocasio-Cortez, who introduced this legislation in the House of Representatives with four Republican and four Democratic cosponsors. As you can see, in both the Senate and the House, this is a bipartisan measure … Sadly, Congresswoman Ocasio-Cortez herself is a victim of explicit deepfakes. I commend her for working to create tools for victims in the fight against this despicable conduct.”

“Rep. Ocasio-Cortez recently described her own reaction to being depicted in sexual deepfakes without her consent. She said, ‘There’s a shock to seeing images of yourself that someone could think are real.’ She described how it resurfaced trauma and haunts her thoughts. Once these deepfakes are seen, they cannot be unseen. As she put it, ‘deepfakes are a way of digitizing violent humiliation against other people.’”

“In March of this year, at least 22 students at Richmond-Burton High School in McHenry County in my home state of Illinois learned they were depicted in deepfakes circulating online. One of the images was a doctored version of a photo of two female students taken at the school prom. The perpetrator digitally removed their clothes to make it appear the girls were unclothed. The prom is supposed to be a joyous rite of passage for teenagers. A happy memory kept for the rest of a person’s life. And now that memory has been stolen from these two young women.”

“Time and again, victims are told that nothing can be done to help them, because existing laws simply do not apply to deepfakes. This is not just a gap in the law. It is an omission that shows blatant disregard for the trauma of children, women, and girls who are victimized by this crime. But my DEFIANCE Act will change that and will give victims their day in court.”

“Once this bill is signed into law, victims finally will have the ability to hold civilly liable those who produce, disclose, solicit, or possess sexually-explicit deepfakes, while knowingly or recklessly disregarding that the person depicted did not consent to the conduct.”

“Congress has waited too long to act … It is past time to give victims of nonconsensual, sexually-explicit deepfakes the tools they need to fight back.”

Following Durbin’s statement, Lummis objected.

Durbin responded, saying: “I’m disappointed. Seriously disappointed. We talk about these young women and these young children being exploited, and we have bipartisan legislation before the House and the Senate. It’s important that this is characterized properly … The two issues raised by the gentlelady from Wyoming are both addressed in this bipartisan measure.”

Durbin continued, “There are people who will shake their head and say ‘Can't the Senate even address this issue – the sexual exploitation of children and young girls and attempts to ruin their lives? Can't they even agree to come up with an answer?’ We did. We have a bill that does it, and it's been stopped. We're not going to stop our efforts … This is a cause worth fighting for.”

The volume of “deepfake” content available online is increasing exponentially as the technology used to create it has become more accessible to the public. The overwhelming majority of this material is sexually explicit and is produced without the consent of the person depicted. A 2019 study found that 96 percent of deepfake videos were nonconsensual pornography.

One researcher found that:

  • The number of nonconsensual pornographic deepfake videos available online has increased ninefold since 2019;
  • Such videos have been viewed almost four billion times;
  • Monthly traffic to the top 20 deepfake sites increased by 285 percent from July 2020 to July 2023; and
  • Search engines directed 25.2 million visits to the top five most popular deepfake sites in July 2023 alone.

The bill creates a federal civil remedy for victims who are identifiable in a “digital forgery,” which is defined as a visual depiction created through the use of software, machine learning, artificial intelligence, or any other computer-generated or technological means to falsely appear to be authentic. 

The bill is supported by the Center for Democracy and Technology, the National Center on Sexual Exploitation, the Sexual Violence Prevention Association, the National Women’s Law Center, My Image My Choice, PACT, Rights4Girls, and others.

Video of Durbin’s statement and rebuttal is available here.

Audio of Durbin’s statement and rebuttal is available here.

Footage of Durbin’s statement and rebuttal is available here for TV Stations.

A one-pager of the legislation can be found here.