Senate Approves Bill Targeting Sexually Explicit Deepfake Content

The Senate unanimously passed the Disrupt Explicit Forged Images and Non-Consensual Edits Act (DEFIANCE Act) on Tuesday, empowering victims of nonconsensual intimate images created by AI, commonly known as “deepfakes,” to sue their creators for damages. The legislation allows victims of sexually explicit deepfakes to seek civil remedies against those who produced or possessed the images with the intent to distribute them.

Victims who are identifiable in these deepfakes can receive up to $150,000 in damages, with potential awards rising to $250,000 if the incident is linked to actual or attempted sexual assault, stalking, or harassment. The bill now awaits consideration in the House before it can head to the president’s desk to be signed into law.

The issue gained notable media attention earlier this year when explicit AI-generated images of Taylor Swift circulated widely on social media. Concern has also grown in schools, where high school girls have reported discovering intimate AI-generated images of themselves being shared among peers.

In a Senate floor speech, Majority Leader Chuck Schumer (D-NY) underscored that sexually explicit deepfakes are not a fringe issue but a widespread concern that can devastate lives. He emphasized that the DEFIANCE Act represents a crucial step in establishing necessary “guardrails” for AI technology. “AI can spearhead remarkable innovation, but we must implement protections to prevent its most harmful abuses,” Schumer stated, outlining a strategy for Senate committees to follow regarding AI legislation.

He called for the House to prioritize the DEFIANCE Act, noting that a companion bill is already in progress. With just a week and a half left before the August recess, Schumer remarked, “By passing this bill, we are signaling to victims of explicit nonconsensual deepfakes that we hear them and are taking action.”
