The US Senate unanimously passed a bill on Tuesday designed to hold accountable those who make or share deepfake porn. The Disrupt Explicit Forged Images and Non-Consensual Edits Act (DEFIANCE Act) would allow victims to sue those who create, share or possess AI-generated sexual images or videos using their likeness. The issue took root in the public consciousness after infamous Taylor Swift deepfakes circulated among online lowlifes early this year.
The bill would let victims sue for up to $150,000 in damages. That figure rises to $250,000 if the conduct is related to attempted sexual assault, stalking or harassment.
It now moves to the House, where a companion bill sponsored by Rep. Alexandria Ocasio-Cortez (D-NY) awaits. If it passes there (which seems likely, given the Senate’s unanimous vote), it will head to President Biden’s desk to be signed into law.
“There’s a shock to seeing images of yourself that someone could think are real,” Ocasio-Cortez told Rolling Stone earlier this year. “And once you’ve seen it, you’ve seen it. It parallels the same exact intention of physical rape and sexual assault, [which] is about power, domination, and humiliation. Deepfakes are absolutely a way of digitizing violent humiliation against other people.”
The bill, sponsored by Senator Dick Durbin (D-IL), lets victims of intimate digital forgeries (deepfakes) sue for damages. It would give victims a 10-year statute of limitations, beginning either when they discover the content or, in the (even more disturbing) case of minors, when they turn 18.
“As we know, AI plays a bigger role in our lives than ever before, and while it has many benefits, it’s also easier than ever to create sexually explicit deep fakes without a person’s consent,” Senate Majority Leader Chuck Schumer (D-NY) said on the Senate floor late Tuesday. “It is a horrible attack on someone’s privacy and dignity to have these fake images of them circulating online without recourse.”
Schumer cited Swift and Megan Thee Stallion in his floor speech as two celebrities who have fallen victim to the kind of content the bill targets. However, The Verge notes that online sexual deepfakes have also affected people with far less clout (and money for lawyers) than A-list pop stars, including high school girls who have discovered fabricated sexual images of themselves being passed around among their peers.
Fortunately, the bill stipulates that victims would have privacy protections during court proceedings and that they could recover legal costs. “It’s a grotesque practice and victims of these deep fakes deserve justice,” Schumer said.