Anti-deepfake legislation just took a major step toward becoming law
New anti-deepfake legislation, known as the Disrupt Explicit Forged Images and Non-Consensual Edits (Defiance) Act, has passed a Senate vote with unanimous consent, pushing the first of potentially many AI-focused regulations one step closer to federal law.
The bipartisan Defiance Act grants victims the right to sue individuals who "knowingly produce, distribute, or receive" nonconsensual sexually explicit digital forgeries. It was introduced in the Senate by Judiciary Chair Dick Durbin and Republican senator Lindsey Graham, but Democratic representative and co-leader Alexandria Ocasio-Cortez has become the public face of the legislation.
"Today marks an important step in the fight to protect survivors of nonconsensual deepfake pornography," wrote Ocasio-Cortez in a statement following the Senate vote. "Over 90 percent of all deepfake videos made are nonconsensual sexually explicit images, and women are the targets 9 times out of 10. The DEFIANCE Act would guarantee federal protections for survivors of nonconsensual deepfake pornography for the first time..."
Ocasio-Cortez herself has repeatedly been the subject of synthetic forgeries, as have several of her political colleagues. Just this week, a manipulated video of Vice President Kamala Harris — delivering a speech that never actually happened — recirculated on TikTok, racking up millions of views despite being debunked multiple times over the last year. And new reports from UK watchdogs found that child sexual abuse material has proliferated online with the help of digital forgeries created by AI.
While the Defiance Act provides a civil path toward remediation for those depicted in deepfakes, many victims still hope to see criminal repercussions for creators and distributors of nonconsensual synthetic forgeries. If such efforts follow the same legislative path as real nonconsensual pornography (also known as revenge porn), however, this may be left up to state law. The federal government has yet to establish criminal liability for nonconsensual pornography, but it provided a civil path, much like the Defiance Act, through the 2022 reauthorization of the Violence Against Women Act.
In June, Senator Ted Cruz introduced the Take It Down Act, legislation aiming to criminalize the publication of both synthetic and real nonconsensual intimate imagery and to outline penalties for tech companies that fail to remove such content within 48 hours. The White House has taken similar aim at tech players for their role in the proliferation of deepfakes.
The Defiance Act is still being considered in the House and will be voted on at a later date.