FEC Should Reverse Dangerous Decision To Not Regulate Deepfakes
Today, Axios reported that the Federal Election Commission (FEC) will not propose any new rules for the use of AI-generated deepfakes in political advertising this year. The news comes over a year after Public Citizen petitioned the agency for rulemaking on the issue.
Robert Weissman, co-president of Public Citizen, released the following statement in response:
“A decision by the FEC not to regulate political deepfakes would be a shameful abrogation of its responsibilities. The idea expressed by FEC Chair Sean Cooksey that the FEC should wait for deceptive fraud to occur and study its consequences before acting to prevent the fraud is preposterous.
“Political deepfakes are rushing at us, threatening to disrupt electoral integrity. They have been used widely around the world and are starting to surface in the United States. And while social media platforms have some good rules in place, Elon Musk’s recent posting of a political deepfake is a reminder that platforms cannot be trusted to self-regulate. Requiring that political deepfakes be labeled doesn’t favor any political party or candidate. It simply protects voters from fraud and chaos.
“The FEC is the nation’s election protection agency, and it has authority to regulate deepfakes as part of its existing authority to prohibit fraudulent misrepresentations. It should have acted on this issue long ago, before Public Citizen petitioned for rulemaking. When we did petition, the agency should have promptly acted to put a rule in place. It still could and should reverse the wrongheaded decision that Chair Cooksey has said is imminent, and act to protect voters and our elections.
“The FEC’s refusal to do its job underscores the need for Congressional action, the importance of state action — already 20 state legislatures have acted to prevent deepfake chaos — and the need for the Federal Communications Commission to push forward with its proposal for an AI disclosure standard for political ads on TV and radio.”