LegisTrack
HR 5272 · 119th Congress · In Committee

Protect Elections from Deceptive AI Act

Introduced: Sep 10, 2025
Sponsor: Rep. Julie Johnson [D-TX-32]
Civil Rights & Justice · Technology & Innovation
Standard Summary

The Protect Elections from Deceptive AI Act would amend the Federal Election Campaign Act to prohibit the distribution of materially deceptive AI-generated audio or visual media about federal candidates before an election. The bill defines what counts as deceptive AI media, who is covered, and the intent required for a violation (intent to influence an election or to solicit funds). It carves out narrowly tailored exceptions for bona fide news broadcasts, newspapers and magazines, and satire or parody. Individuals or entities that distribute such deceptive media in violation could face civil actions seeking injunctions, damages, and attorney's fees, with a heightened standard of proof (clear and convincing evidence). The act also treats a violation as defamation per se for purposes of defamation law. In short, the bill targets "deepfake"-style content used to mislead voters, allows certain traditional media to publish such content with clear disclosures, and provides civil remedies to individuals whose likeness or voice is misrepresented in this context.

Key Points

  • Prohibition and scope: Prohibits knowingly distributing materially deceptive AI-generated audio or visual media of a covered individual (a candidate for federal office) prior to an election, if done with intent to influence the election or solicit funds.
  • Definitions:
    - Covered individual: a candidate for federal office.
    - Deceptive AI-generated media: content created or altered by AI/machine learning that appears authentic and would mislead a reasonable person about the appearance, speech, or conduct of the candidate.
  • Exceptions:
    - Bona fide news broadcasts (radio, TV, streaming) that include a clear disclosure that the media's authenticity is in question;
    - Newspapers, magazines, or general-audience online publications that include a clear statement that the media does not accurately represent the candidate's speech or conduct;
    - Satire or parody.
  • Civil actions and remedies:
    - Injunctive or other equitable relief to stop distribution;
    - General or special damages, plus attorney's fees and costs for the prevailing party;
    - The plaintiff bears the burden of proof by clear and convincing evidence.
  • Defamation link: A violation is treated as defamation per se for purposes of defamation actions.
  • Procedural/structural notes: Includes a severability clause; the act amends Title III of FECA.

Impact Areas

Primary group/area affected:
  • Federal candidates and their campaigns, who would be protected from deceptive AI-generated depictions, along with entities distributing election-related media.

Secondary group/area affected:
  • Media organizations (news broadcasters, newspapers, magazines, online news outlets) that publish AI-generated media, which must carry the required disclosures to fall within the exemptions.

Additional impacts:
  • A civil liability framework for misrepresentation of a candidate's voice or likeness in the context of federal elections, with potential chilling effects or compliance burdens for content producers, platforms, and advertisers;
  • Interaction with existing defamation law (violations treated as defamation per se under this act);
  • Possible First Amendment considerations, given limits on speech about political figures, though the bill's targeted exceptions for news and satire and its disclosure requirements aim to mitigate overreach.
Generated by gpt-5-nano on Oct 8, 2025