LegisTrack
S 146 · 119th Congress · Became Law

TAKE IT DOWN Act

Introduced: Jan 16, 2025
Civil Rights & Justice · Technology & Innovation
Standard Summary

The TAKE IT DOWN Act adds federal criminal penalties for the intentional publication of nonconsensual intimate visual depictions and their digital forgeries, and requires large, publicly accessible platforms ("covered platforms") to create and operate a formal notice-and-removal process for such depictions. The act defines key terms (consent, identifiable individual, intimate visual depiction, digital forgery, minor, interactive computer service) and creates separate offenses for authentic depictions and for digital forgeries, treating adults and minors differently. It also gives the Federal Trade Commission authority to enforce the removal-process requirements as unfair or deceptive practices, while shielding platforms from liability for good-faith removals. Platforms have one year to establish the removal process and a 48-hour window to remove material after receiving a valid removal request. In short, the law cracks down on nonconsensual, sexually explicit content and deepfake-style images by criminalizing publication, mandating platform takedowns, and empowering the FTC to enforce compliance.

Key Points

  1. New criminal offenses (under the Communications Act) for intentional disclosure of nonconsensual intimate visual depictions and for digital forgeries, with separate provisions for adults and minors and specific harm-based criteria.
  2. Penalties: up to 2 years in prison for offenses involving adults; up to 3 years for offenses involving minors; plus fines. The act also creates forfeiture and restitution provisions related to violations.
  3. Platform duty to act: within one year, covered platforms must establish a process allowing identifiable individuals (or their authorized representatives) to notify the platform of nonconsensual depictions and request removal, with signed requests and sufficient information to locate the depictions.
  4. Removal timeline and scope: once a valid removal request is received, platforms must remove the depiction within 48 hours and make reasonable efforts to remove known identical copies.
  5. Liability and protections: platforms are not liable for good-faith takedown actions based on information indicating unlawful publication; specific limitations on liability are included.
  6. FTC enforcement: violations of the notice-and-removal requirements are treated as unfair or deceptive practices; the FTC enforces the provisions in the same manner as other consumer protection laws, with authority to apply penalties and remedies, including against non-profit organizations.
  7. Definitions and scope: defines "covered platform" as a public-facing website or app that primarily hosts user-generated content (with several exclusions, such as broadband providers and certain sites serving non-user-generated or preselected content).
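The request-validation and 48-hour deadline logic from points 3 and 4 can be sketched in code. This is a hypothetical illustration of a platform-side workflow; the class, field, and function names below are invented for the example and do not come from the bill's statutory text.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta

# Per the bill, a platform must remove a depiction within 48 hours
# of receiving a valid removal request.
REMOVAL_WINDOW = timedelta(hours=48)

@dataclass
class RemovalRequest:
    """Illustrative model of a notice-and-removal request."""
    signed: bool          # the bill requires a signed request
    locator_info: str     # information sufficient to locate the depiction
    received_at: datetime

def is_valid(req: RemovalRequest) -> bool:
    """A request is actionable only if it is signed and locatable."""
    return req.signed and bool(req.locator_info.strip())

def removal_deadline(req: RemovalRequest) -> datetime:
    """Deadline for removal: 48 hours after a valid request is received."""
    return req.received_at + REMOVAL_WINDOW

req = RemovalRequest(
    signed=True,
    locator_info="https://example.com/post/123",  # hypothetical URL
    received_at=datetime(2025, 7, 1, 9, 0),
)
assert is_valid(req)
print(removal_deadline(req))  # 2025-07-03 09:00:00
```

Real compliance systems would also need to handle authorized representatives, good-faith statements, and the duty to remove known identical copies, none of which this sketch attempts to model.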

Impact Areas

Primary group/area affected
  • Individuals depicted in nonconsensual intimate imagery (adults and minors), including their rights, safety, protection against harm, and avenues for removal.
  • Victims of intimate image abuse and those harmed by deepfake-like depictions.

Secondary group/area affected
  • Covered platforms (social media, video/image sharing services, forums) that host user-generated content; they must implement the removal process, verify requests, and complete takedowns within 48 hours.
  • Platform compliance teams and legal/privacy teams responsible for implementing the notice-and-removal system and for handling requests and potential disputes.

Additional impacts
  • Law enforcement and legal professionals, who interact with the bill's exemptions and permitted disclosures (e.g., investigations, legal proceedings, medical or educational purposes).
  • The tech and digital rights communities, which may analyze implications for free expression, privacy, and platform governance.
  • Public health and consumer protection: strengthening responses to online exploitation and abusive content while raising questions about due process, appeal rights, and potential over- or under-enforcement.
  • Potential costs or barriers for smaller platforms in implementing the required process, given the one-year deadline and 48-hour removal requirement.

Key Definitions

  • Covered platform: a public-facing site or app that hosts user-generated content, or one that regularly publishes user-created content, which includes nonconsensual intimate depictions.
  • Nonconsensual intimate depiction: a sexually explicit or intimate image or depiction published without the subject's consent.
  • Digital forgery: AI- or technology-generated imagery that imitates an actual person and could be mistaken for an authentic depiction.
  • Identifiable individual: someone who appears in the image with distinguishing features or a recognizable face.
  • Consent: affirmative, voluntary permission free from coercion.
  • Exemptions: certain contexts (e.g., law enforcement, legal proceedings, medical education) and content involving prohibited material such as child pornography remain governed by other laws.
Generated by gpt-5-nano on Oct 7, 2025