TAKE IT DOWN Act
The TAKE IT DOWN Act would add federal criminal penalties for the intentional publication of nonconsensual intimate visual depictions and their digital forgeries, and it would require publicly accessible platforms that primarily host user-generated content ("covered platforms") to create and operate a formal notice-and-removal process for such depictions. The bill defines key terms (consent, identifiable individual, intimate visual depiction, digital forgery, minor, interactive computer service) and creates separate offenses for authentic depictions and digital forgeries, with distinct provisions for adults and minors. It also gives the Federal Trade Commission authority to treat noncompliance with the removal requirements as an unfair or deceptive practice, while shielding platforms from liability for good-faith removals. Platforms would have one year to establish the removal process and would have to remove material within 48 hours of receiving a valid removal request. In short, the bill aims to crack down on nonconsensual, sexually explicit content and deepfake-style images by criminalizing publication, mandating platform takedowns, and empowering the FTC to enforce compliance.
Key Points
- New criminal offenses (under the Communications Act) for the intentional disclosure of nonconsensual intimate visual depictions and of digital forgeries, with separate provisions for adults and minors and specific harm-based criteria.
- Penalties: up to 2 years in prison for offenses involving adults and up to 3 years for offenses involving minors, plus fines. The bill also creates forfeiture and restitution provisions for violations.
- Platform duty to act: within one year of enactment, covered platforms must establish a process that allows identifiable individuals (or their authorized representatives) to notify the platform of nonconsensual depictions and request removal, using signed requests with sufficient information to locate the depictions.
- Removal timeline and scope: once a valid removal request is received, platforms must remove the depiction within 48 hours and make reasonable efforts to remove known identical copies.
- Liability and protections: platforms are not liable for good-faith takedown actions based on information indicating that the material was unlawfully published; the bill spells out specific limitations on liability.
- FTC enforcement: failure to comply with the notice-and-removal requirements would be treated as an unfair or deceptive practice, and the FTC would enforce the provisions as it does other consumer protection laws, with authority to seek penalties and remedies, including against non-profit organizations.
- Definitions and scope: "covered platform" is defined as a public-facing website or app that primarily hosts user-generated content, with several exclusions such as broadband providers and sites consisting primarily of non-user-generated, preselected content.