CHAT Act
The CHAT Act would require providers of companion AI chatbots to implement strict age-verification and minor-protection measures. Covered entities would have to require user accounts, verify the ages of both existing and new account holders, and classify each user as a minor or an adult. If a user is determined to be a minor, the entity must link the minor’s account to a verified parental account, obtain verifiable parental consent before the minor can use the chatbot, notify the parent of interactions involving suicidal ideation, and block the minor’s access to any chatbot that engages in sexually explicit communication.

The act also requires ongoing safety monitoring for suicidal ideation, provision of resources to users and parents, and a clear disclosure that the chatbot is an AI rather than a human at the start of each interaction and at least every 60 minutes thereafter. Data collected for age verification must be protected, and a safe-harbor provision shields entities that act in good faith and follow industry standards.

The FTC would enforce the act, and state attorneys general could bring civil actions as well. The act would take effect one year after enactment.