Kids Online Safety Act
The Kids Online Safety Act would require large online platforms used by minors to adopt safety measures designed to reduce harm and curb problematic online use, particularly among younger users. The bill establishes a framework of platform duties (design requirements, parental tools, and disclosures) aimed at limiting compulsive use, safeguarding personal data, and reducing exposure to harmful content and advertising. It also introduces transparency requirements, an independent auditing process, and studies on age verification, and it creates a separate track (Filter Bubble Transparency) to address how platforms may influence what content users see. Notably, the bill distinguishes between a “child” (under 13) and a “minor” (under 17) for purposes of certain protections, and it carves out exemptions for schools, libraries, government services, and certain communications services.

If enacted, the measure would significantly shape how platforms interact with younger users, potentially increasing safety controls, parental oversight, and public reporting while also raising compliance costs and prompting design changes for platforms both large and small. The breadth of topics covered, from default privacy protections and parental controls to age-verification research and content transparency, could affect user experiences, business models, and cross-border data practices.