KIDS Act

Mar 3, 2026

Full breakdown available

This page provides a high-level overview of this bill. For a full list of provisions, line-item appropriations, and specific funding allocations, please view our detailed breakdown.

Summary

Requires online platforms, games, and chatbots to add safety features, parental controls, and age-checks to protect minors from harmful content and interactions.

What problem does this solve?

Children and teens face many dangers online, including harmful content, addictive features, and contact from strangers, often without adequate protection. This bill makes online companies responsible for creating safer spaces for minors by requiring parental controls, age-checks, and limits on dangerous features.

What does this bill do?

Requires age-checks for harmful content
Requires websites where more than a third of the content is sexually harmful to minors to use technology to verify a user's age and block access for kids.
Creates new safeguards and parental tools
Requires online platforms to give parents tools to manage their child's privacy settings, screen time, and purchases, and to limit who can contact them.
Bans disappearing messages for minors
Prohibits online platforms from offering features like disappearing or "ephemeral" messages to users under 17.
Stops market research on kids
Forbids online platforms from conducting market or product-focused research on users they know are minors, with some exceptions for safety improvements.
Adds safety controls to online games
Requires online video game providers to give parents safeguards to limit who their child can communicate with in the game.
Regulates AI chatbots for minors
Requires AI chatbots to tell minor users they are not human and to provide suicide prevention resources if a user discusses self-harm.
Mandates independent audits
Forces online platforms to undergo annual third-party audits to check their compliance with child safety rules and report on metrics like time spent by minors.
Establishes a Kids Internet Safety Partnership
Creates a new partnership led by the Secretary of Commerce to research and publish best practices for keeping kids safe online.

Who does this affect?

  • Minors (individuals under 17)
  • Parents and legal guardians
  • Online platform and technology companies

What is the real world impact?

Free speech and privacy concerns
Requiring age verification could lead to collecting more personal data on all users, not just kids. Critics argue that broad content restrictions could limit free speech and access to information for both minors and adults.
Protecting children online
Aims to create a safer internet for minors by holding online platforms accountable for the content and features they provide, giving parents more control and reducing exposure to harmful material.
Burdens on tech companies
The extensive requirements for audits, new features, and content moderation could be costly and technically difficult for smaller platforms to implement, potentially favoring larger, established companies.

When does this start?

This bill has multiple start dates for its different sections, generally taking effect one year after it becomes law.
General effective date
Most provisions of the Act will take effect one year after the date it is signed into law.
Age-checks for harmful content
Platforms with sexually harmful material must implement age verification measures within one year of the bill becoming law.
First platform audit
Online platforms must conduct their first independent, third-party audit no later than 18 months after the bill becomes law.
Safe messaging rules
The rules for safe messaging features for kids will take effect 180 days after the bill becomes law.
Ban on market research
The prohibition on conducting market research on minors takes effect 90 days after the bill becomes law.