This page contains a categorized breakdown of the provisions of the KIDS Act (H.R. 7757). For a high-level summary and broader context, see the overview page.
Crime and Safety
Age verification for platforms with harmful sexual content
Requires platforms on which more than one-third of the content is sexual material harmful to minors to use age verification technology to prevent minors from accessing that material.
Platform responsibility to address harms to minors
Mandates that online platforms establish and enforce reasonable policies to address harms to minors, including severe threats, sexual exploitation, and the promotion of illegal substances.
Prohibition on ephemeral messaging for minors
Prohibits online platforms from offering or enabling any ephemeral (disappearing) messaging features to users known to be minors.
Prohibition on direct messaging for children under 13
Prohibits online platforms from offering or enabling any direct messaging features to users known to be under the age of 13.
Safeguards for minors on social gaming platforms
Requires online video game providers to give parents tools to limit communication for minor users, with the most protective settings enabled by default, and to restrict purchases and game time.
Report on fentanyl access on social media
Requires a federal report on how minors access fentanyl via social media, the impact of this access, and recommendations for Congress to eliminate the problem.
Economy and Commerce
Prohibition on advertising harmful products to minors
Prohibits online platforms from facilitating the advertising of narcotic drugs, cannabis, tobacco products, gambling, or alcohol to users known to be minors.
Prohibition on market research on minors
Prohibits covered platforms from conducting market or product-focused research on users known to be minors, unless the research is conducted to improve platform safety or to comply with legal requirements.
Education and Research
Study on social media's impact on minors
Directs the FTC and HHS to study the effects of social media use on minors, covering data collection, algorithms, advertising, and mental health, and to report findings to Congress.
National online safety awareness campaign
Directs the FTC to establish a national program to promote safe internet use by minors through public awareness campaigns, educational outreach, and the promotion of best practices.
Government Operations
GAO report on technology verification measures
Requires the Comptroller General to submit a report to Congress within three years on the effectiveness of, compliance with, and societal effects of the required technology verification measures.
Annual audits of online platforms
Requires covered online platforms to undergo an annual independent, third-party audit to assess risks to minors and compliance with safety requirements, with results reported to the FTC and the public.
Preemption of state laws on messaging for minors
Prohibits states from enacting laws that ban ephemeral or direct messaging features for users under 13 or that regulate parental controls for teen users on covered platforms.
Kids Internet Safety Partnership
Establishes a partnership to identify online risks and benefits for minors and publish a playbook of best practices for websites and apps on topics like age verification and parental tools.
Enforcement by FTC and state attorneys general
Authorizes the Federal Trade Commission and state attorneys general to enforce the Act's provisions, treating violations as unfair or deceptive practices.
Health
Regulation of AI chatbots for minors
Prohibits AI chatbots from falsely claiming to be licensed professionals and requires them to disclose that they are AI and to provide suicide prevention resources to minor users when prompted.
Study on chatbot impact on minors' mental health
Directs the National Institutes of Health to conduct a four-year study evaluating the risks and benefits of chatbot use for the mental health of minors.
Social Services
Mandatory safeguards and parental controls for minors
Requires online platforms to provide minors with safeguards, such as limits on communications and on compulsive design features, and to offer parents tools to manage privacy, purchases, and time spent online.
Parental controls for teen direct messaging
Requires platforms to provide parents of teen users (ages 13-16) with controls to manage direct messaging, including approving contacts and disabling the feature.

