Kids Online Safety Act Returns

June 2, 2025

Jessica Arciniega, Katherine Kalpos, Morgan Sexton, and Amelia Vance


CC BY-NC 4.0

The fight for a safer online experience for kids is back on the table. On May 15th, Senators Blackburn and Blumenthal reintroduced the Kids Online Safety Act (KOSA), bringing back the same proposal from December 2024. We've been down this road before, meticulously tracking every twist and turn, but this year it's time for a clean slate. This blog cuts through the noise to lay out KOSA's core components and their potential impact on minors, parents, and the platforms they use.

Recap: What is KOSA?

KOSA requires covered platforms to design their products in a way that (1) protects minors from harm; and (2) gives parents and minors additional controls over the minor’s online experiences.

Scope 

Title I of KOSA applies to “covered platforms” that are used or “reasonably likely to be used” by minors. This includes any website or application that “predominantly provides a community forum for user generated content.”

Additionally, many of KOSA’s provisions apply only where the covered platform “knows” that a user is a minor. A covered platform has knowledge if it has “actual knowledge or knowledge fairly implied on the basis of objective circumstances.” 

Title II of KOSA, which establishes “Filter Bubble Transparency” requirements, is notably broader than the rest of KOSA. Title II focuses on algorithmic transparency and establishes new rules for most websites and apps that use “opaque algorithms” to order content. Covered platforms that use opaque algorithms must (see the illustrative sketch after this list):

  • Develop a version of their interface that displays content in chronological order without algorithmic manipulation;
  • Create accessible mechanisms for users to switch between algorithmic and non-algorithmic content display; and
  • Ensure that non-algorithmic options maintain full functionality of the platform.
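
To make these requirements more concrete, here is a minimal, hypothetical sketch of what a compliant feed toggle could look like in code. Nothing below comes from the bill itself; KOSA does not prescribe any implementation, and every name (FeedItem, FeedMode, and so on) is our own illustrative assumption.

```typescript
// Hypothetical sketch only: KOSA does not prescribe any implementation.
// All names and defaults here are illustrative assumptions.

type FeedMode = "algorithmic" | "chronological";

interface FeedItem {
  id: string;
  postedAt: Date;         // user-supplied data; drives the chronological ordering
  relevanceScore: number; // output of the platform's "opaque" ranking algorithm
}

class Feed {
  // Assumption: start in the non-algorithmic view.
  private mode: FeedMode = "chronological";

  constructor(private items: FeedItem[]) {}

  // The "accessible mechanism" for switching between orderings.
  setMode(mode: FeedMode): void {
    this.mode = mode;
  }

  // Both modes return the same full set of items, so the
  // non-algorithmic option keeps the platform's full functionality.
  render(): FeedItem[] {
    const items = [...this.items];
    return this.mode === "chronological"
      ? items.sort((a, b) => b.postedAt.getTime() - a.postedAt.getTime())
      : items.sort((a, b) => b.relevanceScore - a.relevanceScore);
  }
}

// Example: the chronological mode ignores relevanceScore entirely and
// orders purely by when content was posted.
const feed = new Feed([
  { id: "a", postedAt: new Date("2025-05-01"), relevanceScore: 0.9 },
  { id: "b", postedAt: new Date("2025-05-15"), relevanceScore: 0.2 },
]);
console.log(feed.render().map((item) => item.id)); // ["b", "a"]
```

The design point worth noticing is that the chronological path relies only on user-supplied data (here, the post timestamp) and never touches the opaque relevance score.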

Key Provisions of KOSA Title I

Duty of Care

A duty of care is at the heart of KOSA. Covered platforms must “exercise reasonable care” when implementing “design features” to prevent and mitigate foreseeable harms to minors. These harms include: 

  • Mental health harms including eating disorders, substance use disorders, and suicidal behaviors, as well as depression and anxiety disorders that have “objectively verifiable and clinically diagnosable symptoms;”
  • Compulsive use patterns;
  • Online harassment and violence;
  • Sexual exploitation and abuse;
  • Drug, alcohol, or gambling use; and
  • Financial harms from unfair or deceptive acts or practices.

The duty of care provision has undergone countless revisions in efforts to mitigate constitutional concerns while still protecting minors.* The current KOSA language retains the duty of care first proposed in December, which specifies that “a reasonable and prudent person” must agree that harms were “reasonably foreseeable” by the covered platform and that the design feature contributed to the harms. 

KOSA’s duty of care pertains to “design features,” defined as features that “encourage or increase the frequency, time spent, or activity of minors on the covered platform,” including:

  • Infinite scrolling or autoplay;
  • Rewards or incentives based on the frequency, time spent, or activity of minors on the covered platform;
  • Notifications and push alerts;
  • Badges or other visual award symbols based on the frequency, time spent, or activity of minors on the covered platform;
  • Personalized design features;
  • In-game purchases; or
  • Appearance-altering filters.

*As a reminder: Last year, constitutional concerns about KOSA’s potential to enable broad online censorship were a key obstacle to its passage. Two primary arguments drove the debate:

  • Argument 1: The types of harms companies must protect minors from were overly broad and ill-defined.
  • Argument 2: KOSA’s duty of care required platforms to protect children from harmful content, which was likely facially unconstitutional.

For more information on how the latest version of KOSA has been adapted in response to First Amendment concerns, see PIPC’s blog post.

Tools and Safeguards

KOSA requires covered platforms to offer certain tools and safeguards that give minors greater control over their online experiences. The same protections apply regardless of whether a user is a child or a teen, but the parental tools differ slightly when the user is a child. Covered platforms must provide both minors and parents with the following control features:

  • Limit contact and data sharing (control who can reach minors and view their data);
  • Restrict addictive design patterns (limit autoplay and infinite scroll);
  • Opt out of personalized content recommendations;
  • Restrict geolocation sharing; and 
  • Set time limits for platform use.

These safeguards must be configured to offer the most protective level of control by default, meaning covered platforms must maximize privacy and safety from the outset rather than requiring users to opt into such protections.
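
As a rough sketch of what “most protective by default” might mean in practice, consider the hypothetical settings object below. The specific fields and values are our assumptions for illustration; the bill does not specify a settings schema.

```typescript
// Hypothetical sketch only: KOSA does not specify a settings schema.
// Field names and default values are illustrative assumptions.

interface MinorSafeguards {
  whoCanContact: "no-one" | "approved-contacts" | "everyone";
  autoplayEnabled: boolean;
  infiniteScrollEnabled: boolean;
  personalizedRecommendations: boolean;
  shareGeolocation: boolean;
  dailyTimeLimitMinutes: number | null; // null would mean no limit
}

// Most protective values are applied from the outset; the minor (or a
// parent) may later relax them, rather than having to opt in to safety.
const DEFAULT_MINOR_SAFEGUARDS: MinorSafeguards = {
  whoCanContact: "no-one",
  autoplayEnabled: false,
  infiniteScrollEnabled: false,
  personalizedRecommendations: false,
  shareGeolocation: false,
  dailyTimeLimitMinutes: 60, // assumption: some finite default limit
};
```

The point is the direction of the defaults: every setting starts at its most restrictive value, and any relaxation is an explicit, later choice.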

Covered platforms must provide parents with additional control features to:

  • Restrict purchases;
  • Limit usage; and
  • View privacy settings.
    • If the user is a child, parents must be able to modify privacy settings.

Special Circumstance: Schools

When a covered platform acts on behalf of a school pursuant to a contract that complies with the Children’s Online Privacy Protection Act (COPPA) and the Family Educational Rights and Privacy Act (FERPA), it must offer the tools and safeguards to the school (rather than to the user or their parents). This provision mitigates previous concerns that parental controls may offer less protection than privacy settings negotiated by schools, and clarifies that schools, rather than parents, bear the burden of controlling platform privacy settings for educational services.

Transparency

KOSA requires covered platforms to provide additional transparency measures to parents, minors, and the public more generally. Covered platforms must disclose:

  • Clear information about available safeguards, taking into consideration the differing ages, capacities, and developmental needs of minors; and
  • Details about personalized recommendation systems, including how those systems use personal data and how minors and parents can opt out.

Very large platforms (those with more than 10 million monthly active users) face additional transparency requirements. These covered platforms must issue public reports, based on independent third-party audits, detailing usage metrics. Reports must also include:

  • The measures used to mitigate identified risks; and 
  • Processes used to create design features.

Research 

KOSA places the following restrictions on how covered platforms can conduct research involving minors:

  • Research on children (users under age 13) is entirely prohibited; and
  • Research on other minors requires verifiable parental consent (although it is not clear how that will work, given that minors over 13 are not protected by COPPA).

Beyond research restrictions, KOSA also mandates certain research and advice. Specifically, the FTC must conduct a study on age verification methods and their impact on privacy. The bill also establishes the Kids Online Safety Council, which brings together diverse stakeholders (including researchers, parents, educators, and industry representatives) to advise on emerging risks and best practices.

Preemption

KOSA does not preempt state laws with stronger protections.

Enforcement

KOSA provides both the Federal Trade Commission (FTC) and state attorneys general with authority to enforce the law. 

Moving Forward

We are going to keep tracking this bill as it moves through Congress and analyzing changes as they develop. Stay tuned for additional updates!