KOSA’s Constitutionality Concerns:

Do KOSPA’s Edits Fix the Issues?

December 12, 2024

Jessica Arciniega, Katherine Kalpos, Morgan Sexton, and Amelia Vance


CC BY-NC 4.0

As the 118th Congress nears its end, PIPC has been closely tracking the likelihood of federal child privacy protections becoming law. As you may recall, the Senate passed the Kids Online Safety and Privacy Act (KOSPA) in July, incorporating two major child privacy bills–the Kids Online Safety Act (KOSA) and the Children and Teens’ Online Privacy Protection Act (COPPA 2.0). The House Energy & Commerce Committee passed versions of KOSA and COPPA 2.0 in September that were very different from what passed the Senate, raising questions about whether these bills could become law before the next Congressional session begins in January 2025. On December 7th, Senator Blackburn released a new version of KOSPA that attempts to resolve many of the concerns about KOSA that have held up the bill. While it is unlikely KOSPA will move by the end of the year, we anticipate renewed debate over the bill early next year.

KOSPA would meaningfully advance and revise child privacy protections. For example, COPPA 2.0 would raise the age of individuals entitled to foundational privacy protections under COPPA from under 13 to under 17. Additionally, KOSA would implement more safeguards to protect children online and expand society’s current understanding of how social media affects minors, both by requiring crucial research and by creating the expert Kids Online Safety Council. However, KOSA in particular has faced significant critiques, including concerns that the bill could violate the First Amendment by, among other issues, leading companies to over-censor speech in order to avoid violating the law.

There were significant changes to KOSA’s Duty of Care in the new draft from Senator Blackburn. Here is our redline comparing this section of KOSA (as passed by the Senate in a prior version of KOSPA in July) to the new KOSPA language, with text deleted from the Senate-passed version shown in ~~strikethrough~~:

SEC. 102. DUTY OF CARE.

(a) Prevention of Harm to Minors.–A covered platform shall exercise reasonable care in the creation and implementation of any design feature to prevent and mitigate the following harms to minors where a reasonable and prudent person would agree that such harms were reasonably foreseeable by the covered platform and would agree that the design feature is a contributing factor to such harms:

(1) Consistent with evidence-informed medical information, the following mental health disorders: ~~anxiety, depression,~~ eating disorders, substance use disorders, and suicidal behaviors.

(2) Depressive disorders and anxiety disorders when such conditions have objectively verifiable and clinically diagnosable symptoms and are related to compulsive usage.

(~~2~~ 3) Patterns of use that indicate or encourage ~~addiction-like behaviors by minors.~~ compulsive usage.

(~~3~~ 4) Physical violence~~, online bullying, and harassment of the minor~~ or online harassment activity that is so severe, pervasive, or objectively offensive that it impacts a major life activity of a minor.

(~~4~~ 5) Sexual exploitation and abuse of minors.

(~~5~~ 6) ~~Promotion and marketing~~ Distribution, sale, or use of narcotic drugs (as defined in section 102 of the Controlled Substances Act (21 U.S.C. 802)), tobacco products, cannabis products, gambling, or alcohol.

(~~6~~ 7) Predatory, unfair, or deceptive marketing practices, or other financial harms caused by unfair or deceptive acts or practices (as defined in section 5(a)(4) of the Federal Trade Commission Act (15 U.S.C. 45(a)(4))).

(b) ~~Limitation~~ Rules of Construction.–

(1) Nothing in subsection (a) shall be construed to require a covered platform to prevent or preclude any minor from–

(~~1~~ A) deliberately and independently searching for, or specifically requesting, content; or

(~~2~~ B) accessing resources and information regarding the prevention or mitigation of the harms described in subsection (a).

(2) Nothing in this section shall be construed to allow a government entity to enforce subsection (a) based upon the viewpoint of users expressed by or through any speech, expression, or information protected by the First Amendment to the Constitution of the United States.

KOSA’s duty of care requires platforms to “exercise reasonable care in the creation and implementation of any design feature” to prevent and mitigate certain harms to minors. Concerns about KOSA’s potential to enable broad online censorship have been a key obstacle to its passage, with two primary arguments driving the debate.

Argument 1: The types of harms companies must protect minors from are overly broad and ill-defined.

Beyond the difficulty of determining exactly how to prevent and mitigate the listed harms, there is also concern that government officials charged with enforcing KOSA would interpret what causes specific mental health harms differently depending on their political leanings. Opponents fear that–rather than risk enforcement from regulators on either side of the political spectrum–platforms will preemptively censor any content that could be considered objectionable by either side, leaving minors without access to important information.

KOSPA’s Changes: 

  • Narrowing the types of harms – KOSPA narrowed the types of harms that platforms must consider when creating and implementing design features, providing more certainty for platforms. The most notable change is the requirement that depressive disorders and anxiety disorders must now be “related to compulsive usage,” focusing on the harms that can come from using a platform extensively (regardless of the type of content being displayed).
  • Adding a more concrete standard for when platforms must mitigate harms – KOSPA specifies that “a reasonable and prudent person” must agree that harms were “reasonably foreseeable” by the covered platform, helping to clarify when companies must protect minors from the identified harms and guarding against the risk of subjective enforcement actions.

KOSPA’s changes may have limited impact in addressing opponents’ concerns. For example, the narrowing of the harms only partially addresses critics’ concern because companies still have a duty to protect minors from a series of amorphous harms that are closely related to content, such as eating disorders, substance use disorders, and suicidal behaviors. And while the addition of “reasonable and prudent person” and “reasonably foreseeable” appears aimed at reducing the fear of subjective decision making, such language may play only a limited role.

Argument 2: KOSA’s duty of care requires platforms to protect children from harmful content, which is likely facially unconstitutional.

While KOSA is intended to place obligations on companies with regard to their use of design features, opponents argue that it indirectly requires companies to restrict the content shown to minors. This is in part because KOSA’s duty of care treated personalized recommendation systems as regulated design features. Additionally, the requirement to prevent and mitigate harms related to certain types of speech, such as online harassment, has been criticized as an attempt to restrict minors’ access to harmful, though constitutionally protected, speech.


KOSPA’s Changes: 

  • Removing “personalized recommendation systems” from the definition of “design feature” – KOSPA replaced “personalized recommendation system” with “personalized design feature” in the definition of “design feature,” shifting platform responsibilities away from regulating the algorithms that promote content.
  • Design features must be a “contributing factor” to the harm – KOSA’s duty of care now clarifies that the harms to be considered must be connected to the design feature at issue. This places the emphasis on how the specific design contributes to harming the minor, rather than on the minor merely being exposed to harmful content (content that is generally constitutionally protected speech and, in most cases, cannot be subjected to government censorship).

As raised in this article from Tech Policy Press: “while content algorithms directly implicate First Amendment-protected speech, design features, properly understood, do not.” KOSPA partially addresses this issue by replacing the term “personalized recommendation system” with “personalized design feature.” However, “personalized design feature” is defined as an automated system, “including a recommendation system” (emphasis added). Because this new term could be read to encompass personalized recommendation systems, it is not yet clear whether the change fully resolves the concerns about regulating personalized recommendation systems as design features.

Closing Thoughts

While we are in no way constitutional law experts, we think these changes are steps in the right direction toward addressing existing concerns that KOSA may violate the First Amendment rights of individuals and platforms through the over-censoring of speech online. We will be paying close attention to what constitutional law scholars say as they assess whether KOSPA’s changes sufficiently address the First Amendment concerns with KOSA.