House Improves KOSA, but Major Problems Persist for Schools
May 2024
Katherine Kalpos, Morgan Sexton, and Amelia Vance
CC BY-NC 4.0
Introduction
On April 9, 2024, Representative Bilirakis released the text of the House version of Senator Blumenthal’s Kids Online Safety Act (KOSA). The bill is grounded in valuable goals and principles, and the House version includes many positive revisions that better align the text with KOSA’s underlying goal of protecting kids online. However, concerns remain that it may create major unintended consequences for schools. In particular, KOSA may limit the ability of schools to effectively implement edtech in instruction and may allow students to circumvent adaptive learning experiences. This blog covers both sides of KOSA, highlighting its accomplishments while also pinpointing its problem areas and exploring ways to fix them.
Background: For a brief overview of the requirements in KOSA, check out this summary on Congress.gov.
Celebrating KOSA's Positives
Multiple promising elements in the House version of KOSA have the potential to positively impact the way children experience the internet.
1. KOSA is based on laudable goals that should be embedded into US privacy law.
KOSA aims to bolster privacy protections for children–a commendable goal that should be incorporated into federal privacy legislation.
Children are Uniquely Susceptible to Harms Online
Negative online experiences may affect children differently than other populations due to their unique vulnerabilities and the potential for long-term consequences that come with privacy violations. Children are at a stage in life where they are still developing their sense of self and understanding of the world. As such, their online experiences can greatly impact their mental health as well as their futures. For example, because technology may create a permanent record of childhood mistakes on the internet, children may face obstacles in their personal and professional lives later on. For more details on how children are uniquely vulnerable to privacy harms, check out my prior written testimony.
KOSA acknowledges and addresses this reality by implementing specific safeguards by design and by default to protect children while they are using the internet. Ideally, such measures will help ensure children’s safety and well-being in online environments.
Additional Safeguards for Children
Children need more protections online, and KOSA would help turn this idea into reality. However, while KOSA’s dedication to ensuring that there are additional safeguards by default for children online is admirable, KOSA alone will not make the internet a safer place for children. To better protect children’s privacy and safety online, there must first be baseline protections for everyone. The protections for children in the UK’s Age-Appropriate Design Code, for example, are built on foundational privacy protections for everyone under the General Data Protection Regulation, a comprehensive framework that the US lacks. The US should take a similar approach.
Rather than moving forward on its own, KOSA should accompany comprehensive privacy protections for everyone to create additional safeguards for children. This approach would address current gaps in sectoral privacy laws, ensuring a consistent baseline of protection for all individuals and eliminating concerns about children outgrowing privacy rights. Our ideal outcome would be for Congress to pass comprehensive federal privacy legislation (such as the American Privacy Rights Act discussion draft) that directly incorporates, or is supplemented by, additional protections for children (including certain safeguards in KOSA, the Children and Teens’ Online Privacy Protection Act (COPPA 2.0), and California’s Age-Appropriate Design Code).
2. The House version uses more precise terms and definitions to better align with KOSA’s intended scope.
The House version of KOSA tailors its scope by utilizing precise terminology and definitions that reflect the bill's original intent, ultimately enhancing the overall effectiveness of the bill. The following provides examples of these definitional changes, along with explanations of why they are steps in the right direction.
Defining “High Impact Online Company” & New Knowledge Standards
The Senate version of KOSA used the same knowledge standard for all covered entities: “actual knowledge or knowledge fairly implied on the basis of objective circumstances.” (§2(6)).
The House version introduces more nuance, using different knowledge standards based on who the entity is (see the table below, based on §101(6)). Using different knowledge standards based on an entity’s size and activities acknowledges that larger platforms have greater capacity to actively analyze their user base and holds them accountable for doing so. This approach also considers the limitations faced by smaller platforms with less expertise and resources, ensuring that they are not unfairly burdened.
| Entity | Knowledge Standard |
| --- | --- |
| High impact online company | Knew or should have known |
| Not a high impact online company, but meets certain revenue and processing thresholds | Knew or acted in willful disregard |
| Everyone else | Actual knowledge |
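To make the tiered structure concrete, here is a minimal Python sketch of how a platform might determine which knowledge standard applies to it. The field names and the two-part threshold test are our own illustrative assumptions, not language from the bill, which defines these tiers in §101(6).

```python
from dataclasses import dataclass

@dataclass
class Entity:
    # Whether the entity meets the bill's "high impact online company" definition
    is_high_impact: bool
    # Hypothetical stand-ins for the bill's revenue and data-processing thresholds
    meets_revenue_threshold: bool
    meets_processing_threshold: bool

def knowledge_standard(entity: Entity) -> str:
    """Map an entity to the knowledge standard tier described in §101(6)."""
    if entity.is_high_impact:
        return "knew or should have known"
    if entity.meets_revenue_threshold and entity.meets_processing_threshold:
        return "knew or acted in willful disregard"
    return "actual knowledge"

# Example: a mid-sized platform that meets both thresholds
print(knowledge_standard(Entity(False, True, True)))
# -> knew or acted in willful disregard
```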
Adding Exclusions to “Personalized Recommendation System”
The House version added the following exclusions to the definition of personalized recommendation system:
(ii) technical means that do not fully automate or replace human decision-making processes;
(iii) technical means that are designed to block, detect, identify, or prevent a user from accessing inappropriate, unlawful, or harmful content; or
(iv) technical means designed to prevent or detect fraud, malicious conduct or other illegal activity, or preserve the integrity or security of systems, products, or services.
§101(14)(B). These exclusions are important to ensure that platforms are not discouraged from taking proactive steps to protect children by keeping a human in the loop, engaging in age-appropriate content moderation, or safeguarding their systems against illegal or malicious activity.
Narrowing "Compulsive Usage"
The Senate version of KOSA defines compulsive usage as “any response stimulated by external factors that causes an individual to engage in repetitive behavior reasonably likely to cause psychological distress, loss of control, anxiety, or depression.” (§2(2), emphasis added). This definition is likely too broad because it would be nearly impossible for covered platforms to accurately predict what will cause particular individuals to experience psychological distress.
In the House version, the definition of compulsive usage has been narrowed to “any response stimulated by external factors that causes an individual to engage in repetitive behavior reasonably likely to cause a mental health disorder.” (§101(2), emphasis added). As noted in an article by Tech Policy Press:
This refers to a clinical definition by the American Psychiatric Association compared to the Senate version of the bill which broadly refers to “psychological distress.”
This update may also appeal to LGBTQ advocates who are concerned the bill is overly broad and could be used to limit content related to gender identity. Updated language also removes “addiction” references, replacing them with the clinical term, “compulsive usage.” One of the House co-sponsors, Rep. Kim Schrier (D-WA), is a pediatrician and the APA is actively engaged with the legislation.
While we appreciate that the House is attempting to narrow the definition, we caution that changing this to “mental health disorder” may narrow it too much. Instead, we suggest specifying that the term applies when the repetitive behavior is “reasonably likely to cause psychological distress, loss of control, anxiety, or depression in a significant number of children or minors” (suggested new language: “in a significant number of children or minors”). This would help ensure that covered platforms do not believe the definition applies in situations where nearly all individuals experience no harm but a few individuals uniquely predisposed to psychological distress or repetitive behavior do. To prevent covered platforms from treating this language as a loophole, the bill could also add language noting that the FTC can offer guidance on this specific point.
Adding Exclusions to “Online Platform”
In the House version, an exclusion to the definition of online platform was added for “chats, comments, or other interactive functionalities of the community forum that is incidental to the predominant purpose of the website, online service, online application, or mobile application.” (§101(10)(C)). In doing so, the definition of online platform is refined to specifically target sites that primarily offer interactive features. This crucial revision aims to limit the scope of the definition to social media platforms, as originally intended, and helps avoid inadvertently covering the vast majority of the internet.
Note: While this definitional change is a step in the right direction, there are still concerns about KOSA’s broad definition of the term “covered platform” and how it may impact collaborative edtech platforms. These concerns are detailed below.
Adding Types of “Design Features”
In the House version, the definition of design feature has been expanded to also encompass “push alerts that urge a user to spend more time engaged with the platform when they are not actively using it” and “badges or other visual award symbols based on elevated levels of engagement with the platform.” (§101(4)(D) and (E)). This addition helps ensure that platforms cannot use design techniques that are particularly enticing for children to keep them online and engaged.
Room for Improvement
Despite these beneficial changes, KOSA remains flawed, though easily fixable. As currently written, both the House and Senate versions of KOSA may have major unintended consequences for schools.* Below are our top two concerns, along with potential solutions and key context on why implementing KOSA without addressing these issues could harm schools. These two issues would need to be addressed before we could change our current position opposing the bill.
*While these concerns apply to both the House and Senate versions of KOSA, all citations in the following analysis will refer to the House version of KOSA in an effort to avoid confusion.
1. KOSA may limit the ability of schools to effectively implement edtech in instruction and to establish privacy-protective safeguards for its use.
How this Comes into Play
KOSA’s definition of “covered platform” includes an “online video game” (§101(3)(A)), and the definition of “online video game” includes “an educational video game” (§101(11), emphasis added). This will likely lead to confusion regarding KOSA’s application to edtech vendors, who are already subject to contractual restrictions. The inclusion of “educational video game” may cause particular issues in light of the parental tools section (§103), which provides that parents shall have the ability to “manage a minor’s account settings.”
Implementation Challenges
Allowing parents to manage account settings on edtech platforms their child uses at school may throw the educational system into chaos, overriding established privacy protections in schools’ contracts with edtech vendors and leading to significant data loss. For example, account settings generally include the ability to expand data collection, close an account, and delete data. Giving parents the ability to manage such account settings on edtech used in school may allow parents to consent to additional data collection beyond the limits set by the contract between the school and the edtech provider, delete their child’s progress and learning milestones, or even delete their child’s homework. Such significant changes would undermine schools’ ability to establish privacy-protective safeguards for edtech products and may also hinder their ability to consent to the use of edtech in the first place.
Existing Exceptions Don’t Fix this Issue
Unfortunately, the exception excluding schools from the definition of “covered platform” under KOSA does not sufficiently address this concern. The exception states:
The term “covered platform” does not include—...(iii) any public or private preschool, elementary, or secondary school, or any institution of vocational, professional, or higher education.
§101(3)(B)(iii). This exception does not include a carve-out for service providers or other third parties contracting with or for educational agencies and institutions. By including “an educational video game” in the definition of “online video game,” the law will very likely be interpreted to regulate third-party edtech vendors whose products are used in (as opposed to operated by) schools, and it will therefore indirectly regulate, and cause problems for, schools.
Additionally, KOSA’s current rules of construction saying that the law should not be interpreted to preempt the Family Educational Rights and Privacy Act (FERPA) or the Children’s Online Privacy Protection Act (COPPA) (see §113(a)(1) and (2)) do not fix the issue because this interpretation would not directly contradict either law.
Suggested Solution: Clearly delineate between technologies used at school and other technology uses.
Add the following paragraph to §103(b):
(6) APPLICATION TO EDUCATION.–The parental tools provided to parents in Section 103 should be provided by a Covered Platform to educational agencies or institutions when the Covered Platform is acting on behalf of an educational agency or institution pursuant to a written contract that complies with the requirements of the Children’s Online Privacy Protection Act and the Family Educational Rights and Privacy Act.
2. KOSA may allow students to circumvent adaptive learning experiences that are part of the core curriculum.
How this Comes into Play
The scope of Title II, which establishes “Filter Bubble Transparency” requirements, is notably broader than the rest of KOSA. The Title’s broad definition of “Online Platform” may encompass edtech that schools use pursuant to a written contract with the edtech provider that complies with FERPA and COPPA.
What is Required
Title II focuses on algorithmic transparency and establishes new rules for most websites and apps that use “opaque algorithms” (§201(7)) to order content. Under Title II, an “Online Platform” (§201(6)) utilizing opaque algorithms in automated content selection or ranking decisions would need to: (1) inform users that opaque algorithms are being used, (2) provide an alternative option that uses only “input-transparent algorithms,” and (3) let users easily switch between the two options (see §202 for more details).
Opaque Algorithms (§201(7))
Algorithms that select or rank content using personal data that was not provided by the user while using the platform. Opaque algorithms may filter content based on information acquired from other sources or on inferences about the user.
Input-Transparent Algorithms (§201(5))
Algorithms that either (1) do not use user-specific data in the content selection or ranking process or (2) use only data provided by the user while using the platform. Input-transparent algorithms may filter content based on specific search terms entered by the user or display results chronologically.
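To illustrate the distinction, below is a minimal Python sketch of the two ranking modes Title II contemplates: one that ranks content using only input the user supplied on the platform (a search query, ordered chronologically), and one that ranks content using inferred interest data the user never directly provided. The function names and data fields are illustrative assumptions; the bill does not prescribe any particular implementation.

```python
from datetime import datetime

posts = [
    {"text": "New study on sleep", "topics": {"health"},
     "posted_at": datetime(2024, 4, 1)},
    {"text": "Sleep tips for teens", "topics": {"health", "teens"},
     "posted_at": datetime(2024, 3, 15)},
]

def rank_input_transparent(posts, query):
    """Input-transparent mode: uses only data the user provided while
    using the platform (an explicit search query), then falls back to
    reverse-chronological order."""
    matches = [p for p in posts if query.lower() in p["text"].lower()]
    return sorted(matches, key=lambda p: p["posted_at"], reverse=True)

def rank_opaque(posts, inferred_interests):
    """Opaque mode: uses data the user did not provide on the platform,
    such as an interest profile inferred from outside sources."""
    return sorted(posts,
                  key=lambda p: len(inferred_interests & p["topics"]),
                  reverse=True)

print(rank_input_transparent(posts, "sleep"))
print(rank_opaque(posts, {"teens"}))
```

Under §202, a platform relying on something like rank_opaque would have to disclose that fact, offer a mode like rank_input_transparent, and let users switch easily between the two.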
Suggested Solution: Exclude technologies used at school from the Filter Bubble Transparency requirements (Title II).
Add an exception that requires operators to provide the safeguards described in §202(b) to the educational agencies or institutions, rather than to the users directly, when operators are “acting on behalf of an educational agency or institution subject to a written contract that complies with the requirements of the Children’s Online Privacy Protection Act and the Family Educational Rights and Privacy Act.”
Add an exception that requires operators to provide the safeguard described in §103(a)(1)(C) to the educational agencies or institutions, rather than to the minor directly, when operators are “acting on behalf of an educational agency or institution subject to a written contract that complies with the requirements of the Children’s Online Privacy Protection Act and the Family Educational Rights and Privacy Act.”
Closing Thoughts
There must be a continuing conversation about how KOSA may impact schools. While the House version of KOSA makes important revisions that better align with KOSA’s intended scope, provisions that may have unintended consequences for schools remain. In order for schools to continue offering students a technology-enhanced education, KOSA must distinguish between technologies used for educational purposes and those used for other purposes, and it must exclude certain edtech products from the Filter Bubble Transparency requirements. This does not need to create giant loopholes: both of these clarifications can be limited to situations where a privacy-protective written contract is in place between the school and the technology provider. KOSA can be tailored to protect children online while still allowing for effective implementation of edtech in classrooms, but to do that there must be collaboration with education stakeholders and consideration of their concerns.
For more information on our concerns and to see other ways KOSA can be improved, check out our detailed KOSA comments here.