Comparing Provisions in KOSMA and KOSA

The Kids Off Social Media Act (KOSMA) and the Kids Online Safety Act (KOSA) are progressing through Congress, both with the shared objective of protecting children online. KOSA attempts to achieve this goal with broad requirements aimed at making the platforms minors use safer. KOSMA, on the other hand, aims to protect children from social media in three main ways:

  1. Prohibiting minors under age 13 from creating or maintaining social media accounts;
  2. Prohibiting social media companies from targeting content to minors using algorithms; and
  3. Requiring schools to block and filter social media platforms.

KOSMA is a combination of Senator Schatz’s Protecting Kids on Social Media Act and Senator Cruz’s Eyes on the Board Act. Title I of KOSMA incorporates Senator Schatz’s bill and applies directly to social media companies. Title II of KOSMA incorporates Senator Cruz’s bill and applies to schools, rather than social media companies. Most of our comparisons focus on Title I because it, more than Title II, contains provisions that overlap with KOSA.


For more information on Title II, see our other blog posts. To learn more about KOSA, see this blog post.

Below, we provide comparisons between some specific provisions in KOSMA and KOSA, illuminating the potential impacts and challenges of each.

Covered Platforms

One of the biggest differences between KOSA and KOSMA is which entities are covered. KOSA applies to a much broader set of entities: those that are used, or are “reasonably likely to be used,” by a minor. A large swath of internet platforms would fall within this definition and would be required to comply with KOSA’s provisions. KOSMA, on the other hand, covers only “social media platforms.” While the definition of “social media platform” differs between Title I and Title II (see our other blog), KOSMA covers a much narrower subset of internet platforms than KOSA by applying only to social media platforms.

It is important to note that, because “social media platforms” are included in KOSA’s definition of “covered platforms,” any social media platform covered by KOSMA would also need to comply with KOSA’s provisions. 

KOSA (Senate 2/15): “Covered platform” means an online platform, online video game, messaging application, or video streaming service that connects to the internet and that is used, or is reasonably likely to be used, by a minor.

KOSMA Title I (5/1): “Social media platform” means a public-facing website, online service, online application, or mobile application that— (i) is directed to consumers; (ii) collects personal data; (iii) primarily derives revenue from advertising or the sale of personal data; and (iv) as its primary function provides a community forum for user-generated content, including messages, videos, and audio files among users, where such content is primarily intended for viewing, resharing, or platform-enabled distributed social endorsement or comment.

KOSMA Title II (5/1): “Social media platform”— (i) means any website, online service, online application, or mobile application that— (I) serves the public; and (II) primarily provides a forum for users to communicate user-generated content, including messages, videos, images, and audio files, to other online users.

Safeguards

KOSA and KOSMA take two different approaches to safeguarding minors online. KOSMA simply prohibits children (under 13) from creating or maintaining social media accounts and requires platforms to delete any account the platform knows belongs to a child. KOSA, in contrast, requires platforms to provide certain protections and controls to minor users.

KOSMA’s prohibition on children’s accounts conflicts with KOSA’s requirement to provide such children controls on those accounts. Social media platforms covered by both KOSA and KOSMA would be required to prohibit children from opening or maintaining accounts. As a result, any of KOSA’s provisions that apply only to children (such as enabling parental tools by default) would effectively not apply to social media platforms, because under KOSMA children are not permitted to have accounts.

KOSA (Senate 2/15):

Minor and Parent tools:

(A) limit the ability of other users or visitors to communicate with the minor;

(B) prevent other users or visitors, whether registered or not, from viewing the minor’s personal data collected by or shared on the covered platform, in particular restricting public access to personal data;

(C) limit design features that encourage or increase the frequency, time spent, or activity of minors on the covered platform;

(D) control personalized recommendation systems; and

(E) restrict the sharing of the geolocation of the minor and provide notice regarding the tracking of the minor’s geolocation.

Must provide users with the option to (a) delete the minor’s account and delete any personal data collected from, or shared by, the minor on the covered platform; or (b) limit the amount of time spent by the minor on the covered platform.

KOSMA (5/1):

No children under 13:

(A) a social media platform shall not permit an individual to create or maintain an account or profile if it knows that the individual is a child;

(B) a social media platform shall terminate any existing account or profile of a user who the social media platform knows is a child; and

(C) upon termination of an existing account, a social media platform shall immediately delete all personal data collected from the user.

- - - - -

KOSA (Senate 2/15):

Parental tools:

(A) manage a minor’s privacy and account settings in a manner that allows parents to—

(i) view the privacy and account settings; and

(ii) change and control the privacy and account settings (for children);

(B) restrict purchases and financial transactions by the minor; and

(C) view metrics of total time spent on the covered platform and restrict time spent.

KOSMA (5/1):

Access: a user of an existing account that has been terminated may request a copy of their personal data within 90 days of termination.

Algorithms

Both KOSA and KOSMA place restrictions on the use of “personalized recommendation systems” and have nearly identical definitions – “a fully or partially automated system used to suggest, promote, or rank content, including other users, hashtags, or posts, based on the personal data of users.”

Where these bills diverge is in the restrictions they place on personalized recommendation systems. KOSMA outright prohibits social media platforms from using the personal data of known children and teens in a personalized recommendation system to display content. KOSA takes a different approach, requiring covered platforms to “provide tools to control personalized recommendation systems.” Minors must have the option to “opt out of such personalized recommendation systems” or “limit types or categories of recommendations.”

These provisions would create conflicting mandates: a social media platform would simultaneously be prohibited from using personalized recommendation systems for known children and teens and required to provide minors tools to control such systems.

Definition

KOSA (Senate 2/15): “Personalized recommendation system” means a fully or partially automated system used to suggest, promote, or rank content, including other users, hashtags, or posts, based on the personal data of users. A recommendation system that suggests, promotes, or ranks content based solely on the user’s language, city or town, or age shall not be considered a personalized recommendation system.

KOSMA (5/1): “Personalized recommendation system” means a fully or partially automated system used to suggest, promote, or rank content, including other users or posts, based on the personal data of users.
Restriction

KOSA (Senate 2/15): A covered platform must provide tools to control personalized recommendation systems, including the ability for a minor to have at least 1 of the following options—

(i) opt out of such personalized recommendation systems, while still allowing the display of content based on a chronological format; or

(ii) limit types or categories of recommendations from such systems;

- - - - -

A platform that uses opaque algorithms must provide transparency and the ability to switch between the opaque algorithm and an input-transparent algorithm.

KOSMA (5/1): A social media platform shall not use the personal data of a user or visitor in a personalized recommendation system to display content if the platform knows that the user or visitor is a child or teen.

- - - - -

Exception: A social media platform may use a personalized recommendation system to display content to a child or teen if the system only uses the following personal data of the child or teen:

(A) The type of device used by the child or teen.

(B) The languages used by the child or teen to communicate.

(C) The city or town in which the child or teen is located.

(D) The fact that the individual is a child or teen.

(E) The age of the child or teen

SCOPE

Covered Entities

KOSA (Senate 2/15): “Covered platform” means an online platform, online video game, messaging application, or video streaming service that connects to the internet and that is used, or is reasonably likely to be used, by a minor.

KOSMA Title I (5/1): “Social media platform” means a public-facing website, online service, online application, or mobile application that—

(i) is directed to consumers;

(ii) collects personal data;

(iii) primarily derives revenue from advertising or the sale of personal data; and

(iv) as its primary function provides a community forum for user-generated content, including messages, videos, and audio files among users where such content is primarily intended for viewing, resharing, or platform-enabled distributed social endorsement or comment

- - - - -

Exclusions: educational information, experiences, training, or instruction provided to build knowledge, skills, or a craft; district-sanctioned or school-sanctioned learning management systems and school information systems used by schools to convey content related to the education of students; or services on behalf of or in support of an elementary school or secondary school, as such terms are defined in section 8101 of the Elementary and Secondary Education Act of 1965 (20 U.S.C. 7801).

- - - - -

KOSMA Title II (Eyes on the Board, 5/1):

“Social media platform”—

(i) means any website, online service, online application, or mobile application that—

(I) serves the public; and

(II) primarily provides a forum for users to communicate user-generated content, including messages, videos, images, and audio files, to other online users

(ii) does not include an online service, application, or website—

(aa) that is non-commercial and primarily designed for educational purposes; and

(bb) the revenue of which is not primarily derived from advertising or the sale of personal data;

Age

KOSA (Senate 2/15):

“child” means an individual who is under the age of 13

- - - - -

“minor” means an individual who is under the age of 17

KOSMA (5/1):

“child” means an individual under the age of 13

- - - - -

“teen” means an individual over the age of 12 and under the age of 17

Thresholds

KOSA (Senate 2/15): N/A

- - - - -

Note (House version): “high impact online company” means an online platform or online video game that provides any internet-accessible platform where—

(A) such online platform or online video game generates $2,500,000,000 or more in annual revenue, including the revenue generated by any affiliate of such covered platform; or

(B) such online platform or online video game has 150,000,000 or more global monthly active users for not fewer than 3 of the preceding 12 months on the online product or service of such covered platform; and

(C) such online platform or online video game constitutes an online product or service that is primarily used by users to access or share, user-generated content.

KOSMA (5/1): N/A
Knowledge Standard

KOSA (Senate 2/15): “know” or “knows” means to have actual knowledge or knowledge fairly implied on the basis of objective circumstances

- - - - -

The FTC must issue guidance on how the Commission would determine whether a covered platform “had knowledge fairly implied on the basis of objective circumstances”

KOSMA (5/1): “know” or “knows” means to have actual knowledge or knowledge fairly implied on the basis of objective circumstances.

- - - - -

In making the determination whether a social media platform has “knowledge fairly implied on the basis of objective circumstances,” the Commission shall rely on competent and reliable evidence, taking into account the totality of the circumstances, including whether a reasonable and prudent person under the circumstances would have known that the user is a child or teen

REQUIREMENTS

Data Minimization

KOSA (Senate 2/15): N/A

KOSMA (5/1): If a social media platform or a third party acting on behalf of a social media platform voluntarily collects personal data for the purpose of complying with this title, the social media platform or third party shall not—

(1) use any personal data collected specifically for a purpose other than for sole compliance with the obligations under this title; or

(2) retain any personal data collected from a user for longer than is necessary to comply with the obligations under this title or than is minimally necessary to demonstrate compliance with this title

Data Protection Assessments

KOSA (Senate 2/15): N/A (see public report)

KOSMA (5/1): N/A
DPA Thresholds

KOSA (Senate 2/15): The public report requirement only applies to platforms that—

A) averaged more than 10,000,000 active users on a monthly basis in the United States; and

B) predominantly provide a community forum for user-generated content and discussion, including sharing videos, images, games, audio files, discussion in a virtual setting, or other content, such as acting as a social media platform, virtual reality environment, or a social network service.

KOSMA (5/1): N/A
Default Settings

KOSA (Senate 2/15):

For minors: A covered platform shall provide that the default setting for any safeguard (for parents and minors) shall be the option available on the platform that provides the most protective level of control over privacy and safety offered by the platform for that user.

- - - - -

For children: A covered platform shall provide that the parental tools be enabled by default.

KOSMA (5/1): N/A
Transparency

KOSA (Senate 2/15): With respect to safeguards and parental tools, a covered platform shall provide—

(A) information and control options in a clear and conspicuous manner that takes into consideration the differing ages, capacities, and developmental needs of the minors most likely to access the covered platform and does not encourage minors or parents to weaken or disable safeguards or parental controls;

(B) readily-accessible and easy-to-use controls to enable or disable safeguards or parental controls, as appropriate; and

(C) information and control options in the same language, form, and manner.

- - - - -

Prior to a minor’s registration, use, or purchase of a covered platform, the covered platform must provide easily accessible and understandable notice of its policies regarding personal information, safeguards for minors, parental tools, and any heightened risk of harm to minors on the platform, including from personalized recommendation systems

- - - - -

Additional transparency around personalized recommendation systems

- - - - -

A covered entity that facilitates advertising aimed at minors must provide information about the advertised product, how the minor’s personal information was used to target the advertising, and whether a piece of media is an advertisement

KOSMA (5/1): N/A
Public Report

KOSA (Senate 2/15): A covered platform (more than 10,000,000 active users/month) must publish a yearly public report describing the reasonably foreseeable risks of harms to minors and assessing the prevention and mitigation measures taken to address such risks, based on an independent, third-party audit, including:

A) an assessment of the extent to which the platform is likely to be accessed by minors

B) a description of the commercial interests of the covered platform;

C) an accounting, based on the data held by the covered platform, of known minors using the platform, time spent on the platform, and amount of content being accessed

D) an accounting of total reports received regarding, and the prevalence of content related to harms

E) an assessment of the reasonably foreseeable risk of harms to minors posed by the covered platform

KOSMA (5/1): N/A
Safeguards/tools

KOSA (Senate 2/15):

Minor and Parent tools:

(A) limit the ability of other users or visitors to communicate with the minor;

(B) prevent other users or visitors, whether registered or not, from viewing the minor’s personal data collected by or shared on the covered platform, in particular restricting public access to personal data;

(C) limit design features that encourage or increase the frequency, time spent, or activity of minors on the covered platform

(D) control personalized recommendation systems

(E) restrict the sharing of the geolocation of the minor and provide notice regarding the tracking of the minor’s geolocation.

- - - - -

Must provide users with the option to

(a) delete the minor's account and delete any personal data collected from, or shared by, the minor on the covered platform; or

(b) limit the amount of time spent by the minor on the covered platform.

KOSMA (5/1):

No children under 13:

(A) a social media platform shall not permit an individual to create or maintain an account or profile if it knows that the individual is a child

(B) A social media platform shall terminate any existing account or profile of a user who the social media platform knows is a child.

(C) upon termination of an existing account, a social media platform shall immediately delete all personal data collected from the user

Individual Rights

KOSA (Senate 2/15):

Parental tools:

(A) manage a minor’s privacy and account settings in a manner that allows parents to—

(i) view the privacy and account settings; and

(ii) change and control the privacy and account settings (for children);

(B) restrict purchases and financial transactions by the minor; and

(C) view metrics of total time spent on the covered platform and restrict time spent.

KOSMA (5/1): Access: a user of an existing account that has been terminated may request a copy of their personal data within 90 days of termination.
Research

KOSA (Senate 2/15): The FTC must contract with researchers to conduct at least 5 scientific, comprehensive studies and reports on the risk of harms to minors from the use of social media and other online platforms, addressing:

(1) Anxiety, depression, eating disorders, and suicidal behaviors.

(2) Substance use disorders and the use of narcotic drugs, tobacco products, gambling, or alcohol by minors.

(3) Sexual exploitation and abuse.

(4) Addiction-like use of social media and design factors that lead to unhealthy and harmful overuse of social media.

- - - - -

Another study must evaluate the most technologically feasible methods and options for developing systems to verify age at the device or operating system level.

KOSMA (5/1): N/A
Duty of Care

KOSA (Senate 2/15): A covered platform shall exercise reasonable care in the creation and implementation of any design feature to prevent and mitigate the following harms to minors:

(1) Consistent with evidence-informed medical information, the following mental health disorders: anxiety, depression, eating disorders, substance use disorders, and suicidal behaviors.

(2) Patterns of use that indicate or encourage addiction-like behaviors by minors.

(3) Physical violence, online bullying, and ongoing harassment of the minor.

(4) Sexual exploitation and abuse of minors.

(5) Promotion and marketing of narcotic drugs (as defined in section 102 of the Controlled Substances Act (21 U.S.C. 802)), tobacco products, gambling, or alcohol.

(6) Predatory, unfair, or deceptive marketing practices, or other financial harms.

KOSMA (5/1): N/A
Age Estimation

KOSA (Senate 2/15): See knowledge standard

KOSMA (5/1): See knowledge standard
RESTRICTIONS

Dark Patterns

KOSA (Senate 2/15): A covered platform must provide tools to limit design features that encourage or increase the frequency, time spent, or activity of minors on the covered platform, such as infinite scrolling, auto playing, rewards for time spent on the platform, notifications, and other design features that result in compulsive usage of the covered platform by the minor.

- - - - -

‘‘design feature’’ means any feature or component of a covered platform that will encourage or increase the frequency, time spent, or activity of minors on the covered platform. Design features include but are not limited to—

(A) infinite scrolling or auto play;

(B) rewards for time spent on the platform;

(C) notifications;

(D) personalized recommendation systems;

(E) in-game purchases; or

(F) appearance altering filters.

- - - - -

A covered platform may not use dark patterns with respect to safeguards or parental controls

KOSMA (5/1): N/A
Targeted Advertising

KOSA (Senate 2/15): See algorithms

- - - - -

“individual-specific advertising to minors” means advertising or any other effort to market a product or service that is directed to a specific minor or a device that is linked or reasonably linkable to a minor based on—

(i) the personal data of—

(I) the minor; or

(II) a group of minors who are similar in sex, age, income level, race, or ethnicity to the specific minor to whom the product or service is marketed;

(ii) profiling of a minor or group of minors; or

(iii) a unique identifier of the device

KOSMA (5/1): See algorithms
Profiling

KOSA (Senate 2/15): See targeted advertising

KOSMA (5/1): N/A
Geolocation data

KOSA (Senate 2/15): A covered platform must provide tools to restrict the sharing of the geolocation of the minor and provide notice regarding the tracking of the minor’s geolocation.

KOSMA (5/1): N/A
Algorithms

KOSA (Senate 2/15): A covered platform must provide tools to control personalized recommendation systems, including the ability for a minor to have at least 1 of the following options—

(i) opt out of such personalized recommendation systems, while still allowing the display of content based on a chronological format; or

(ii) limit types or categories of recommendations from such systems;

- - - - -

personalized recommendation system” means a fully or partially automated system used to suggest, promote, or rank content, including other users, hashtags, or posts, based on the personal data of users. A recommendation system that suggests, promotes, or ranks content based solely on the user’s language, city or town, or age shall not be considered a personalized recommendation system.

- - - - -

A platform that uses opaque algorithms must provide transparency and the ability to switch between the opaque algorithm and an input-transparent algorithm.

KOSMA (5/1): A social media platform shall not use the personal data of a user or visitor in a personalized recommendation system to display content if the platform knows that the user or visitor is a child or teen, unless the system only uses the following personal data of the child or teen:

(A) The type of device used by the child or teen.

(B) The languages used by the child or teen to communicate.

(C) The city or town in which the child or teen is located.

(D) The fact that the individual is a child or teen.

(E) The age of the child or teen

- - - - -

‘‘personalized recommendation system’’ means a fully or partially automated system used to suggest, promote, or rank content, including other users or posts, based on the personal data of users