Comparing the House's KOSA with the Senate's KOSPA

On 7/30/24, the Senate passed the Kids Online Safety and Privacy Act (KOSPA), which incorporates two major student and child privacy bills, the Kids Online Safety Act (KOSA) and the Children and Teens’ Online Privacy Protection Act (COPPA 2.0), into the Eliminate Useless Reports Act of 2024. On 9/17/24, the House posted an amendment in the nature of a substitute from Representative Bilirakis to its previously introduced version of KOSA.

Below is our redline comparing KOSPA as it passed the Senate to the House version of KOSA (as updated in the Amendment in the Nature of a Substitute for KOSA on 9/17).

Kids Online Safety Act (KOSA) Redline

TITLE I—KIDS ONLINE SAFETY KEEPING KIDS SAFE ONLINE

SEC. 101. DEFINITIONS.

In this subtitle:

(1) CHILD.—The term ‘‘child’’ means an individual who is under the age of 13.

(2) COMPULSIVE USAGE.—The term ‘‘compulsive usage’’ means a persistent and repetitive use of a covered platform that substantially limits 1 or more major life activities (as described in section 3(2) of the Americans with Disabilities Act of 1990 (42 U.S.C. 12102(2))) of an individual, including eating, sleeping, learning, reading, concentrating, thinking, communicating, and working. any response stimulated by external factors that causes an individual to engage in repetitive behavior reasonably likely to cause psychological distress.

(3) COVERED PLATFORM.—

(A) IN GENERAL.—The term ‘‘covered platform’’ means an online platform, online video game, messaging application, or video streaming service that connects to the internet and that is used, or is reasonably likely to be used, by a minor.

(B) EXCEPTIONS.—The term ‘‘covered platform’’ does not include any of the following

(i) an entity acting in its capacity as a provider of any of the following

(I) a common carrier service subject to (as defined in section 3 of the Communications Act of 1934 (47 U.S.C. 1531 et seq.). and all Acts amendatory thereof and supplementary thereto;

(II) a broadband internet access service (as such term is defined in for purposes of section 8.1(b) of title 47, Code of Federal Regulations, or any successor regulation);

(III) an email service;

(IV) a teleconferencing or video conferencing service that allows reception and transmission of audio or video signals for real-time communication, if, provided that

(aa) the service is not an online platform; , including a social media service or social network; and

(bb) the real-time communication is initiated by using a unique link or identifier to facilitate access; or

(V) a wireless messaging service, including such a service provided through short messaging service or multimedia messaging service protocols, that is not a component of, or linked to, an online platform and in which where the predominant or exclusive function is direct messaging consisting of the transmission of text, photos or videos that is are sent by electronic means, in which where messages are transmitted from the sender to a recipient, and are not posted within an online platform or publicly;

(ii) an organization not organized to carry on business for its own profit or that of its members;

(iii) any public or private preschool, elementary, or secondary school, or any institution of vocational, professional, or higher education;

(I) early childhood education program or preschool that provides for the care, development, and education of infants, toddlers, or young children who are not yet enrolled in kindergarten;

(II) elementary school (as defined in section 8101 of the Elementary and Secondary Education Act of 1965 (20 U.S.C. 7801)) or secondary school (as so defined);

(III) school providing career and technical education (as defined in section 3 of the Carl D. Perkins Career and Technical Education Act of 2006 (20 U.S.C. 2302));

(IV) school providing adult education and literacy activities (as defined in section 203 of the Adult Education and Family Literacy Act (29 U.S.C. 3272)); or

(V) institution of higher education (as defined in section 101, and subparagraphs (A) and (B) of section 102(a)(1), of the Higher Education Act of 1965 (20 U.S.C. 1001, 1002(a)(1))).

(iv) a library (as defined in section 213(1) of the Library Services and Technology Act (20 U.S.C. 9122(1)));

(v) A government entity with a .gov internet domain (as described in section 2215 of the Homeland Security Act of 2002 (6 U.S.C. 665)).

(vi) a news or sports coverage website or app, including sports news and coverage, entertainment news, or other journalistic news coverage in which where

(I) the inclusion of video content on the website or app is related to the website or app’s own gathering, reporting, or publishing by the website or app of such of such news content or sports coverage;; and

(II) the website or app is not otherwise an online platform;

(vii) a product or service that primarily functions as business-to-business software, a cloud storage, file sharing, or file collaboration service., provided that the product or service is not an online platform; or

(viii) a virtual private network or similar service that exists predominantly solely to route internet traffic between locations.

(ix) A travel website or app that includes user reviews or other travel information.

(4) DESIGN FEATURE.—

(A) IN GENERAL.—The term "design feature"

(i) means any feature or component of a covered platform that will encourages or increases the frequency, time spent, or activity of a minors on the covered platform and . Design features include but are not limited to—

(ii) includes—

(I A) infinite scrolling or auto play;

(II B) a rewards or incentive for the frequency of visits to the covered platform or for the amount of time spent or activities performed on the covered platform;

(III C) a notifications;

(IV) a push alert that urges a user to spend more time engaged with the covered platform when they are not actively using the covered platform;

(V) a a badge or other visual award symbol based on elevated levels of engagement with the covered platform:

(VI D) a personalized recommendation systems;

(VII E) an in-game purchases; and or

(VIII F) an appearance altering filters.

(B) PROHIBITION.—A government entity may not enforce this title or a regulation promulgated under this title based upon a specific viewpoint of any speech, expression, or information protected by the First Amendment to the Constitution that may be made available to a user as a result of the operation of a design feature.

(5) GEOLOCATION.—The term “geolocation” has the meaning given the term “geolocation information” in section 1302 of the Children's Online Privacy Protection Act of 1998 (15 U.S.C. 6501), as added by section 201(a).

(5) HARASSMENT.—The term ‘‘harassment’’ means a criminal threat made or perpetuated against a specific minor that involves or alludes to the use of physical violence or unlawful conduct such that the threat constitutes a misdemeanor or felony violation of Federal criminal law.

(6) HIGH IMPACT ONLINE COMPANY.— The term ‘‘high impact online company’’ means an online platform or online video game that provides any internet-accessible platform in which—

(A) the online platform or online video game constitutes an online product or service that is primarily used by users to access or share, user-generated content; and

(B) the online platform or online video game—

(i) generates $1,000,000,000 or more in annual revenue, including the revenue generated by any affiliate of such covered platform; or

(ii) has 100,000,000 or more global monthly active users for not fewer than 3 of the preceding 12 months.

(7 6) KNOW; OR KNOWS.—

The term ‘‘know’’ or ‘‘knows’’ means, with respect to knowledge that an individual is a child or minor— to have actual knowledge or knowledge fairly implied on the basis of objective circumstances.

(A) with respect to a high impact online company, that the platform knew or should have known the individual was a child or minor;

(B) with respect to a covered platform that has an annual gross revenue of $200,000,000 or more, collects the personal information of 200,000 individuals or more, and is not a high impact online company, that the covered platform knew or acted in willful disregard of the fact that the individual was a child or minor; and

(C) with respect to a covered platform that is not covered by subparagraph (A) or (B), actual knowledge.
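
The tiered knowledge standard above keys off the ‘‘high impact online company’’ thresholds in the preceding definition. The sketch below is our own minimal illustration, in Python, of one way to read paragraphs (6)(A) through (6)(C) together using the dollar and user-count figures quoted above; the function and parameter names are ours, not the bill's, and the bill itself does not say how these thresholds would be measured in practice.

```python
# Illustrative sketch only -- not bill text. One possible reading of the tiered
# "know"/"knows" standard in Sec. 101(6), combined with the "high impact online
# company" thresholds in Sec. 101. All names here are our own.

def knowledge_standard(
    primarily_user_generated_content: bool,  # high-impact prong: platform primarily for user-generated content
    annual_revenue_usd: int,                  # simplified: the bill distinguishes affiliate-inclusive
                                              # "annual revenue" from "annual gross revenue"
    months_with_100m_global_mau: int,         # months in the preceding 12 with >= 100M global monthly active users
    individuals_with_personal_info: int,      # individuals whose personal information is collected
) -> str:
    high_impact = primarily_user_generated_content and (
        annual_revenue_usd >= 1_000_000_000 or months_with_100m_global_mau >= 3
    )
    if high_impact:
        # (A): constructive knowledge -- "knew or should have known"
        return "knew or should have known"
    if annual_revenue_usd >= 200_000_000 and individuals_with_personal_info >= 200_000:
        # (B): "knew or acted in willful disregard"
        return "knew or acted in willful disregard"
    # (C): actual knowledge only
    return "actual knowledge"
```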

(7) MENTAL HEALTH DISORDER.—The term ‘‘mental health disorder’’ has the meaning given the term ‘‘mental disorder’’ in the Diagnostic and Statistical Manual of Mental Health Disorders, 5th Edition (or the most current successor edition).

(7 8) MICROTRANSACTION.—The term ‘‘microtransaction’’—

(A) IN GENERAL.—The term “microtransaction” means a purchase made in an online video game (including a purchase made using a virtual currency that is purchasable or redeemable using cash or credit or that is included as part of a paid subscription service).

(B) INCLUSIONS.—Such term includes a purchase involving a surprise mechanics, new characters, or in-game items; and

(C) EXCLUSIONS.—Such terms does not include—

(i) a purchase made in an online video game using a virtual currency that is earned through gameplay and is not otherwise purchasable or redeemable using cash or credit or included as part of a paid subscription service; or

(ii) a purchase of an additional levels within the game or an overall expansion of the game.

(9) MINOR.—The term ‘‘minor’’ means an individual who is under the age of 17.

(10) NARCOTIC DRUG.—The term ‘‘narcotic drug’’ has the meaning given such term and the term ‘‘controlled substance’’ in section 102 of the Controlled Substances Act (21 U.S.C. 802).

(11 10) ONLINE PLATFORM.—The term ‘‘online platform’’

(A) means any public-facing website, online service, online application, or mobile application that predominantly provides a community forum for user generated content; , such as sharing videos, images, games, audio files, or other content, including a social media service, social network, or virtual reality environment.

(B) includes any such website, service, or application that shares videos, images, games, audio files, or other content, including a social media service, social network, or virtual reality environment; and

(C) does not include any chat, comment, or other interactive functionality of a community forum that is incidental to the predominant purpose of the website, online service, online application, or mobile application.

(12 11) ONLINE VIDEO GAME.—The term ‘‘online video game’’ means a video game, including an educational video game, that connects to the internet and that allows a user to—

(A) create and upload content other than content that is incidental to gameplay, such as a character or level designs created by the user, preselected phrases, or short interactions with other users;

(B) engage in microtransactions within the game; or

(C) communicate with other users.

(13 12) PARENT.—The term ‘‘parent’’ includes a legal guardian. has the meaning given that term in section 1302 of the Children’s Online Privacy Protection Act (15 U.S.C. 6501).

(14 13) PERSONAL DATA.—The term ‘‘personal data’’ has the same meaning given as the term ‘‘personal information’’ as defined in section 1302 of the Children’s Online Privacy Protection Act of 1998 (15 U.S.C. 6501).

(15 14) PERSONALIZED RECOMMENDATION SYSTEM.—The term ‘‘personalized recommendation system’’—

(A) means a fully or partially automated system used to suggest, promote, or rank, or recommend content, including other users, hashtags, or posts, based on the personal data of users.; and . A recommendation system that suggests, promotes, or ranks content based solely on the user’s language, city or town, or age shall not be considered a personalized recommendation system.

(B) does not include—

(i) a system that suggests, promotes, or ranks content based solely on the language, city or town, or age of the user;

(ii) technical means that do not fully automate or replace human decision-making processes;

(iii) technical means that are designed to block, detect, identify, or prevent a user from accessing unlawful content; or

(iv) technical means designed to prevent or detect fraud, malicious conduct, other illegal activity, or preserve the integrity or security of systems, products, or services.

(16) SERIOUS EMOTIONAL DISTURBANCE.—The term ‘‘serious emotional disturbance’’ means, with respect to a minor, the presence of a diagnosable mental, behavioral, or emotional disorder in the past year, which resulted in functional impairment that substantially interferes with or limits the minor’s role or functioning in family, school, or community activities.

(175) SEXUAL EXPLOITATION AND ABUSE.—The term “sexual exploitation and abuse” means any of the following:

(A) Coercion and enticement, as described in section 2422 of title 18, United States Code.

(B) Child sexual abuse material, as described in sections 2251, 2252, 2252A, and 2260 of title 18, United States Code.

(C) Trafficking for the production of images, as described in section 2251A of title 18, United States Code.

(D) Sex trafficking of children, as described in section 1591 of title 18, United States Code.

(18) STATE.—The term ‘‘State’’ means each State of the United States, the District of Columbia, each commonwealth, territory, or possession of the United States, and each federally recognized Indian Tribe.

(196) USER.—The term “user” means, with respect to a covered platform, an individual who registers an account or creates a profile on the covered platform.

SEC. 102. Duty of care.

(a) Prevention of harm to minors.—A high impact online company covered platform shall create and implement its exercise reasonable care in the creation and implementation of any design features to reasonably prevent and mitigate the following harms to minors:

(1) Consistent with evidence-informed medical information, the following mental health disorders: anxiety, depression, eating disorders, substance use disorders, and suicidal behaviors.

(2) Patterns of use that indicate or encourage addiction-like behaviors by minors.

(13) Physical violence (as the term “crime of violence’’ is defined in section 16 of title 18, United States Code), online bullying, and harassment of the minor.

(24) Sexual exploitation and abuse of minors.

(35) Promotion and marketing of narcotic drugs (as defined in section 102 of the Controlled Substances Act (21 U.S.C. 802)), tobacco products, gambling, or alcohol.

(4) Promotion of inherently dangerous acts that are likely to cause serious bodily harm, serious emotional disturbance, or death.

(5) With respect to a user the high impact online company knows is a minor, compulsive usage.

(6) With respect to a user the high impact online company knows is a minor, the marketing of tobacco products, gambling, or alcohol.

(6) Predatory, unfair, or deceptive marketing practices, or other financial harms.

(b) Considerations.—The design features described under subsection (a) shall be appropriate to the nature and scope of the activities engaged in by the high impact online company.

(c) Rule of Construction. Limitation.—Nothing in subsection (a) shall be construed to require a high impact online company covered platform to prevent or preclude any minor from

(1) any minor from deliberately and independently searching for, or specifically requesting, content; or

(2) the high impact online company or individuals on the platform from providing accessing resources for and information regarding the prevention or mitigation of the harms described in subsection (a), including evidence-informed information and clinical resources.

SEC. 103. Safeguards for minors.

(a) Safeguards for minors.—

(1) SAFEGUARDS.—Except as provided in paragraph (4), A a covered platform shall provide a user or visitor that the covered platform knows is a minor with readily-accessible and easy-to-use safeguards to, as applicable, do the following:

(A) limit the ability of other users or visitors to communicate with the minor;

(B) prevent other users or visitors, whether registered or not, from viewing the minor’s personal data collected by or shared on the covered platform, in particular restricting public access to personal data;

(BC) limit design features, that encourage or increase the frequency, time spent, or activity of minors on the covered platform, such as infinite scrolling, auto playing, rewards or incentives for the frequency of visits to the covered platform or for time spent on the covered platform, notifications, badges, push alerts, and other design features that result in compulsive usage of the covered platform by the minor;

(CD) control personalized recommendation systems, including the ability for a minor to have at least 1 of the following options

(i) the option of opting in to out of such personalized recommendation systems, while still allowing the display of content based on a chronological format; or

(ii) the option of limiting types or categories of recommendations from such systems; and

(E) restrict the sharing of the geolocation of the minor and provide notice regarding the tracking of the minor’s geolocation.

(D) Limit the ability to make in-game purchases and microtransactions and time spent on online video games, and, in the case of an online video game, compliance with this subparagraph shall be considered to be compliance with subparagraph (B).

(2) OPTIONS.—A covered platform shall provide a user that the covered platform knows is a minor with a readily-accessible and easy-to-use options to limit the amount of time spent by the minor on the covered platform.

(3) DEFAULT SAFEGUARD SETTINGS FOR MINORS.—A covered platform shall provide that, in the case of a user or visitor that the platform knows is a minor, the default setting for any safeguard described under paragraph (1) shall be the option available on the platform that provides the most protective level of control that is offered by the platform over privacy and safety for that user or visitor, unless otherwise enabled by the parent of the minor.

(4) EXCEPTION.—Notwithstanding paragraph (1), a covered platform shall provide the safeguards described in paragraph (1)(C) to the educational agency or institution (as defined in section 444(a)(3) of the General Education Provisions Act (20 U.S.C. 1232g(a)(3)), rather than to the user or visitor, when the covered platform is acting on behalf of an educational agency or institution (as so defined), subject to a written contract that complies with the requirements of the Children’s Online Privacy Protection Act of 1998 (15 U.S.C. 6501 et seq.) and section 444 of the General Education Provisions Act (20 U.S.C. 1232g) (commonly known as the ‘‘Family Educational Rights and Privacy Act of 1974’’).

(b) Parental tools.—

(1) TOOLS.—A covered platform shall provide readily-accessible and easy-to-use parental tools settings for parents to support a user that the platform knows is a minor with respect to the user's use of the platform by the user.

(2) REQUIREMENTS.—The parental tools provided by a covered platform under paragraph (1) shall include—

(A) the ability to manage the a minor’s privacy and account settings of a minor, including the safeguards and options established under subsection (a), in a manner that allows parents to—

(i) view the privacy and account settings; and

(ii) in the case of a user that the platform knows is a child, change and control the privacy and account settings;

(B) the ability to restrict purchases and financial transactions by the minor, where applicable; and

(C) the ability to view metrics of total time spent on the covered platform and restrict time spent on the covered platform by the minor.

(3) NOTICE TO MINORS.—A covered platform shall provide clear and conspicuous notice to a user when the tools described in this subsection are in effect and what settings or controls have been applied.

(4) DEFAULT TOOLS.—A covered platform shall provide that, in the case of a user that the platform knows is a child, the tools required under paragraph (1) shall be enabled by default.

(5) APPLICATION TO EXISTING ACCOUNTS.—If, prior to the effective date of this subsection, a covered platform provided a parent of a user that the platform knows is a child with notice and the ability to enable the parental tools described under this subsection in a manner that would otherwise comply with this subsection, and the parent opted to disable out of enabling such tools, the covered platform is not required to enable such tools with respect to such user by default when this subsection takes effect.

(c) Reporting mechanism.—

(1) REPORTS SUBMITTED BY PARENTS AND, MINORS, AND SCHOOLS.—A covered platform shall provide—

(A) a readily-accessible and easy-to-use means for a parent or minor to submit a reports to the covered platform of any harms to a minor related to the use by the minor of the platform;

(B) an electronic point of contact specific to matters involving harms to a minor; and

(C) confirmation of the receipt of such a report submitted under subparagraph (A) and, within the applicable time period described in paragraph (2), a substantive response to the individual that submitted the report.

(2) TIMING.—A covered platform shall establish an internal process to receive and substantively respond to such reports in a reasonable and timely manner, but in no case later than—

(A) 10 days after the receipt of a report, if, for the most recent calendar year, the platform averaged more than 10,000,000 active users on a monthly basis in the United States;

(B) 21 days after the receipt of a report, if, for the most recent calendar year, the platform averaged less than 10,000,000 active users on a monthly basis in the United States; and

(C) notwithstanding subparagraphs (A) and (B), if the report involves an imminent threat to the safety of a minor, as promptly as needed to address the reported threat to safety.
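
The response deadlines in paragraph (2) scale with platform size. Below is a minimal sketch, assuming the 10-day and 21-day windows quoted above; note that the bill's text covers platforms averaging more than or less than 10,000,000 monthly users (a platform at exactly that figure is not addressed), and the imminent-threat case has no fixed day count. The function name and parameters are ours.

```python
# Illustrative sketch only -- not bill text. Response timing under Sec. 103(c)(2).
from datetime import date, timedelta
from typing import Optional

def response_deadline(received: date, avg_monthly_us_users: int,
                      imminent_threat: bool) -> Optional[date]:
    if imminent_threat:
        # (C): respond "as promptly as needed to address the reported threat to safety"
        return None  # no fixed statutory day count
    if avg_monthly_us_users > 10_000_000:
        return received + timedelta(days=10)  # (A): more than 10M average monthly US users
    return received + timedelta(days=21)      # (B): the bill literally says "less than" 10M
```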

(d) Advertising of illegal products.—A covered platform shall not facilitate the advertising of narcotic drugs (as defined in section 102 of the Controlled Substances Act (21 U.S.C. 802)), tobacco products, gambling, or alcohol to an individual that the covered platform knows is a minor.

(e) Rules of application.—

(1) ACCESSIBILITY.—With respect to safeguards and parental tools described under subsections (a) and (b), a covered platform shall provide—

(A) information and control options in a clear and conspicuous manner that takes into consideration the differing ages, capacities, and developmental needs of the minors most likely to access the covered platform and does not encourage minors or parents to weaken or disable safeguards or parental tools;

(B) readily-accessible and easy-to-use controls to enable or disable safeguards or parental tools, as appropriate; and

(C) information and control options in the same language, form, and manner as the covered platform provides the product or service used by minors and their parents.

(2) DARK PATTERNS PROHIBITION.—It shall be unlawful for any covered platform to design, embed, modify, or manipulate a user interface of a covered platform with the purpose or substantial effect of obscuring, subverting, or impairing user autonomy, decision-making, or choice with respect to safeguards or parental tools required under this section.

(3) TIMING CONSIDERATIONS.—

(A) NO INTERRUPTION TO GAMEPLAY.—Subsections (a)(1)(C) and (b)(3) shall not require an online video game to interrupt the natural sequence of game play, such as progressing through game levels or finishing a competition.

(B) APPLICATION OF CHANGES TO OFFLINE DEVICES OR ACCOUNTS.—If a user's device or user account does not have access to the internet at the time of a change to parental tools, a covered platform shall apply changes the next time the device or user is connected to the internet.

(f) Device or console controls.—

(1) IN GENERAL.—Nothing in this section shall be construed to prohibit a covered platform from integrating the products or service of the platform with, or duplicate controls or tools provided by, a third-party system (including an operating system or gaming console) to meet the requirements imposed under subsections (a) and (b) relating to safeguards for minors and parental tools, if—

(A) the controls or tools meet such requirements; and

(B) the minor or parent is provided sufficient notice of the integration and use of the controls or tools.

(2) PRESERVATION OF PROTECTIONS.—In the event of a conflict between the controls or tools of a third-party system (including an operating system or gaming console) and a covered platform, the covered platform is not required to override the controls or tools of a third-party system if it would undermine the protections for minors from the safeguards or parental tools imposed under subsections (a) and (b).

(g) (4) RULES OF CONSTRUCTION.—Nothing in this section shall be construed to do the following—

(1A) prevent a covered platform from taking reasonable measures to—

(Ai) block, detect, or prevent the distribution of unlawful, obscene, or other harmful material to minors as described in section 102(a); or

(Bii) block or filter spam, prevent criminal activity, or protect the security of a platform or service;

(2B) require the disclosure of the a minor's browsing behavior, search history, messages, contact list, or other content or metadata of the their communications of a minor;

(3C) prevent a covered platform from using a personalized recommendation system to display content to a minor if the system only uses information on—

(Ai) the language spoken by the minor;

(Bii) the city the minor is located in; or

(Ciii) the minor's age of the minor; or

(4D) prevent an online video game from disclosing a username or other user identification for the purpose of competitive gameplay or to allow for the reporting of users.

(f) Device or console controls.—

(1) IN GENERAL.—Nothing in this section shall be construed to prohibit a covered platform from integrating its products or service with, or duplicate controls or tools provided by, third-party systems, including operating systems or gaming consoles, to meet the requirements imposed under subsections (a) and (b) relating to safeguards for minors and parental tools, provided that—

(A) the controls or tools meet such requirements; and

(B) the minor or parent is provided sufficient notice of the integration and use of the parental tools.

(2) PRESERVATION OF PROTECTIONS.—In the event of a conflict between the controls or tools of a third-party system, including operating systems or gaming consoles, and a covered platform, the covered platform is not required to override the controls or tools of a third-party system if it would undermine the protections for minors from the safeguards or parental tools imposed under subsections (a) and (b).

SEC. 104. Disclosure.

(a) Notice.—

(1) REGISTRATION OR PURCHASE.—Prior to registration or a purchase on of a covered platform by an individual that the platform knows is a minor, the platform shall provide clear, conspicuous, and easy-to-understand—

(A) notice of the policies and practices of the covered platform with respect to safeguards for minors required under section 103;

(B) information about how to access the safeguards and parental tools required under section 103; and

(C) notice about whether the covered platform uses or makes available to minors a product, service, or design feature, including any personalized recommendation system, that poses any heightened risk of harm to minors.

(2) NOTIFICATION.—

(A) NOTICE AND ACKNOWLEDGMENT.—In the case of an individual that a covered platform knows is a child, the platform shall additionally provide information about the parental tools and safeguards required under section 103 to a parent of the child and obtain verifiable consent (as defined in section 1302(9) of the Children's Online Privacy Protection Act of 1998 (15 U.S.C. 6501(9))) from the parent prior to the initial use of the covered platform by the child.

(B) REASONABLE EFFORT.—A covered platform shall be deemed to have satisfied the requirement described in subparagraph (A) if the covered platform is in compliance with the requirements of the Children's Online Privacy Protection Act of 1998 (15 U.S.C. 6501 et seq.) to use reasonable efforts (taking into consideration available technology) to provide a parent with the information described in subparagraph (A) and to obtain verifiable consent as required.

(3) CONSOLIDATED NOTICES.—For purposes of this subtitle, a covered platform may consolidate the process for providing information under this subsection and obtaining verifiable consent or the consent of the minor involved (as applicable) as required under this subsection with the its obligations of the covered platform to provide relevant notice and obtain verifiable consent under the Children's Online Privacy Protection Act of 1998 (15 U.S.C. 6501 et seq.).

(4) GUIDANCE.—The Federal Trade Commission may issue guidance to assist covered platforms in complying with the specific notice requirements of this subsection.

(b) Personalized recommendation system.—A covered platform that operates a personalized recommendation system describe in the shall set out in its terms and conditions of the covered platform, in a clear, conspicuous, and easy-to-understand manner—

(1) an overview of how such personalized recommendation system is used by the covered platform to provide information to minors, including how such systems use the personal data of minors; and

(2) information about options for minors or their parents to opt out of or control the personalized recommendation system (as applicable).

(c) Advertising and marketing information and labels.—

(1) INFORMATION AND LABELS.—A covered platform shall provide clear, conspicuous, and easy-to-understand labels and information, which can be provided through a link to another web page or disclosure, to minors on advertisements regarding—

(A) the name of the product, service, or brand and the subject matter of an advertisement; and

(B) whether particular media displayed to the minor is an advertisement or marketing material, including disclosure of endorsements of products, services, or brands made for commercial consideration by other users of the platform.

(2) GUIDANCE.—The Federal Trade Commission may issue guidance to assist covered platforms in complying with the requirements of this subsection, including guidance about the minimum level of information and labels for the disclosures required under paragraph (1).

(cd) Resources for parents and minors.—A covered platform shall provide to minors and parents clear, conspicuous, easy-to-understand, and comprehensive information in a prominent location, which may include a link to a web page, regarding—

(1) its policies and practices of the covered platform with respect to safeguards for minors required under section 103; and

(2) how to access the safeguards and parental tools required under section 103.

(de) Resources in additional languages.—A covered platform shall ensure, to the extent practicable, that the disclosures required by this section are made available in the same language, form, and manner as the covered platform provides any product or service used by minors and their parents.

SEC. 105. Transparency.

(a) In general.—Subject to subsection (b), not less frequently than once a year, a covered platform shall issue a public report that describes describing the reasonably foreseeable risks of harms to minors and assesses ing the prevention and mitigation measures taken to address such risks based on an independent, third-party audit of the covered platform with a reasonable level of assurance conducted through reasonable inspection of the covered platform.

(b) Scope of application.—The requirements of this section shall apply to a covered platform if—

(1) for the most recent calendar year, the platform averaged more than 10,000,000 active users on a monthly basis in the United States; and

(2) the platform predominantly provides a community forum for user-generated content and discussion, including sharing videos, images, games, audio files, discussion in a virtual setting, or other content, such as acting as a social media platform, virtual reality environment, or a social network service.

(c) Content.—

(1) TRANSPARENCY.—The public reports required under subsection (a) of a covered platform under this section shall include the following

(A) an assessment of the extent to which the platform is likely to be accessed by minors;

(B) a description of the commercial interests of the covered platform being used in use by minors;

(C) an accounting, based on the data held by the covered platform, of—

(i) the number of users using the covered platform that the platform knows to be minors in the United States;

(ii) the median and mean amounts of time spent on the platform by users known to be minors in the United States who have accessed the platform during the reporting year on a daily, weekly, and monthly basis; and

(iii) the amount of content being accessed by users that the platform knows to be minors in the United States that is in English, and the top 5 non-English languages used by users accessing the platform in the United States;

(D) an accounting of total reports received regarding, and the prevalence (which can be based on scientifically valid sampling methods using the content available to the covered platform in the normal course of business) of content related to, the harms described in section 102(a) through the reporting mechanism described in section 103, disaggregated by category of harm and language, including English and the top 5 non-English languages used by users accessing the platform from the United States (as identified under subparagraph (C)(iii)); and

(E) a description of any material breaches of the requirement to provide safeguards or parental tools under section 103 or assurances regarding minors, representations regarding the use of the personal data of minors, and other matters regarding non-compliance with this subtitle.

(2) EVALUATION. REASONABLY FORESEEABLE RISK OF HARM TO MINORS.—The public reports required of a covered platform under this subsection(a) shall include the following

(A) an assessment of the reasonably foreseeable risk of harms to minors based on aggregate data on the exercise of safeguards and parental tools described in section 103, and other competent and reliable empirical evidence. posed by the covered platform, specifically identifying those physical, mental, developmental, or financial harms described in section 102(a);

(B) a description of whether and how the covered platform uses design features that encourage or increase, sustain, or extend use of a product or service by a minor, the frequency, time spent, or activity of minors on the covered platform, such as infinite scrolling, automatic playing of media, rewards for time spent on the platform, and notifications, and other design features that result in compulsive usage of the covered platform by the minor;

(C) a description of whether, how, and for what purpose the platform collects or processes categories of personal data related that may cause reasonably foreseeable risk of harms to minors;

(D) an evaluation of the efficacy of safeguards for minors and parental tools under section 103, and any issues in delivering such safeguards and the associated parental tools;

(E) an evaluation of any other relevant matters of public concern over risk of harms to minors associated with the use of the covered platform; and

(EF) an assessment of differences, with respect to the matters described in subparagraphs (A) through (D), in risk of harm to minors across different English and non-English languages and efficacy of safeguards in those languages.

(3) MITIGATION.—The public reports required of a covered platform under thissub section(a) shall include, for English and the top 5 non-English languages used by users accessing the platform from the United States (as identified under paragraph (2)(C)(iii)))—

(A) a description of the safeguards and parental tools available to minors and parents on the covered platform;

(B) a description of interventions by the covered platform when it had or has reason to believe that harms to minors could occur;

(BC) a description of the prevention and mitigation measures a covered platform may take, if any, in response to the assessments conducted under paragraph (2), intended to be taken in response to the known and emerging risks identified in its assessment of reasonably foreseeable risks of harms to minors, including steps taken to provide the most protective level of control over safety by default—

(i) prevent harms to minors, including adapting or removing design features or addressing through parental tools;

(ii) provide the most protective level of control over privacy and safety by default; and

(iii) adapt recommendation systems to mitigate reasonably foreseeable risk of harms to minors, as described in section 102(a);

(D) a description of internal processes for handling reports and automated detection mechanisms for harms to minors, including the rate, timeliness, and effectiveness of responses under the requirement of section 103(c);

(C) a description of the processes used for the creation and implementation of any design feature that will be used by minors

(DE) the status of implementing prevention and mitigation measures identified in prior assessments; and

(F) a description of the additional measures to be taken by the covered platform to address the circumvention of safeguards for minors and parental tools.

(d) Reasonable inspection.—In conducting an inspection of the reasonably foreseeable risk of harm to minors under this section, the an independent, third-party auditor described under subsection (a) shall do the following

(1) take into consideration the function of personalized recommendation systems;

(2) consult parents and youth experts, including youth and families with relevant past or current experience, public health and mental health nonprofit organizations, health and development organizations, and civil society with respect to the prevention of harms to minors;

(3) conduct research based on experiences of minors that use the covered platform, including reports under section 103(c) and information provided by law enforcement;

(4) take account of research, including research regarding design features, marketing, or product integrity, industry best practices, or outside research;

(5) take into consideration indicia or inferences of age of users, in addition to any self-declared information about the age of users; and

(6) take into consideration differences in risk of reasonably foreseeable harms and effectiveness of safeguards across English and non-English languages.

(e) Cooperation with independent, third-party audit.—To facilitate the report required by subsection (c), a covered platform shall—

(1) provide or otherwise make available to the independent third-party conducting the audit all information and material in the its possession, custody, or control of the platform that is relevant to the audit;

(2) provide or otherwise make available to the independent third-party conducting the audit access to all network, systems, and assets relevant to the audit; and

(3) disclose all relevant facts to the independent third-party conducting the audit, and not misrepresent in any manner, expressly or by implication, any relevant fact.

(f) Privacy safeguards.—

(1) IN GENERAL.—In issuing the public reports required under this subsection(a), a covered platform shall take steps to safeguard the privacy of the its users of the platform, including ensuring that data is presented in a de-identified, aggregated format such that it is not reasonably linkable to any user.

(2) RULE OF CONSTRUCTION.—Nothing in this This section shall not be construed to require covered platforms to disclose the disclosure of information that will lead to material vulnerabilities for the privacy of users or the security of a covered platform's service or create a significant risk of the violation of Federal or State law.

(A) will lead to material vulnerabilities for—

(i) the privacy of users;

(ii) the trade secrets or other protected intellectual property, including any privileged, proprietary, or confidential commercial information, of a covered platform; or

(iii) the security of the service of a covered platform; or

(B) will create a significant risk of the violation of Federal or State law

(3) DEFINITION OF DE-IDENTIFIED.—As used in this subsection, the term “de-identified” means data that does not identify and is not linked or reasonably linkable to a device that is linked or reasonably linkable to an individual, regardless of whether the information is aggregated

(g) Location.—The public reports required under this section should be posted by a covered platform on an easy to find location on a publicly-available website.

SEC. 106. Research on social media and minors.

(a) Definitions.—In this section:

(1) COMMISSION.—The term “Commission” means the Federal Trade Commission.

(2) NATIONAL ACADEMY.—The term “National Academy” means the National Academy of Sciences.

(3) SECRETARY.—The term “Secretary” means the Secretary of Health and Human Services.

(b) Research on social media harms.—Not later than 12 months after the date of enactment of this Act, the Commission shall seek to enter into a contract with the National Academy, under which the National Academy shall conduct no less than 5 scientific, comprehensive studies and reports on the risk of harms to minors by use of social media and other online platforms, including in English and non-English languages.

(c) Matters to be addressed.—In contracting with the National Academy, the Commission, in consultation with the Secretary, shall seek to commission separate studies and reports, using the Commission's authority under section 6(b) of the Federal Trade Commission Act (15 U.S.C. 46(b)), on the relationship between social media and other online platforms as defined in this subtitle on the following matters:

(1) Anxiety, depression, eating disorders, and suicidal behaviors.

(2) Substance use disorders and the use of narcotic drugs, tobacco products, gambling, or alcohol by minors.

(3) Sexual exploitation and abuse.

(4) Addiction-like use of social media and design factors that lead to unhealthy and harmful overuse of social media.

(d) Additional study.—Not earlier than 4 years after enactment, the Commission shall seek to enter into a contract with the National Academy under which the National Academy shall conduct an additional study and report covering the matters described in subsection (c) for the purposes of providing additional information, considering new research, and other matters.

(e) Content of reports.— The comprehensive studies and reports conducted pursuant to this section shall seek to evaluate impacts and advance understanding, knowledge, and remedies regarding the harms to minors posed by social media and other online platforms, and may include recommendations related to public policy.

(f) Active studies.—If the National Academy is engaged in any active studies on the matters described in subsection (c) at the time that it enters into a contract with the Commission to conduct a study under this section, it may base the study to be conducted under this section on the active study, so long as it otherwise incorporates the requirements of this section.

(g) Collaboration.—In designing and conducting the studies under this section, the Commission, the Secretary, and the National Academy shall consult with the Surgeon General and the Kids Online Safety Council.

(h) Access to Data.—

(1) FACT-FINDING AUTHORITY.—The Commission may issue orders under section 6(b) of the Federal Trade Commission Act (15 U.S.C. 46(b)) to require covered platforms to provide reports, data, or answers in writing as necessary to conduct the studies required under this section.

(2) SCOPE.—In exercising its authority under paragraph (1), the Commission may issue orders to no more than 5 covered platforms per study under this section.

(3) CONFIDENTIAL ACCESS.—Notwithstanding section 6(f) or 21 of the Federal Trade Commission Act (15 U.S.C. 46, 57b–2), the Commission shall enter in agreements with the National Academy to share appropriate information received from a covered platform pursuant to an order under such subsection (b) for a comprehensive study under this section in a confidential and secure manner, and to prohibit the disclosure or sharing of such information by the National Academy. Nothing in this paragraph shall be construed to preclude the disclosure of any such information if authorized or required by any other law.

SEC. 106. 107. Market research.

An online platform may not, in the case of a user or visitor that the online platform knows is a minor, conduct market and product-focused research on such minor unless the online platform obtains verifiable parental consent (as defined in section 1302 of the Children’s Online Privacy Protection Act of 1998 (15 U.S.C. 6501)) prior to conducting such research on such minor.

(a) Market research by covered platforms.—The Federal Trade Commission, in consultation with the Secretary of Commerce, shall issue guidance for covered platforms seeking to conduct market- and product-focused research on minors. Such guidance shall include—

(1) a standard consent form that provides minors and their parents a clear, conspicuous, and easy-to-understand explanation of the scope and purpose of the research to be conducted that is available in English and the top 5 non-English languages used in the United States;

(2) information on how to obtain informed consent from the parent of a minor prior to conducting such market- and product-focused research; and

(3) recommendations for research practices for studies that may include minors, disaggregated by the age ranges of 0-5, 6-9, 10-12, and 13-16.

(b) Timing.—The Federal Trade Commission shall issue such guidance not later than 18 months after the date of enactment of this Act. In doing so, they shall seek input from members of the public and the representatives of the Kids Online Safety Council established under section 111.

SEC. 107. 108. Age verification study and report.

Not later than 1 year after the date of enactment of this Act, (a) Study.—Tthe Secretary of Commerce, in coordination with the Federal Communications Commission and the Federal Trade Commission, shall a report containing the results of the study conducted under such subsection to the Committee on Commerce, Science, and Transportation of the Senate and the Committee on Energy and Commerce of the House of Representatives a report that evaluates conduct a study evaluating the most technologically feasible methods and options for developing systems to verify age at the device or operating system level the considers the following:.

(b) Contents.—Such study shall consider —

(1) Tthe benefits of creating a device or operating system level age verification system.;

(2) Wwhat information may need to be collected to create this type of age verification system.;

(3) Tthe accuracy of such systems and their impact or steps to improve accessibility, including for individuals with disabilities.;

(4) Hhow such a system or systems could verify age while mitigating risks to user privacy and data security and safeguarding minors' personal data, emphasizing minimizing the amount of data collected and processed by covered platforms and age verification providers for such a system.;

(5) Tthe technical feasibility, including the need for potential hardware and software changes, including for devices currently in commerce and owned by consumers.; and

(6) Tthe impact of different age verification systems on competition, particularly the risk of different age verification systems creating barriers to entry for small companies.

(c) Report.—Not later than 1 year after the date of enactment of this Act, the agencies described in subsection (a) shall submit a report containing the results of the study conducted under such subsection to the Committee on Commerce, Science, and Transportation of the Senate and the Committee on Energy and Commerce of the House of Representatives.

SEC. 108. 109. Guidance.

(a) In general.—Not later than 18 months after the date of enactment of this Act, the Federal Trade Commission, in consultation with the Kids Online Safety Council established under section 111, shall issue guidance to—

(1) provide information and examples for covered platforms and auditors regarding the following, with consideration given to differences across English and non-English languages—

(A) identifying design features that encourage or increase the frequency, time spent, or activity of minors on the covered platform;

(B) safeguarding minors against the possible misuse of parental tools;

(B C) best practices in providing minors and parents the most protective level of control over privacy and safety;

(C D) using indicia or inferences of age of users for assessing use of the covered platform by minors;

(D E) methods for evaluating the efficacy of safeguards set forth in this subtitle; and

(F) providing additional parental tool options that allow parents to address the harms described in section 102(a); and

(2) outline conduct that does not have the purpose or substantial effect of subverting or impairing user autonomy, decision-making, or choice, or of causing, increasing, or encouraging compulsive usage for a minor, such as—

(A) de minimis user interface changes derived from testing consumer preferences, including different styles, layouts, or text, where such changes are not done with the purpose of weakening or disabling safeguards or parental tools; and

(B) algorithms or data outputs outside the control of a covered platform; and

(B C) establishing default settings that provide enhanced privacy protection to users or otherwise enhance their autonomy and decision-making ability.

(b) Guidance on knowledge standard.—Not later than 18 months after the date of enactment of this Act, the Federal Trade Commission shall issue guidance to provide information, including best practices and examples, for covered platforms to understand how the Commission would determine whether a covered platform “had knowledge fairly implied on the basis of objective circumstances” for purposes of this subtitle.

(b c) Limitation on Federal Trade Commission guidance.—

(1) EFFECT OF GUIDANCE.—No guidance issued by the Federal Trade Commission with respect to this subtitle shall—

(A) confer any rights on any person, State, or locality; or

(B) operate to bind the Federal Trade Commission or any court, person, State, or locality to the approach recommended in such guidance.

(2) USE IN ENFORCEMENT ACTIONS.—In any enforcement action brought pursuant to this subtitle, the Federal Trade Commission or a State attorney general, as applicable—

(A) shall allege a specific violation of a provision of this subtitle; and

(B) may not base such enforcement action on, or, as applicable, execute a consent order based on, practices that are alleged to be inconsistent with guidance issued by the Federal Trade Commission with respect to this subtitle, unless the practices are alleged to violate a provision of this subtitle.
For purposes of enforcing this subtitle, State attorneys general shall take into account any guidance issued by the Commission under subsection (b).

SEC. 110. Enforcement.

(a) Enforcement by Federal Trade Commission.—

(1) UNFAIR AND DECEPTIVE ACTS OR PRACTICES.—A violation of this subtitle shall be treated as a violation of a rule defining an unfair or deceptive act or practice prescribed under section 18(a)(1)(B) of the Federal Trade Commission Act (15 U.S.C. 57a(a)(1)(B)).

(2) POWERS OF THE COMMISSION.—

(A) IN GENERAL.—The Federal Trade Commission (referred to in this section as the “Commission”) shall enforce this subtitle in the same manner, by the same means, and with the same jurisdiction, powers, and duties as though all applicable terms and provisions of the Federal Trade Commission Act (15 U.S.C. 41 et seq.) were incorporated into and made a part of this subtitle.

(B) PRIVILEGES AND IMMUNITIES.—Any person that violates this subtitle shall be subject to the penalties, and entitled to the privileges and immunities, provided in the Federal Trade Commission Act (15 U.S.C. 41 et seq.).

(3) AUTHORITY PRESERVED.—Nothing in this subtitle shall be construed to limit the authority of the Commission under any other provision of law.

(b) Enforcement by State attorneys general.—

(1) IN GENERAL.—

(A) CIVIL ACTIONS.—In any case in which the attorney general of a State has reason to believe that a covered platform has violated or is violating section 103, 104, or 105, the State, as parens patriae, may bring a civil action on behalf of the residents of the State in a district court of the United States or a State court of appropriate jurisdiction to—

(i) enjoin any practice that violates section 103, 104, or 105;

(ii) enforce compliance with section 103, 104, or 105;

(iii) on behalf of residents of the State, obtain damages, restitution, or other compensation, each of which shall be distributed in accordance with State law; or

(iv) obtain such other relief as the court may consider to be appropriate.

(B) NOTICE.—

(i) IN GENERAL.—Before filing an action under subparagraph (A), the attorney general of the State involved shall provide to the Commission—

(I) written notice of that action; and

(II) a copy of the complaint for that action.

(ii) EXEMPTION.—

(I) IN GENERAL.—Clause (i) shall not apply with respect to the filing of an action by an attorney general of a State under this paragraph if the attorney general of the State determines that it is not feasible to provide the notice described in that clause before the filing of the action.

(II) NOTIFICATION.—In an action described in subclause (I), the attorney general of a State shall provide notice and a copy of the complaint to the Commission at the same time as the attorney general files the action.

(2) INTERVENTION.—

(A) IN GENERAL.—On receiving notice under paragraph (1)(B), the Commission shall have the right to intervene in the action that is the subject of the notice.

(B) EFFECT OF INTERVENTION.—If the Commission intervenes in an action under paragraph (1), the Commission it shall have the right—

(i) to remove the action to the appropriate United States district court;

(ii) to be heard with respect to any matter that arises in that action; and

(iii) to file a petition for appeal.

(3) CONSTRUCTION.—For purposes of bringing any civil action under paragraph (1), nothing in this subtitle shall be construed to prevent an attorney general of a State from exercising the powers conferred on the attorney general by the laws of that State to—

(A) conduct investigations;

(B) administer oaths or affirmations; or

(C) compel the attendance of witnesses or the production of documentary and other evidence.

(4) ACTIONS BY THE COMMISSION.—In any case in which an action is instituted by or on behalf of the Commission for violation of this subtitle, no State may, during the pendency of that action, institute a separate action under paragraph (1) against any defendant named in the complaint in the action instituted by or on behalf of the Commission for that violation.

(5) VENUE; SERVICE OF PROCESS.—

(A) VENUE.—Any action brought under paragraph (1) may be brought in—

(i) the district court of the United States that meets applicable requirements relating to venue under section 1391 of title 28, United States Code; or

(ii) a State court of competent jurisdiction.

(B) SERVICE OF PROCESS.—In an action brought under paragraph (1) in a district court of the United States, process may be served wherever defendant—

(i) is an inhabitant; or

(ii) may be found.

(6) LIMITATION.—A violation of section 102 shall not form the basis of liability in any action brought by the attorney general of a State under a State law.

SEC. 110. 111. Kids online safety council.

(a) Establishment.—There is Not later than 180 days after the date of enactment of this Act, the Secretary of Commerce shall established a and convene the Kids Online Safety Council (in this section referred to as the “Council”). for the purpose of providing advice on matters related to this subtitle.

(b) Duties.—The duties of the Council shall be to provide reports to Congress with recommendations and advice on matters related to the safety of minors online. The matters to be addressed by the Council shall include—

(1) identifying emerging or current harms to minors online;

(2) recommending policies for assessing, preventing, and mitigating harms to minors online;

(3) recommending best practices to promote the health and safety of minors online;

(3) recommending methods and themes for conducting research regarding harms to minors online, including in English and non-English languages; and

(4) recommending best practices and standards for transparency reports required under this title to promote overall accountability.

(c b) Number and Appointment of Members Participation.—The Kids Online Safety Council shall be comprised of 11 members, of whom—include diverse participation from—

(1) 3 members shall be appointed by the President, including—

(A) the Secretary of Commerce or a designee of the Secretary; and

(B) the Secretary of Health and Human Services or a designee of the Secretary;

(2) 2 members shall be appointed by the Speaker of the House of Representatives;

(3) 2 members shall be appointed by the Minority Leader of the House of Representatives;

(4) 2 members shall be appointed by the Majority Leader of the Senate; and

(5) 2 members shall be appointed by the Minority Leader of the Senate.

(d) Timing of Appointments.—Each of the appointments under subsection (c) shall be made not later than 180 days after the date of the enactment of this Act.

(e) Terms; Vacancies.—Each member of the Council shall be appointed for the life of the Council, and a vacancy in the Council shall be filled in the manner in which the original appointment was made.

(f) Chairperson; Vice Chairperson.—Not later than 30 days after the date on which the final member of the Council is appointed—

(1) the Chairperson shall be designated by the Speaker of the House of Representatives; and

(2) the Vice Chairperson shall be designated by the leader of the party within the Senate that is different from the party of the Speaker of the House of Representatives.

(g) Participation.—The Council shall consist of 1 member from each of the following:

(1) Academic experts with specific expertise in the prevention of online harms to minors.

(2) Researchers with specific expertise in social media studies.

(3) Parents with demonstrated experience in child online safety.

(4) Youth representatives with demonstrated experience in child online safety.

(5) Educators with demonstrated experience in child online safety.

(6) Representatives of covered platforms.

(7) Representatives of high impact online companies.

(8) State attorneys general or their designees acting in State or local government.

(9) Representatives of communities of socially disadvantaged individuals (as defined in section 8 of the Small Business Act (15 U.S.C. 637)).

(h) Reports.—

(1) Interim Report.—Not later than 1 year after the date of the initial meeting of the Council, the Council shall submit to Congress an interim report that includes a detailed summary of the work of the Council and any preliminary findings of the Council.

(2) FINAL REPORT.—Not later than 3 years after the date of the initial meeting of the Council, the Council shall submit to Congress a final report that includes—

(A) a detailed statement of the findings and conclusions of the Council;

(B) dissenting opinions of any member of the Council who does not support the findings and conclusions referred to in subparagraph (A); and

(C) any recommendations for legislative and administrative actions to address online safety for children and prevent harms to minors.

(i) TERMINATION.—The Council shall terminate not later than 30 days after the submission of the final report required under subsection (h)(2).

(1) academic experts, health professionals, and members of civil society with expertise in mental health, substance use disorders, and the prevention of harms to minors;

(2) representatives in academia and civil society with specific expertise in privacy, free expression, access to information, and civil liberties;

(3) parents and youth representation;

(4) representatives of covered platforms;

(5) representatives of the National Telecommunications and Information Administration, the National Institute of Standards and Technology, the Federal Trade Commission, the Department of Justice, and the Department of Health and Human Services;

(6) State attorneys general or their designees acting in State or local government;

(7) educators; and

(8) representatives of communities of socially disadvantaged individuals (as defined in section 8 of the Small Business Act (15 U.S.C. 637)).

(c) Activities.—The matters to be addressed by the Kids Online Safety Council shall include—

(1) identifying emerging or current risks of harms to minors associated with online platforms;

(2) recommending measures and methods for assessing, preventing, and mitigating harms to minors online;

(3) recommending methods and themes for conducting research regarding online harms to minors, including in English and non-English languages; and

(4) recommending best practices and clear, consensus-based technical standards for transparency reports and audits, as required under this subtitle, including methods, criteria, and scope to promote overall accountability.

(d) Non-applicability of FACA.—The Kids Online Safety Council shall not be subject to chapter 10 of title 5, United States Code (commonly referred to as the “Federal Advisory Committee Act”).

SEC. 111. 112. Effective date.

Except as otherwise provided in this subtitle, this subtitle shall take effect on the date that is 18 months after the date of enactment of this Act.

SEC. 112. 113. Rules of construction and other matters.

(a) Relationship to other laws.—Nothing in this subtitle shall be construed to—

(1) preempt section 444 of the General Education Provisions Act (20 U.S.C. 1232g) , (commonly known as the “Family Educational Rights and Privacy Act of 1974”) or other Federal or State laws governing student privacy;

(2) preempt the Children's Online Privacy Protection Act of 1998 (15 U.S.C. 6501 et seq.) or any rule or regulation promulgated under such Act; or

(3) authorize any action that would conflict with section 18(h) of the Federal Trade Commission Act (15 U.S.C. 57a(h)).; or

(4) expand or limit the scope of section 230 of the Communications Act of 1934 (commonly known as “section 230 of the Communications Decency Act of 1996”) (47 U.S.C. 230).

(b) Determination of “fairly implied on the basis of objective circumstances”.—For purposes of enforcing this subtitle, in making a determination as to whether a covered platform has knowledge fairly implied on the basis of objective circumstances that a specific user is a minor, the Federal Trade Commission or a State attorney general shall rely on competent and reliable evidence, taking into account the totality of the circumstances, including whether a reasonable and prudent person under the circumstances would have known that the user is a minor.

(bc) Protections for privacy.—Nothing in this subtitle, including a determination described in subsection (b), shall be construed to require—

(1) the affirmative collection of any personal data with respect to the age of users that a covered platform is not already collecting in the normal course of business; or

(2) a covered platform to implement an age gating or age verification functionality.

(d) Compliance.—Nothing in this subtitle shall be construed to restrict the a covered platform's ability of a covered platform to do the following:

(1) Ccooperate with law enforcement agencies regarding activity that the covered platform reasonably and in good faith believes may violate Federal, State, or local laws, rules, or regulation.s;

(2) Ccomply with a lawful civil, criminal, or regulatory inquiry, subpoena, or summons by Federal, State, local, or other government authority.ies; or

(3) Prevent, detect, protect against, or respond to any security incident, identity theft, fraud, harassment, malicious or deceptive activity, or any illegal activities.

(4) Investigate, report, or prosecute those responsible for any action described in paragraph (3).

(5 3) Iinvestigate, establish, exercise, respond to, or defend against legal claims.

(6) Preserve the integrity or security of the platform’s systems.

(d e) Application to video streaming services.—A video streaming service shall be deemed to be in compliance with this subtitle if

(1) the service it predominantly consists of news, sports, entertainment, or other video programming content that is preselected by the provider and that is not user-generated;, and—

(2) (3)(1) any chat, comment, or interactive functionality is provided incidental to, directly related to, or dependent on provision of such content; and

(4)(5)(2) if such in the case of a video streaming service that requires account owner registration and is not predominantly news or sports, the service includes the capability—

(A) to limit the a minor’s access of a minor to the service, which may use utilize a system of age-rating;

(B) to limit the automatic playing of on-demand content selected by a personalized recommendation system for an individual that the service knows is a minor;

(C) to provide an individual that the service knows is a minor with readily accessible and easy-to-use options to delete an account held by the minor on the service or, in the case of a service that allows a parent to create a profile for a minor, to allow a parent to delete the profile of the minor;

(D C) for a parent to manage the a minor’s privacy and account settings of a minor, and restrict purchases and financial transactions by a minor, where applicable;

(D) to provide an electronic point of contact specific to matters described in this paragraph;

(E) to provide offer a clear, conspicuous, and easy-to-understand notice of the its policies and practices of the service with respect to the capabilities described in this paragraph; and

(F) when providing on-demand content, to employ measures that safeguard against serving advertising for narcotic drugs (as defined in section 102 of the Controlled Substances Act (21 U.S.C. 802)), tobacco products, gambling, or alcohol directly to the account or profile of an individual that the service knows is a minor.
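Editor's note: to make the capabilities in subparagraphs (A) through (F) above more concrete, here is a minimal, hypothetical Python sketch of a per-profile settings model a video streaming service could expose for a known minor. None of the names or defaults come from the bill; they are purely illustrative assumptions.

```python
from dataclasses import dataclass, field

# Hypothetical sketch of per-profile controls loosely mapping to the
# capabilities listed for video streaming services; names are illustrative only.

@dataclass
class MinorProfileControls:
    age_rating_limit: str = "TV-PG"        # (A) limit a minor's access, possibly via age-rating
    autoplay_recommended: bool = False     # (B) limit autoplay of personalized on-demand content
    parental_settings_locked: bool = True  # (C) parent manages privacy and account settings
    purchases_restricted: bool = True      # (C) restrict purchases and financial transactions
    contact_point: str = "minors-safety@example.com"           # (D) electronic point of contact
    notice_url: str = "https://example.com/minor-safety-policy" # (E) clear notice of policies
    blocked_ad_categories: frozenset = field(
        default_factory=lambda: frozenset({"narcotics", "tobacco", "gambling", "alcohol"})
    )                                      # (F) ad categories not to be served to a known minor

def may_serve_ad(profile: MinorProfileControls, ad_category: str) -> bool:
    """Return False for ad categories the service must not serve to a known minor."""
    return ad_category not in profile.blocked_ad_categories
```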

SEC. 113. Severability.

If any provision of this title, or an amendment made by this title, is determined to be unenforceable or invalid, the remaining provisions of this title and the amendments made by this title shall not be affected.

TITLE II subtitle B—Filter Bubble Transparency

SEC. 201. 120. Definitions.

In this subtitle:

(1) ALGORITHMIC RANKING SYSTEM.—The term “algorithmic ranking system” means a computational process, including one derived from algorithmic decision-making, machine learning, statistical analysis, or other data processing or artificial intelligence techniques, used to determine the selection, order, relative prioritization, or relative prominence of content from a set of information that is provided to a user on an online platform, including the ranking of search results, the provision of content recommendations, the display of social media posts, or any other method of automated content selection.

(2) APPROXIMATE GEOLOCATION INFORMATION.—The term “approximate geolocation information” means information that identifies the location of an individual, but with a precision of less than 5 miles.

(3) COMMISSION.—The term “Commission” means the Federal Trade Commission.

(4) CONNECTED DEVICE.—The term “connected device” means an electronic device that—

(A) is capable of connecting to the internet, either directly or indirectly through a network, to communicate information at the direction of an individual;

(B) has computer processing capabilities for collecting, sending, receiving, or analyzing data; and

(C) is primarily designed for or marketed to consumers.

(5) INPUT-TRANSPARENT ALGORITHM.—

(A) IN GENERAL.—The term “input-transparent algorithm” means an algorithmic ranking system that does not use the user-specific data of a user to determine the selection, order, relative prioritization, or relative prominence of information that is furnished to such user on an online platform, unless the user-specific data is expressly provided to the platform by the user for such purpose.

(B) DATA EXPRESSLY PROVIDED TO THE PLATFORM.—For purposes of subparagraph (A), user-specific data that is provided by a user for the express purpose of determining the selection, order, relative prioritization, or relative prominence of information that is furnished to such user on an online platform—

(i) includes shall include user-supplied search terms, filters, speech patterns (if provided for the purpose of enabling the platform to accept spoken input or selecting the language in which the user interacts with the platform), saved preferences, the resumption of a previous search, and the current precise geolocation information that is supplied by the user;

(ii) shall includes the user's current approximate geolocation information of the user;

(iii) shall includes data submitted to the platform by the user that expresses the user's desire of the user to receive particular information, such as the social media profiles the user follows, the video channels the user subscribes to, or other content or sources of content on the platform the user has selected;

(iv) does shall not include the history of the user's connected device of the user, including the user's history of web searches and browsing, previous geographical locations, physical activity, device interaction, and financial transactions of the user; and

(v) does shall not include inferences about the user or the user's connected device of the user, without regard to whether such inferences are based on data described in clause (i) or (iii).

(6) ONLINE PLATFORM.—The term “online platform”—

(A) means any public-facing website, online service, online application, or mobile application that predominantly provides a community forum for user-generated content;

(B) includes any, such as website, service, or application that shares ing videos, images, games, audio files, or other content, including a social media service, social network, or virtual reality environment; and .

(C) does not include—

(i) chats, comments, or other interactive functionalities of the community forum that is incidental to the predominant purpose of the website, online service, online application, or mobile application; or

(ii) a product or service that primarily serves to facilitate the sale or provision of commercial products or professional services.

(7) OPAQUE ALGORITHM.—The term “opaque algorithm”

(A) IN GENERAL.—The term “opaque algorithm” means an algorithmic ranking system that determines the selection, order, relative prioritization, or relative prominence of information that is furnished to such user on an online platform based, in whole or part, on user-specific data that was not expressly provided by the user to the platform for such purpose and.

(B) does EXCEPTION FOR AGE-APPROPRIATE CONTENT FILTERS.—Such term shall not include an algorithmic ranking system used by an online platform if—

(i) the only user-specific data (including inferences about the user) that the system uses is information relating to the age of the user; and

(ii) such information is only used to restrict the a user's access of a user to content on the basis that the individual is not old enough to access such content.

(8) PRECISE GEOLOCATION INFORMATION.—The term “precise geolocation information” means geolocation information that identifies the an individual’s location of an individual to within a range of 5 miles or less.

(9) USER-SPECIFIC DATA.—The term “user-specific data” means information relating to an individual or a specific connected device that would not necessarily be true of every individual or device.
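Editor's note: the following hypothetical Python sketch contrasts a ranking function that uses only data the user expressly provided (search terms and followed sources, as in clauses (i) and (iii) of the input-transparent definition) with one that also folds in browsing history and inferred interests, which is what would make it an opaque algorithm under these definitions. All function and field names are our own assumptions, not terms from the bill.

```python
from typing import Dict, List

# Illustrative only: each item is assumed to be a dict with "title", "source",
# and optional "tags" keys. The two entry points differ only in which inputs
# they consult.

def rank_input_transparent(items: List[Dict], search_terms: List[str],
                           followed_sources: List[str]) -> List[Dict]:
    """Ranks using only data the user expressly supplied (search terms, follows)."""
    def score(item: Dict) -> int:
        term_hits = sum(term.lower() in item["title"].lower() for term in search_terms)
        follow_bonus = 2 if item["source"] in followed_sources else 0
        return term_hits + follow_bonus
    return sorted(items, key=score, reverse=True)

def rank_opaque(items: List[Dict], search_terms: List[str],
                browsing_history: List[str], inferred_interests: List[str]) -> List[Dict]:
    """Also uses history and inferences, which the definitions treat as not expressly provided."""
    def score(item: Dict) -> float:
        term_hits = sum(term.lower() in item["title"].lower() for term in search_terms)
        history_bonus = sum(h.lower() in item["title"].lower() for h in browsing_history)
        inference_bonus = sum(i in item.get("tags", []) for i in inferred_interests)
        return term_hits + 0.5 * history_bonus + 0.5 * inference_bonus
    return sorted(items, key=score, reverse=True)
```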

SEC. 202.121. Requirement to allow users to see unmanipulated content on internet platforms.

(a) In general.—Beginning on the date that is 1 year after the date of enactment of this Act, it shall be unlawful for any person to operate an online platform that uses an opaque algorithm that does not meet the following requirements: unless the person complies with the requirements of subsection (b).

(b) Opaque algorithm requirements.—

(1) Notice Required IN GENERAL.—Except as provided in paragraph (4), the online platform provides notice to users of the platform—

The requirements of this subsection with respect to a person that operates an online platform that uses an opaque algorithm are the following:

(A) The person provides users of the platform with the following notices:

(i) Notice that the platform uses an opaque algorithm that uses user-specific data to select the content the user sees,. Such notice shall be presented in a clear and conspicuous manner on the platform whenever the user interacts with an opaque algorithm for the first time, and may be a one-time notice that can be dismissed by the user; and.

(B)

(ii) Notice, to be included in the terms and conditions of the online platform, in a clear, accessible, and easily comprehensible manner that is to be updated whenever the online platform makes a material change, to of

(i I) the most salient features, inputs, and parameters used by the opaque algorithm;

(ii II) how any user-specific data used by the opaque algorithm is collected or inferred about a user of the platform, and the categories of such data;

(iii III) any options that the online platform makes available for a user of the platform to opt out or exercise options under subparagraph (2 B), modify the profile of the user or to influence the features, inputs, or parameters used by the opaque algorithm; and

(iv IV) any quantities, such as time spent using a product or specific measures of engagement or social interaction, that the opaque algorithm is designed to optimize, as well as a general description of the relative importance of each quantity for such ranking.

(B) The online platform enables users to easily switch between the opaque algorithm and an input-transparent algorithm in their use of the platform.

(2) Opt-Out.—Except as provided in paragraph (4), the online platform enables users to easily switch between the opaque algorithm and an input-transparent algorithm in their use of the platform.

(3) PROHIBITION ON DIFFERENTIAL PRICING.—An online platform shall not deny, charge different prices or rates for, or condition the provision of a service or product to a user based on the user’s election to use an input-transparent algorithm in their use of the platform, as provided under paragraph (1)(B).

(4) Exception.—Notwithstanding paragraphs (1) and (2), the online platform shall provide the notice and opt-out described in paragraphs (1) and (2) to the educational agency or institution (as defined in section 444(a)(3) of the General Education Provisions Act (20 U.S.C. 1232g(a)(3)), rather than to the user, when the online platform is acting on behalf of an educational agency or institution (as so defined), subject to a written contract that complies with the requirements of the Children’s Online Privacy Protection Act of 1998 (15 U.S.C. 1232g(a)(3)) and section 444 of the General Education Provisions Act (20 U.S.C. 1232g) (commonly known as the “Family Educational Rights and Privacy Act of 1974”).

(b) RULE OF CONSTRUCTION.—Nothing in this subsection shall be construed to require an online platform to disclose any information, including data or algorithms—

(A) relating to a trade secret or other protected intellectual property;

(B) that is privileged, proprietary, or confidential commercial business information; or

(C) that is privileged.

(3) PROHIBITION ON DIFFERENTIAL PRICING.—An online platform shall not deny, charge different prices or rates for, or condition the provision of a service or product to a user based on the user’s election to use an input-transparent algorithm in their use of the platform, as provided under paragraph (1)(B).
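Editor's note: the notice-and-switch mechanics in paragraphs (1) through (3) could look roughly like the hypothetical sketch below: a one-time, dismissible notice the first time a user encounters the opaque feed, a per-user toggle to an input-transparent feed, and no difference in price or service either way. The ranking functions are passed in as callables (for example, the two rankers from the earlier sketch). This is our illustration of the capability, not an implementation the text prescribes.

```python
from typing import Callable, List

# Hypothetical sketch of the first-use notice and the opaque/input-transparent
# switch; the bill requires the capability, not any particular code shape.

class FeedPreferences:
    def __init__(self):
        self.seen_opaque_notice = False     # one-time notice, dismissible by the user
        self.use_input_transparent = False  # the user's easy switch (paragraph (2))

def serve_feed(prefs: FeedPreferences, items: List[dict],
               transparent_ranker: Callable[[List[dict]], List[dict]],
               opaque_ranker: Callable[[List[dict]], List[dict]]) -> List[dict]:
    if prefs.use_input_transparent:
        # Same product, same price (paragraph (3)); only the ranking inputs change.
        return transparent_ranker(items)
    if not prefs.seen_opaque_notice:
        # Clear and conspicuous notice on first interaction with the opaque algorithm.
        print("Notice: this feed is ranked by an algorithm that uses data about you "
              "to select what you see. You can switch to a feed ranked only on inputs "
              "you expressly provide in Settings.")
        prefs.seen_opaque_notice = True
    return opaque_ranker(items)
```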

(c) Enforcement by Federal Trade Commission.—

(1) UNFAIR OR DECEPTIVE ACTS OR PRACTICES.—A violation of this section by an operator of an online platform shall be treated as a violation of a rule defining an unfair or deceptive act or practice prescribed under section 18(a)(1)(B) of the Federal Trade Commission Act (15 U.S.C. 57a(a)(1)(B)).

(2) POWERS OF COMMISSION.—

(A) IN GENERAL.—The Federal Trade Commission shall enforce this section in the same manner, by the same means, and with the same jurisdiction, powers, and duties as though all applicable terms and provisions of the Federal Trade Commission Act (15 U.S.C. 41 et seq.) were incorporated into and made a part of this section.

(B) PRIVILEGES AND IMMUNITIES.—Any online platform that person who violates this title section shall be subject to the penalties and entitled to the privileges and immunities provided in the Federal Trade Commission Act (15 U.S.C. 41 et seq.).

(C) AUTHORITY PRESERVED.—Nothing in this section shall be construed to limit the authority of the Commission under any other provision of law.

(d) Rule of construction to preserve personalized blocks.—Nothing in this section shall be construed to limit or prohibit the ability of an online platform’s ability to, at the direction of an individual user or group of users, restrict another user from searching for, finding, accessing, or interacting with the  such user’s or group’s account, content, data, or online community of the user or groups of users.

SEC. 203. Severability.

If any provision of this title, or an amendment made by this title, is determined to be unenforceable or invalid, the remaining provisions of this title and the amendments made by this title shall not be affected.

Title III —Relationship to State Llaws; severability

SEC. 301. 130. Relationship to State laws.

The provisions of this Act title shall preempt any State law, rule, or regulation only to the extent that such State law, rule, or regulation conflicts with a provision of this Act title. Nothing in this Act title shall be construed to prohibit a State from enacting a law, rule, or regulation that provides greater protection to minors than the protection provided by the provisions of this title.

SEC. 131. Severability.

If any provision of this title, or an amendment made by this title, is determined to be unenforceable or invalid, the remaining provisions of this title and the amendments made by this title shall not be affected.