From Data Privacy to Discrimination:

Examining the Legal Ramifications of AI in Schools

April 2024

Morgan Sexton and Amelia Vance

CC BY-NC 4.0

Introduction

The incorporation of artificial intelligence (AI) seems to be everywhere at the moment, and schools are no exception. While using AI to improve systems and educational outcomes for students is an exciting prospect, it should be done carefully and with consideration for the legal landscape. AI in education is subject to a myriad of education, child privacy, consumer protection, and civil rights laws. We've created this brief to provide an overview of the laws that educators and administrators should consider before integrating AI-powered tools.

Federal

Many existing federal laws will impact how AI can be used in education, including laws that were passed long before modern technology was widely incorporated into classroom instruction. The following are some examples of major federal laws and how they may apply to AI in education.

Family Educational Rights and Privacy Act (FERPA)

FERPA requires schools to protect the privacy of personally identifiable information (PII) in education records and gives parents (and eligible students) certain rights over that information. The amount of PII in education records grows as more technology is used in schools, especially with AI systems that are trained on and continually process student data. FERPA’s high standard for de-identification can limit the type and amount of student PII that can be used in AI systems, since the risk of re-identifying students grows as more information is included and AI systems may reveal sensitive student information. Additionally, it can be difficult to determine whether AI tools protect PII in accordance with FERPA and to facilitate the exercise of parents’ and eligible students’ FERPA rights.

TIP: Review a vendor’s privacy policy and other statements about a product to better understand how it may protect student PII. 

The Elementary and Secondary Education Act (ESEA) and the Every Student Succeeds Act (ESSA)

ESEA, the primary law governing K-12 education, directs educators to implement interventions that are grounded in research. ESSA, the latest reauthorization of ESEA, requires “evidence-based interventions.” Interventions adopted under ESSA (including AI-powered technologies) must meet ESSA’s tiers of evidence, which is usually accomplished through formal studies and research.

 

Individuals with Disabilities Education Act (IDEA)

IDEA ensures that students with disabilities have access to a free appropriate public education. IDEA includes specific privacy and nondiscrimination requirements, which can be implicated by the use of AI in schools. AI tools can be used as part of a student’s individualized education program (IEP) to provide educational services customized to support the student’s needs. 

Protection of Pupil Rights Amendment (PPRA)

PPRA requires schools to give parents an opportunity to opt their children in to, or out of, certain data collection involving eight protected topics. AI tools that ask for (or otherwise enable students to input) information related to those protected topics may implicate parental consent rights under PPRA. Additionally, PPRA can be implicated when AI tools are used to administer surveys to students. 

Title VI of the Civil Rights Act of 1964 (Title VI)

Title VI prohibits discrimination on the basis of race, color, or national origin in educational programs and activities. AI tools used in schools may perpetuate race-based discrimination when the systems are trained on datasets that lack diversity or reflect the biases of the systems’ creators or historical racial inequities. Using such tools can further entrench systemic racism and inequality in education.

Title IX of the Education Amendments of 1972 (Title IX)

Title IX prohibits discrimination on the basis of sex in educational programs and activities. AI tools used in schools may perpetuate sex-based discrimination when the systems are trained on unrepresentative datasets or reflect gender stereotypes or the biases of the systems’ creators. This can be harmful to students using such AI tools, especially students whose sexual orientations or gender identities are not represented in the dataset.

The Americans with Disabilities Act (ADA)

Title II of the ADA prohibits disability discrimination by all public entities, including educational agencies and institutions. Assistive technologies with AI functionalities can help students with disabilities access educational experiences. Disability discrimination can result when schools implement overly broad bans on the use of AI or require the use of AI tools that are not designed to be accessible or inclusive for students with disabilities.


Section 504 of the Rehabilitation Act (Section 504)

Section 504 prohibits disability discrimination in educational programs and activities that receive federal financial assistance. Similar to the ADA (discussed above), assistive technologies with AI functionalities can help students with disabilities gain access to educational experiences. Disability discrimination can result when schools implement overly broad bans on the use of AI or require the use of AI tools that are not designed to be accessible or inclusive for students with disabilities.

Children’s Online Privacy Protection Act (COPPA)

COPPA requires companies to obtain verifiable consent from parents (or schools, where applicable) before collecting or disclosing personal information from children under 13. COPPA does not regulate schools directly, but it often regulates school technology providers (including edtech companies) that collect data from students under 13. COPPA is implicated whenever AI tools used in schools collect information from children, and it may limit how children’s data can be used to train AI tools.

Children’s Internet Protection Act (CIPA)

CIPA requires entities that receive E-Rate funding to filter internet content and monitor students’ online activity. Schools often use AI tools to facilitate this content filtering and student monitoring. 

Section 5 of the Federal Trade Commission Act (FTC Act)

Section 5 of the FTC Act prohibits unfair or deceptive trade practices and empowers the Federal Trade Commission (FTC) to bring enforcement actions against companies that violate the Act. The FTC has repeatedly used this authority to regulate how companies (including edtech providers) implement AI tools and govern data.

State

Unlike at the federal level, where laws are often slower to adapt to technological advancements, state policymakers have more flexibility to address potential concerns surrounding AI in a timely manner. Many states have taken significant measures to regulate AI, such as forming specialized task forces, issuing executive orders, and publishing guidance. 

In addition to laws specifically focused on AI in education, a wide variety of state laws can impact the use of AI in education, including:

AI and Government Laws 

Several states have passed laws regulating how governmental entities can use AI tools. Public schools are governmental entities, which means the use of AI in education may be regulated under these laws unless there is a specific exception for schools and their service providers. 


State Student Privacy Laws

States have passed almost 150 student privacy laws since 2014. These laws tend to fall into two broad categories: laws regulating educational institutions and laws regulating vendors. Both types of state student privacy laws may regulate how AI tools are permitted to be used in classrooms, as well as how student data can be used to train algorithms.

State Consumer Privacy Laws

16 states* have passed comprehensive consumer privacy laws as of April 2024, some of which also apply to nonprofits. These laws may give consumers certain rights when AI tools are used, including the right to opt out of automated decision-making, and may provide heightened protections for children.

Record Retention and Management Laws

Every state has a law related to the retention and management of education records. These laws may be implicated when student information is used to train algorithms or is otherwise input into AI tools. 

State Child Privacy Laws

Several states have passed laws that create or expand privacy rights for minors. One example is children’s online safety laws, which require technology providers to protect children from potential harms on their platforms, including by prohibiting the use of AI to increase children’s engagement on the platform.


State Unfair and Deceptive Practices Laws

Every state has a law prohibiting unfair and deceptive practices, often mirroring and building upon Section 5 of the FTC Act (discussed above); some of these laws also apply to nonprofits. Many state unfair and deceptive practices laws govern the use of AI tools.

Laws Restricting Specific Technologies

Several states have passed laws restricting or prohibiting the use of emerging technologies that incorporate certain AI-powered functionalities. These laws may regulate the use of biometric and facial recognition systems (such as New York’s moratorium on facial recognition technologies in schools), personalized recommendation and content-filtering algorithms, automated decision-making with material legal implications, and other technologies. State laws imposing technology-specific restrictions may impact how different AI-powered technologies can be used in education.

*16 states have passed comprehensive consumer privacy laws since 2018, most of which were passed in the last 2-3 years: CA, CO, CT, DE, FL, IA, IN, KY, MT, NH, NJ, OR, TN, TX, UT, and VA. (Note: Florida's law is narrower than others.)

Local Entities

Local governments and other entities can also play a key role in regulating AI in education. Private and independent schools may have their own AI policies. Be sure to check whether your local entities have adopted their own AI guidance or policies, including your city or county (some have banned governmental entities from using AI-based technologies like facial recognition) and your local school board.

Closing Thoughts

While there are many potential benefits to incorporating AI in education, schools must also be mindful of the legal implications that come with its use. From safeguarding student privacy to addressing bias and discrimination, schools are responsible for adhering to a variety of complex federal, state, and local laws and regulations when implementing AI. Our aim is to shed light on this rapidly changing legal landscape, equipping educators and administrators with the baseline knowledge they need to identify the relevant laws.