Authored by PIPC Staff

Student Privacy Primer

Juliana Cotto, Edith Mandinach, Amelia Vance, Jim Siegl, Anisha Reddy, Tyler Park, and Jasmine Parks

This primer explains the concepts of student data, including who uses the data and why; data privacy in general; student data privacy; student data privacy risks and harms; how student data privacy relates to data ethics and data equity; key federal privacy laws; key district and school policies; and what it means to foster a culture of privacy. Each section, along with the conclusion, lists additional resources to help education stakeholders learn more about student data privacy. […]

Recommendations for School Districts

Part IV: Recommendations
How to Reduce Risks, Ensure Equity, and Protect Student Privacy When Implementing Self-Harm Monitoring Programs

As schools and districts attempt to protect students amid the pandemic’s strain on student well-being, education stakeholders should remember that privacy protections can strengthen mental health support programs: students are more willing to ask adults for help when they know the information they share will remain confidential. Before adopting monitoring technology, schools and districts should understand key facts about how the technology works, how its implementation may affect students with mental health needs or disabilities, and how […]

Legal Considerations

Part III: Legal Considerations
Legal Considerations for School Districts

In addition to understanding privacy and equity impacts, schools should be aware of the important legal implications of adopting monitoring technologies and collecting student information related to mental health and potential self-harm. Beyond CIPA (described here), several federal laws and protections may influence how school districts can implement self-harm monitoring programs, manage the student information collected through such programs, and interact with students identified through self-harm monitoring. State laws may apply as well. Schools should be aware of federal and state regulations that may […]

Privacy and Equity Concerns: Questions 6 & 7

Part II: Privacy and Equity Concerns
6. How is student information shared with third parties, if at all, and are such disclosures permitted by law?

Another key privacy consideration is whether and how schools share student information collected from monitoring programs, including individual students’ flagged status, with third parties such as law enforcement entities, hospitals, or social services providers. School districts may be inclined to share a student’s flagged status or mental health information with law enforcement because of biases and misconceptions that conflate mental health problems with violence, or even because of a lack of school-based mental health resources […]

Privacy and Equity Concerns: Questions 4 & 5

Part II: Privacy and Equity Concerns
4. What harms, such as stigma or discrimination, may stem from sharing students’ information or flagged status?

Using self-harm monitoring systems raises potential risks of stigma or discrimination. Biases embedded in public perception and media coverage lead to exaggerated fears that students experiencing mental health challenges are prone to violent acts,65 even though most people with mental health needs have no propensity for violence.66 As a result of such biases, school staff may treat flagged students differently from their peers or subject them to additional scrutiny. The common but false assumption that flagged students […]

Privacy and Equity Concerns: Questions 1-3

Part II: Privacy and Equity Concerns
Privacy and Equity Concerns Raised by Self-Harm Monitoring Technology

Before adopting self-harm monitoring technology, schools and districts should understand the risks it can pose to students’ privacy and safety and carefully weigh those risks against any benefits. Schools have adopted self-harm monitoring technologies widely and rapidly, even though these tools are relatively new and unstudied.49 Over the past two years, adoption increased as concerns grew about students struggling with mental health during the COVID-19 pandemic.50 These facts raise important questions about the privacy risks and implications of monitoring that schools must […]

What Is Self-Harm Monitoring Technology and How Do Schools Use It?

Part I: Background
What Is Self-Harm Monitoring Technology and How Do Schools Use It?

Schools often adopt self-harm monitoring technology with the best intentions: to help keep students safe and improve their well-being. However, if implemented without due consideration of the significant privacy and equity risks posed to students, these programs may harm the very students who need the most support or protection while failing to fulfill their intended purpose of preventing self-harm. Before adopting self-harm monitoring technology, schools and districts should understand the risks it can pose to students’ privacy and safety, take thoughtful steps to mitigate those […]

Teaching privacy and ethical guardrails for the AI imperative in education

Evan Selinger & Amelia Vance
Originally published by the NSW Department of Education, Future EDge, Issue 3, December 2020

Introduction

In 1956, computer scientist John McCarthy coined the phrase ‘artificial intelligence’ (AI) to describe ‘the science and engineering of making intelligent machines’ (McCarthy, 2007). Over time, the term has evolved to cover a variety of technologies, including ones widely used in education, from plagiarism detectors to voice-activated virtual assistants used to enhance campus information distribution and classroom pedagogy (Arizona State University, 2018). Contemporary AI discussions are about ‘a variety […]

Student Privacy and Special Education: An Educator’s Guide During and After COVID-19

August 4, 2020
Lindsey Barrett and Amelia Vance
National Center for Learning Disabilities & Future of Privacy Forum
Shared under a Creative Commons license

COVID-19 has disrupted education and forced schools to pivot quickly to distance learning, which is often virtual. Using virtual learning products raises concerns about student privacy, including for students receiving special education and related services. Federal privacy laws don’t explicitly address how to handle every situation, but concerns about privacy should not be a barrier to serving students as best […]

Student Privacy’s History of Unintended Consequences: Lessons Learned and Key Principles for Student Privacy

IV. LESSONS LEARNED AND KEY PRINCIPLES FOR STUDENT PRIVACY
Previously published in the Seton Hall Journal of Legislation and Public Policy

The unintended effects discussed above are emblematic of the challenges that privacy legislation has posed in the last decade, echoing many issues that arose when the U.S. Congress passed FERPA in 1974. For legislators, this history offers more than a cautionary tale; it suggests specific lessons and principles that policymakers can use to change the trajectory of future privacy legislation. For example, some of the student privacy laws were passed hastily […]
