April 11, 2022
What I'm Reading
Cross-posted from Public Interest Privacy Consulting LLC Blog
Two companies announced school partnerships to "tackle student mental health," one of them by incorporating "information about a student's social-emotional health and well-being into their college preparation, potentially helping schools give students more personalized attention in their post-graduation plans."
This Government Technology article reads like a company press release, with no discussion of the massive privacy implications. Collecting student mental health information without embedding privacy protections (or even acknowledging that privacy matters by, say, mentioning it in the press release) is likely to disproportionately harm already vulnerable student groups and cause understandable backlash from communities who think schools are overreaching by collecting this data in the first place. There are serious mental health issues that schools need to address, but tech is only a tool, not a solution. It is irresponsible to write about these tools without asking questions about privacy.
In December 2021, a Markup investigation found that "Popular Family Safety App Life360 Is Selling Precise Location Data" on its 33 million users (aka children whose parents use the app to track their movements). Fortunately, that fantastic journalism prompted the company to stop selling precise location data… to all but one data broker. This article reports that U.S. regulators are now asking questions as well.
Too often, technology meant to surveil children for their own good - especially tech that is sold to parents - is found to have questionable privacy and security practices. Perhaps that should give parents, schools, and policymakers pause before adopting this tech.
"After backlash, an educational services company paused a rollout that would prevent kids from borrowing books under certain tags like (hypothetically) 'LGBT.'"
Worth Watching
The Markup held a webinar on March 29 focused on student privacy.
If you care about these issues and don't know what The Markup is, you should start following them ASAP. Their journalists have conducted a number of student privacy-related investigations, and in January they put out a call asking readers to "Help Us Investigate the Ed Tech Industry."
The Council of Europe held a recorded conference on "A New Era for the Rights of the Child," which launched the new Strategy for the Rights of the Child (2022-2027).
If you want to keep up with European discussions on child privacy, have a junior staffer (or you, if you're a child privacy nerd like me) take notes on this event.
Worth Reading
We were blessed with an overabundance of great student privacy scholarship over the past week!
"Like its counterpart in the criminal justice system, dirty data-data that is inaccurate, incomplete, or misleading-in K-12 education records creates and catalyzes catastrophic life events. Dirty data created, collected, and processed as accurate and reliable, notwithstanding the disproportionate impact of school discipline, on marginalized students in general, and Black children specifically, is exactly the kind of harm that FERPA was intended to prevent."
Digital literacy efforts to date have focused on helping young people become better users of technology, but the most influential and insidious digital technologies over the next few years are likely to be technologies that are 'used on' people: data-driven automated technologies. This calls for rethinking what has previously been discussed as 'digital literacy' as a form of 'algorithmic literacy.'
The review of 30 empirical articles shows that negative online experiences undermine young people's well-being but are also essential to developing online resilience.
This article maps the claims, grounds, and warrants of technological solutions for maintaining student data privacy in learning analytics. Our findings suggest that many technological solutions rest on assumptions, such as that individuals have control over their data, which can be exchanged under agreed conditions, or that individuals embrace their personal data privacy as a human right to be respected and protected. We consider alternative approaches to viewing (student) data privacy, such as contextual integrity; data privacy as ontological; group privacy; and Indigenous understandings of privacy.
Universities are developing learning analytics initiatives that include academic library participation, but libraries rarely inform their students about learning analytics projects or general library data practices. Findings demonstrate that students considered librarian access to and sharing of personally identifiable information to constitute a privacy violation but also lacked awareness of the data and analytic practices on which libraries rely.
Online learning during COVID-19 was often put in place without adequate consideration of the data protection risks posed by various online learning tools. Although the GDPR provides a framework of regulations and rights to protect users, the legal process is unwieldy to apply due to tensions in balancing the rights of the child learner with the public need to ensure that all children are provided with an education. This paper recommends changes in digital schooling practices so that children have realistic ways of enforcing their data protection rights, as well as a clarified and uniform approach to support schools.