The Week in Child & Student Privacy

April 11, 2022

What I'm Reading

Cross-posted from Public Interest Privacy Consulting LLC Blog

What's Happening

Two companies announced school partnerships to "tackle student mental health," one of them by incorporating "information about a student's social-emotional health and well-being into their college preparation, potentially helping schools give students more personalized attention in their post-graduation plans."

Why You Should Care

This Government Technology article reads like a company press release, with no discussion of the massive privacy implications. Collecting student mental health information without embedding privacy protections (or even acknowledging privacy's role by, for example, mentioning it in your press release) is likely to disproportionately harm already vulnerable student groups and to provoke understandable backlash from communities who think schools are overreaching by collecting this data in the first place. Schools do need to address serious mental health issues, but tech is only a tool, not a solution. It is irresponsible to write about these tools without asking questions about privacy.

What's Happening

In December 2021, a Markup investigation found that "Popular Family Safety App Life360 Is Selling Precise Location Data" on its 33 million users (aka children whose parents use the app to track their movements). Fortunately, this fantastic journalism made the company stop selling precise location data… to all but one data broker. This article reports that U.S. regulators are now asking questions as well.

Why You Should Care

Too often, technology meant to surveil children for their own good - especially tech that is sold to parents - is found to have questionable privacy and security practices. Perhaps that should give parents, schools, and policymakers pause before adopting this tech.

What's Happening

The House and Senate recently passed the College Transparency Act (CTA), and it is heading to conference. The bill, which includes numerous privacy protections, overturns the current federal ban on linking education data the federal government already collects, so that students, postsecondary institutions, and the public get information that could be used to improve policies or better target federal funding. For more info, check out the blog I wrote about CTA back in 2017.

Why You Should Care

This article starts by saying that schools collect "detailed data on children" (despite this bill only covering postsecondary student data) and adds an Orwell reference. Readers could be forgiven for being concerned about CTA since the article skates over the numerous privacy protections in the bill and its narrow scope. For example, there will not be new data collection - this bill simply allows federal agencies to share the data they already have to give students the information they need to make informed college choices. There are plenty of student privacy issues that advocates should be focused on. This isn't one of them.

"After backlash, an educational services company paused a rollout that would prevent kids from borrowing books under certain tags like (hypothetically) 'LGBT.'"

Worth Watching

What's Happening

The Markup held a webinar on March 29 focused on student privacy.

Why You Should Care

If you care about these issues and don't know what The Markup is, you should start following them ASAP. Their journalists have conducted a number of student privacy investigations, and they put out a call in January asking readers to "Help Us Investigate the Ed Tech Industry."

What's Happening

The Council of Europe held a recorded conference on "A New Era for the Rights of the Child," which launched the new Strategy for the Rights of the Child (2022-2027).

Why You Should Care

If you want to keep up with European discussions on child privacy, have a junior staffer (or you, if you're a child privacy nerd like me) take notes on this event.

Worth Listening

Worth Reading

We were blessed with an overabundance of great student privacy scholarship over the past week!

TLDR

"Like its counterpart in the criminal justice system, dirty data-data that is inaccurate, incomplete, or misleading-in K-12 education records creates and catalyzes catastrophic life events. Dirty data created, collected, and processed as accurate and reliable, notwithstanding the disproportionate impact of school discipline, on marginalized students in general, and Black children specifically, is exactly the kind of harm that FERPA was intended to prevent."

TLDR

Digital literacy efforts to date have focused on helping young people become better users of technology, but the most influential and insidious digital technologies over the next few years are likely to be technologies that are 'used on' people: data-driven automated technologies. This calls for rethinking what has previously been framed as 'digital literacy' as a form of 'algorithmic literacy.'

TLDR

The review of 30 empirical articles shows that negative online experiences undermine young people's well-being but are also essential to developing online resilience.

TLDR

This article maps the claims, grounds, and warrants of technological solutions for maintaining student data privacy in learning analytics. The findings suggest that many technological solutions rest on assumptions, such as that individuals have control over their data, which can be exchanged under agreed conditions, or that individuals embrace their personal data privacy as a human right to be respected and protected. The authors consider alternative approaches to viewing (student) data privacy, such as contextual integrity; data privacy as ontological; group privacy; and Indigenous understandings of privacy.

TLDR

Universities are developing learning analytics initiatives that include academic library participation, but libraries rarely inform their students about learning analytics projects or general library data practices. Findings demonstrate that students considered librarian access to and sharing of personally identifiable information to constitute a privacy violation but also lacked awareness of the data and analytic practices on which libraries rely.

TLDR

Online learning during COVID-19 was often put in place without adequate consideration of the data protection risks posed by various online learning tools. Although the GDPR provides a framework of regulations and rights to protect users, the legal process is unwieldy to apply because of the tension between the rights of the child learner and the public need to ensure that all children receive an education. This paper recommends changes in digital schooling practices so that children have realistic ways of enforcing their data protection rights, as well as a clarified and uniform approach to support schools.