The Week in Child & Student Privacy

April 1, 2022

What I'm Reading

Cross-posted from Public Interest Privacy Consulting LLC Blog

What's Happening

This bill "says that any social media platform with more than 1 million account holders (and operating in Minnesota) cannot use an algorithm to recommend content to users under the age of 18." As the Chamber of Progress CEO points out, "YouTube Kids uses algorithms and manual curation to surface content appropriate for children" and Twitter's algorithms help users find relevant content - so, if this bill passes, the very algorithmic curation that advocates have asked companies to use to build a safer internet for kids would be banned in Minnesota.

Why You Should Care

This bill undermines child wellbeing and privacy instead of protecting it, and passing it would set a dangerous precedent for child privacy protections in other states.

What's Happening

California is considering its own version of the Age-Appropriate Design Code (AADC), which was extensively discussed at the informational hearing I testified at on Tuesday (I recommend watching the parts of the hearing with me, obviously, but especially the panel starting with Baroness Kidron at 1:49:36). I think trying out some version of the AADC in the U.S. makes sense - it provides underlying, unwaivable privacy protections for kids, eliminating the need for a consent-centric framework that doesn't work. However, there are issues: U.S. parents don't want the government to override their decisions about what content or technology is inappropriate for their children to access, and there are serious First Amendment limitations. Jessica Rich, former director of the FTC's Bureau of Consumer Protection, discusses some of the other potential issues in this blog.

Why You Should Care

Like it or not, the AADC is coming to a theater near you. The UK ICO started enforcing the Code as of September 2021, and we're seeing massive changes made by big tech companies to come into compliance. Importantly, AADC doesn't just apply to the narrow "actual knowledge" standard of COPPA; it also applies to services likely to be accessed by children. Unless your site is just for adults, you might be swept into its requirements.

What's Happening

Senators Warren and Markey released a report on the responses they received from student monitoring companies in 2021. The report finds that:

  1. Student activity monitoring software may be misused for disciplinary purposes and result in increased contact with law enforcement;
  2. Software companies have not taken any steps to determine whether student activity monitoring software disproportionately targets students from marginalized groups, leaving schools in the dark;
  3. Schools, parents, and communities are not being appropriately informed of the use, and potential misuse, of the data; and
  4. Regulatory and legal gaps exacerbate the risks of student activity monitoring software.

Why You Should Care

The report recommends that the FCC issue guidance on CIPA (the law schools must follow to receive E-Rate funding) clarifying exactly what schools need to do to monitor their students; that the Department of Education require local education agencies to track the potential impacts of these tools on protected classes; and that monitoring companies use de-identified demographic data to examine the impact of their algorithms on protected classes and share the results.

As many of you know, I've been writing about student monitoring since 2016, and it is personally important to me. The recommendations in this report are pretty good and recognize the many nuances involved in this issue. However, I am concerned that any clarification by the FCC about monitoring would end up requiring more monitoring (see Sen. Cornyn's bill advocating for much more monitoring in the name of school safety).

The report generally emphasizes the use of algorithms in edtech that could harm protected classes. Companies should keep an eye on whether the recommendations in this report spread beyond student monitoring software. As seen with Minnesota's bill, algorithms are not well understood, especially related to kids, and will likely continue to pop up.

What's Happening

Child privacy is actively being discussed in DC after the SOTU mention of child privacy (which was included in the President's mental health initiatives). The Kids Online Safety Act (not to be confused with the KIDS Act) "would require social media platforms [to] put the interests of children first by requiring platforms to make safety the default and to give kids and parents tools to help prevent the destructive impact of social media" and "ensures that parents and policymakers can assess whether social media platforms are taking meaningful steps to address risks to kids." While it is important to update U.S. child privacy laws, this bill has some serious privacy-reducing implications, as EFF points out in this article.

Why You Should Care

Privacy regulation as the solution to teen mental health challenges is a growing trend. There are real correlations (and some causative factors) between online platforms and teen mental health harms, especially when dark patterns are being used. However, to truly protect child privacy, we have to think less about the internet as "tobacco" that should be banned for kids and more like "car safety" that needs to be regulated.

What's Happening

The FTC and DOJ announced a settlement agreement with WW International after "charging them with violating COPPA for improperly collecting health information and other data from children as young as eight years old. The settlement requires the deletion of all affected work product - which includes algorithms - resulting from the companies' collection of children's data." This is the third FTC settlement to require algorithmic disgorgement, but the first one in the COPPA context.

Why You Should Care

Requiring the deletion of algorithms derived from (potentially) ill-gotten data is a powerful deterrent, giving companies a strong incentive to ensure they comply with Section 5 and/or COPPA, and the FTC seems likely to continue requiring it in settlements moving forward. However, as this article discusses, deleting algorithms may be more complicated than it sounds.

What's Happening

Parental rights legislation is being discussed all over the country (and federally), often requiring that parents "be notified when their children borrow items" or that libraries "block a student's ability to check out specified materials at their parent's request." Follett, which runs the library management system used by most K-12 school libraries, is creating an "optional functionality that can meet these new requirements." As you might imagine, privacy-protective librarians are up in arms.

Why You Should Care

Too often, policymakers create laws for an image of a stereotypical family, ignoring that, unfortunately, not all parents have their child's best interests at heart. I'm scared for the LGBTQIA kids; heartbreakingly, getting thrown out of home didn't end with Obergefell. When I first realized I liked boys and girls, I immediately went to the library (yes, I know I'm a geek) and looked for community in young adult books with kids like me who grew up and were happy. This was vital to my mental health. Despite my parents always being supportive in other areas of my life, I did not want them to know until I was ready (and felt safe enough) to tell them.

I'm also worried about abused kids who might not be able to access resources. And I'm worried about the child with different educational interests than their family - children's book character Matilda comes to mind - who might never be able to grow into the person they were meant to be because of these laws. This fight isn't just happening in state legislatures; school districts are having to make a call on how to respond to these requests, and are often brutally maligned when they make a decision that protects kids. Pay attention to what is happening in your community, and be a voice for supporting the kids who most need access to information and community.

Resources Worth Your Time

What's Happening

As discussed above, policymakers everywhere are focusing on child privacy online. However, this isn't new: many of the ideas being discussed today were passed into law in the 90s and found to be unconstitutional. This new Congressional Research Service report gives a great overview of child privacy and legal concepts implicated in the content-based regulation of expression under the First Amendment.

Why You Should Care

If you are interested in child privacy policymaking, you MUST read this report to understand what the U.S. can and cannot constitutionally adopt.

What's Happening

The Surveillance Technology Oversight Project (S.T.O.P.) has launched "an education campaign detailing the dangers and responses to smart city technology" to help "activists, lawmakers, and members of the public identify common risks from expanding municipal technology in policing, education, and transit."

Why You Should Care

S.T.O.P. has become a bigger and bigger player in the student privacy space over the past 3+ years, and this toolkit has the potential to be the default talking points used by activists and policymakers moving forward.

Tweets I Enjoyed This Week

h/t to @hypervisible for both!

Tweet by Sasha Costanza-Chock: In this paper, I will demonstrate how the Encanto is actually a surveillance state where the benevolent dictator Abuela deploys her kin as proxies to mete out displays of absolute force, magico-medical health care, total audio surveillance, & advanced infiltration tech (1/237)

For those interested in promoting children's social and emotional development (SEL):

ROCK - right-wingers attack SEL, claiming it's linked to critical race theory: https://nbcnews.to/36x5pdT

HARD PLACE - Big Tech turns SEL into a student-monitoring profit source: https://www.nationofchange.org/2022/03/19/how-big-tech-sees-big-profits-in-social-emotional-learning-at-school/