April 1, 2022
What I'm Reading
Cross-posted from Public Interest Privacy Consulting LLC Blog
This bill "says that any social media platform with more than 1 million account holders (and operating in Minnesota) cannot use an algorithm to recommend content to users under the age of 18." As the Chamber of Progress CEO points out, "YouTube Kids uses algorithms and manual curation to surface content appropriate for children," and Twitter's algorithms help users find relevant content - so, if this bill passes, the very algorithmic tools companies have been asked to use to make a safer internet for kids would be banned in Minnesota.
This bill undermines child wellbeing and privacy instead of protecting it, and passing it would set a dangerous precedent for child privacy protections in other states.
California is considering its own version of the Age-Appropriate Design Code (AADC), which was extensively discussed at the informational hearing I testified at on Tuesday (I recommend watching the parts of the hearing with me, obviously, but especially the panel starting with Baroness Kidron at 1:49:36). I think trying out some version of the AADC in the U.S. makes sense - it provides underlying, unwaivable privacy protections for kids, eliminating the need for a consent-centric framework that doesn't work. However, there are issues: U.S. parents don't want government to override their decision on what content or technology is inappropriate for their children to access, and there are serious First Amendment limitations. Jessica Rich, former director of the FTC's Bureau of Consumer Protection, discusses some of the other potential issues in this blog.
Like it or not, the AADC is coming to a theater near you. The UK ICO started enforcing the Code as of September 2021, and we're seeing massive changes made by big tech companies to come into compliance. Importantly, AADC doesn't just apply to the narrow "actual knowledge" standard of COPPA; it also applies to services likely to be accessed by children. Unless your site is just for adults, you might be swept into its requirements.
- Student activity monitoring software may be misused for disciplinary purposes and result in increased contact with law enforcement;
- Software companies have not taken any steps to determine whether student activity monitoring software disproportionately targets students from marginalized groups, leaving schools in the dark;
- Schools, parents, and communities are not being appropriately informed of the use, and potential misuse, of the data; and
- Regulatory and legal gaps exacerbate the risks of student activity monitoring software.
Child privacy is actively being discussed in DC after the SOTU mention of child privacy (which was included in the President's mental health initiatives). The Kids Online Safety Act (not to be confused with the KIDS Act) "would require social media platforms to put the interests of children first by requiring platforms to make safety the default and to give kids and parents tools to help prevent the destructive impact of social media" and "ensures that parents and policymakers can assess whether social media platforms are taking meaningful steps to address risks to kids." While it is important to update U.S. child privacy laws, this bill has some serious privacy-reducing implications, as EFF points out in this article.
Privacy as the solution to teen mental health challenges is a growing trend. There are real correlations (and likely causative factors) between platform design and teen mental health, especially where dark patterns are used. However, to truly protect child privacy, we have to think of the internet less as "tobacco" that should be banned for kids and more as "car safety" that needs to be regulated.
The FTC and DOJ announced a settlement agreement with WW International after "charging them with violating COPPA for improperly collecting health information and other data from children as young as eight years old. The settlement requires the deletion of all affected work product - which includes algorithms - resulting from the companies' collection of children's data." This is the third FTC settlement to require algorithmic disgorgement, but the first one in the COPPA context.
Requiring that algorithms derived from (potentially) ill-gotten gains be deleted is a powerful deterrent, giving companies a strong incentive to ensure they comply with Section 5 and/or COPPA, and it seems likely that the FTC will continue to require this as part of settlements moving forward. However, as this article discusses, deleting algorithms may be more complicated than it sounds.
Parental rights legislation is being discussed all over the country (and federally), often requiring that parents be notified when their children borrow library items, or blocking a student's ability to check out specified materials at their parent's request. Follett, which runs the library management system used by most K-12 school libraries, is creating an "optional functionality that can meet these new requirements." As you might imagine, privacy-protective librarians are up in arms.
Resources Worth Your Time
As discussed above, policymakers everywhere are focusing on child privacy online. However, this isn't new: many of the ideas being discussed today were passed into law in the 90s and found to be unconstitutional. This new Congressional Research Service report gives a great overview of child privacy and legal concepts implicated in the content-based regulation of expression under the First Amendment.
The Surveillance Technology Oversight Project (S.T.O.P.) has launched "an education campaign detailing the dangers and responses to smart city technology" to help "activists, lawmakers, and members of the public identify common risks from expanding municipal technology in policing, education, and transit."
S.T.O.P. has become a bigger and bigger player in the student privacy space over the past 3+ years, and this toolkit has the potential to be the default talking points used by activists and policymakers moving forward.