How to Stop Big Tech From Surveilling Kids: Activist Josh Golin on Child Privacy


Reporter Karina Montoya asks a leading privacy activist about Big Tech putting children at risk online.

This interview is part of Open Markets’ Clearly Speaking series.


The COVID-19 pandemic forced classrooms to shift from in-person environments to online platforms. Although an inevitable change given the circumstances, remote learning has also brought dire consequences for the privacy of kids’ personal data. Just like adults’ data, children’s data is up for grabs, used as fuel for the business of surveillance advertising.

A global Human Rights Watch study showed Big Tech platforms have used ed tech to amass kids’ personal data at a “dizzying scale.” That commercialization of pre-teen data, collected without parental consent, breaches U.S. law.

To better understand efforts to protect children’s privacy rights, Open Markets talked to Josh Golin, executive director of Fairplay, a nonprofit that strives to protect kids from marketing. In recent years, Golin has focused on empowering parents and schools to hold corporations to account for misuse of ed tech data, while pushing for stronger regulation, including updates to the 1998 Children’s Online Privacy Protection Act (COPPA), which the Federal Trade Commission recently vowed to enforce more vigorously.

Golin’s comments have been edited for brevity and clarity. 

How Big Tech platforms violate the Children’s Online Privacy Protection Act when surveilling children’s activities for commercial purposes:

Around 2014 we [at Fairplay] started focusing more on digital marketing to children, the data collected from them, and the Children’s Online Privacy Protection Act. COPPA says that you cannot collect data from children under the age of 13 unless you get verifiable parental consent. That doesn’t happen most of the time. The biggest studies looking at COPPA, many of them by [UC Berkeley researcher] Serge Egelman, have found that more than 50% of children’s apps send data to third parties – frequently advertisers – without parental permission. That suggests the problem is massive, that it has become the industry standard to ignore COPPA.

About a quarter century of FTC enforcement of COPPA: 

This problem suggests that the FTC hasn’t been good at enforcing COPPA. But I think it’s important to separate what the FTC has done under this new commission with Chair [Lina] Khan from what it did historically. Historically, it brought very few COPPA cases: about one case a year since the law was passed in 1998 (counting from 2000 would be fairer, because that is when the FTC wrote the first COPPA rule). Frequently, the cases were against small players. The first time they went after a big player was in 2019, with a complaint against Google and YouTube.

On the origin and consequences of the COPPA violation case brought against Google and YouTube: 

We, along with the Center for Digital Democracy, argued that despite YouTube’s terms of service, which say you have to be 13 or over, Google knows YouTube is the No. 1 site for children. YouTube collects a ton of data about all its users, and since there’s no parental permission mechanism, they must be violating COPPA. The FTC ultimately agreed with us. They were helped by emails obtained by the New York attorney general in which [Google] bragged to advertisers that [YouTube] was a way to reach kids. The fine was not so big, though: $170 million is easy for Google to write off.

As part of the settlement, the remedy was that creators now have to declare that their videos are child-directed, so that YouTube does not collect personal information or do cross-platform tracking. This is the most significant reduction in behavioral advertising to children there has ever been.

But there were two problems: First, COPPA only applies if the platform has knowledge that the user engaging with its content is under 13. But most kids aren’t engaging with “child-directed content.” Second, creators have an incentive to lie. As a creator, you make more money if Google sells behavioral ads rather than contextual ads. I have been told by somebody who worked for a very popular creator that the second the settlement went into effect, they were given strict instructions to say that they are not a “children’s channel.” Instead, they are a “family channel” now.

On the limitations of COPPA’s consent model to protect kids’ data privacy: 

Schools are allowed to consent on behalf of parents if the data platforms collect is used only for an educational purpose and not a commercial one. Unfortunately, neither the FTC nor the Department of Education has ever clarified what a commercial purpose is. It’s clear that you can’t collect data from school kids and give it to an advertiser. But where it gets tricky is: What about efforts to improve your product? What about ways to test your products and create new ones? I would argue those are all commercial purposes. If it doesn’t immediately benefit my own child’s education in that moment, then the school does not have the right to consent on my behalf.

On overhauling COPPA to stop Big Tech from profiting off kids’ personal data:

First, the law needs to give protections to teens. In no other legal context [do] we consider a 13-year-old a full-fledged adult. Second, it should apply to all sites, not only child-directed sites, because child-directed sites are not where kids spend most of their time. Third, we need to get rid of the “knowledge standard.” Currently, if a site is not child-directed, COPPA applies only if the site “knows” there are users under 13 years old. But this has incentivized platforms to pretend they don’t know. In the U.K., [stricter data privacy protection] applies to sites “likely to be accessed” by children and teens.

Some of the COPPA update proposals in Congress now would get at some of these issues. Kathy Castor’s bill in the House bans advertising [online] to anybody under the age of 18, extends all sorts of privacy rights to teens, and requires platforms that are likely to be accessed by children and teens to do a risk assessment of their data processing to see how it’s impacting kids. The Kids Online Safety Act in the Senate, although it is not a privacy bill, requires platforms to work with independent auditors to evaluate the impact of their algorithms on children and teens.


For more perspective on surveillance advertising and the pressures facing publishers, don’t miss the insights from other recent conversations here.