Users beware: Apps are using a loophole in privacy law to track kids' phones

Elva Etienne / Getty Images

Washington Post technology columnist Geoffrey Fowler says smartphones and apps are harvesting our personal data — and that of our kids — on a scale that would shock most users. By the time a child is 13, he writes, online advertising firms have collected an average of 72 million data points about that individual.

"Companies out there that you would never know the name of, that really have no relationship with the app that you were trying to use, could be, first of all, tracking your kid's interests, then trying to predict what they might want to buy, or sell their information on to others," Fowler says.

Fowler started out as a technology reviewer, looking at new gadgets and evaluating if they were worth their price. But, he says, as technology has become more mainstream, he began to see his role as reviewer differently: "It's no longer a question of, 'Is it too hard to use?' The question now has become, 'Is it evil?' 'Is it taking away our rights and our choices?' "

Fowler's ongoing series for the Post, "We the Users," attempts to answer these questions — and to raise awareness of how pervasive the issues are. He says the 1998 Children's Online Privacy Protection Act stipulates that a company has to have actual knowledge that a child is using the app or website in order for certain privacy protections to kick in. But many companies get around the law, simply by claiming that they don't know who their users are. Fowler advocates for closing this loophole, and for creating new laws that allow companies to collect only the data they need — and nothing more.

"I think that's what most consumers already assume is happening: that if you ask a website to show you a map, it's collecting your location just for that moment to give you directions," Fowler says. "The problem is that that's not what's happening. These companies are taking it as an opportunity to then collect your data all the time and do what they want to with it."

Fowler says while each individual case of data collection might seem insignificant in the moment, the larger picture is anything but: "Once it's collected, it's out of your control, it could be used in all kinds of ways that we can't even imagine yet."


Interview highlights

On how apps are spying on children on an alarming scale

I work with some researchers at a company called Pixalate, and they looked at this question on a really broad scale. So they tried to categorize all of the apps that exist that might be appealing to kids. And then they tracked what happened to the personal data that those apps were collecting: things like ways to identify the phone, the location of the phone. They found that more than two thirds of [the apps] on iPhones were sending this information off to the advertising industry. It was an even higher number — 79% — on Android phones. What shocked me about this is that we have a law in America that's supposed to protect the privacy of children — and yet this is happening.

On why age ratings on apps don't protect children from data collection

Apps in both the Apple and Google app stores all have to have an age rating on them, which is for how violent the content is, or how "adult" it might be in nature. So you'll see those ratings in the store. The problem here is that those ratings have nothing to do with whether or not those apps are collecting data about children. The law that we have in the United States, called COPPA, the Children's Online [Privacy] Protection Act, says pretty clearly that if someone is under 13, companies are not supposed to collect data about them without their parent's explicit permission. But the problem is that this giant industry of app developers, and also Apple and Google, who run these app stores and make billions of dollars off of it, have found some really big loopholes in that law. So they're doing it anyway.

On the loophole in the 1998 Children's Online Privacy Protection Act

Many of [the app developers] then just claim, "We don't know who's using our app. It could be adults." Or they'll say, "We're really not marketing this coloring app or this math homework assistance app to children. We're marketing it to adults." And Apple and Google, who run these app stores and are sort of the de facto police for them, let them get away with it.

Kids use all kinds of things. The app stores that they have available to them on their phones are just the same as the app stores that adults have. So they want to play the same games that we want to play. Oftentimes that's things like Angry Birds and Candy Crush. ... They want to do a lot of the same things we do. And these kinds of apps are all claiming that they are general audience apps, which means that they're made for adults rather than acknowledging that actually, kids are going to be interested in this stuff, too, so you ought to treat them differently.

On the responsibility Apple and Google have to stop this data collection from kids

Apple and Google are the de facto cops for apps in our world. They get to decide what goes in those stores, and they actually go to Washington all the time right now arguing that they alone should be allowed to run these stores, even though they're kind of like monopolies, because they alone can protect our privacy and our security and protect our kids. So if they're going to say that, then they really ought to force apps to tell the truth about whether children are potentially using the apps and if so, treat them differently.

On ideas for how to solve this problem

Another idea that came from, of all places, Instagram, which is one of the apps that we've been talking a lot about as causing problems for kids, is to have the phone itself know whether or not a kid is using it. So the idea is when a parent sets up an iPhone or an Android phone for their child, they enter in the age of the child. And so if that age is under 13, then it would send a signal out to apps that would say, "Hey, there is a kid here. Do not collect data unless you get parental permission." I think that would be really useful in a lot of different ways.
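To make that proposal concrete, here is a purely hypothetical Swift sketch of how an app might honor such an age signal if the platforms ever exposed one. No such API exists today; every name in it (ChildSafetySignal, isUnder13, hasParentalConsent, shouldCollectAnalytics) is invented for illustration.

```swift
// Hypothetical sketch only: neither iOS nor Android exposes an API like this today.
// Every type and property below is invented to illustrate the idea Fowler describes.
struct ChildSafetySignal {
    let isUnder13: Bool          // set when a parent enters the child's age at device setup
    let hasParentalConsent: Bool // whether the parent has approved data collection
}

func shouldCollectAnalytics(signal: ChildSafetySignal?) -> Bool {
    guard let signal = signal else {
        // No signal available: fall back to today's (weaker) self-reporting rules.
        return true
    }
    // If the device says a child is using it, collect nothing without parental permission.
    return !signal.isUnder13 || signal.hasParentalConsent
}
```

The point of the sketch is only that the decision would move from the app developer's self-declared "general audience" claim to a signal the parent sets once on the device itself.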

On what happens when you hit "ask app not to track" in the app settings

When you press the "ask app not to track" button, ... you're putting in a request not to track you, but you're not exactly shutting off the system that would be used to potentially track you. So what you're doing is you're stopping the app from using one particular kind of ID that exists on your phone. It's called the IDFA [Identifier for Advertisers]. And it was actually made by Apple, built into the iPhone a long time ago. And it's just a code that allows apps to know who you are across different apps. That's really helpful for advertisers, for example, who might want to show you the same ad for fancy underwear that you see in one app and then show it to you on another website or in another app.

So when you press the "ask app not to track" button, it says you can't grab that form of ID anymore, but it doesn't really do anything to stop all of the other kinds of data that can still be used to identify you, that apps might want to grab. ... It's better than nothing. So, yeah, take advantage of everything you can out there because it's, you know, it's a battle.
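For readers curious what this looks like on the developer side, here is a minimal Swift sketch of the mechanism behind that button: Apple's AppTrackingTransparency prompt, which gates access to the IDFA. The framework calls (ATTrackingManager, ASIdentifierManager) are real iOS APIs; the function name and the logging are illustrative only.

```swift
import AppTrackingTransparency
import AdSupport

/// Ask iOS to show the tracking prompt and report the result.
/// (The app must declare NSUserTrackingUsageDescription in its Info.plist
/// for the prompt to appear.)
func requestTrackingPermission() {
    ATTrackingManager.requestTrackingAuthorization { status in
        switch status {
        case .authorized:
            // The user allowed tracking: the cross-app advertising ID is readable.
            let idfa = ASIdentifierManager.shared().advertisingIdentifier
            print("Tracking allowed, IDFA: \(idfa.uuidString)")
        case .denied, .restricted:
            // The user chose "Ask App Not to Track" (or a restriction applies):
            // the IDFA reads back as all zeros and is useless for cross-app tracking.
            print("Tracking denied; IDFA is zeroed out")
        case .notDetermined:
            // The prompt has not been answered yet.
            print("Tracking authorization not determined")
        @unknown default:
            break
        }
    }
}
```

As Fowler notes, this switch only controls the IDFA; it does not, by itself, stop an app from collecting other signals that can still be used to identify a device.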

On Fowler reading every Terms & Conditions policy for every app on his phone

It was a million words. And just to give you a little context, that's about twice the length of War and Peace. There is no way that any normal functioning person is going to have time enough to read that even once, much less keep reading it as these companies tweak the language and update it all the time. It's just nuts. But unfortunately, it is the basis of how our privacy is supposed to be protected in the U.S. Our problem right now is that we're just overwhelmed by data collection and this model that's built into American law and the economy that we, the users, are somehow consenting to each and every one of these data uses. It's completely broken. In fact, it's really mean to us as consumers. It's not really fair. It puts the onus on us, puts it on our shoulders that if something happens with our data that we didn't like or something bad happens to our data, it's our fault for consenting all along. And I just think that's really, really broken.

On how data collection could be weaponized against women in a post-Roe v. Wade world

Your phone knows a lot about you, and so it would know if you were searching for information about where to get an abortion. It might know if you were at a clinic. It might know the history of your fertility cycle, because a lot of people use cycle tracking apps. All of this data could be used against you if you happen to be in a state where seeking an abortion becomes against the law. There is already some precedent for this: search histories and other information have been used to try to show that women were guilty of not caring for their fetuses or of causing the death of a baby. And the thing that I think people forget is any time that a company collects information about you, the government can get access to that information either by issuing a court order or, increasingly, just by buying it. We're talking about a giant industry of selling people's data, so increasingly, we are seeing the government go and do that to gather evidence and to try to prosecute crimes.

Amy Salit and Thea Chaloner produced and edited the audio of this interview. Bridget Bentz, Molly Seavy-Nesper and Natalie Escobar adapted it for the web.

Copyright 2022 Fresh Air. To see more, visit Fresh Air.

Dave Davies is a guest host for NPR's Fresh Air with Terry Gross.