For Facebook Content Moderators, Traumatizing Material Is A Job Hazard

TERRY GROSS, HOST:

This is FRESH AIR. I'm Terry Gross. In order to prevent you from being exposed to graphic violence, pornography and hate speech on your Facebook feed, there are thousands of people who are paid to view objectionable posts and decide which need to be removed. These content moderators are described by my guest, journalist Casey Newton, as first responders performing a critical function on a platform with billions of users. He's been investigating the working conditions of these content moderators. He's interviewed current and former employees. And for the first time, three former employees have broken their nondisclosure agreements and spoken on the record. Newton reports that repeated exposure to these traumatizing posts has left some content moderators with mental health problems. As we'll hear, there are other problems in the workplace, too.

Facebook contracts with other companies to hire the content moderators and provide office space. Originally, these sites were in other countries, but the operations have expanded to the U.S., including sites in California, Arizona, Texas and Florida. Later, we'll talk about how Facebook is preparing to launch what's been described as a Supreme Court to rule on contested content decisions. And we'll hear what Facebook's been doing in the lead-up to the 2020 election to try to protect users from disinformation and propaganda from Russia and other countries.

Casey Newton covers Silicon Valley for the tech site The Verge and writes the daily newsletter The Interface about the intersection of social media and democracy. Casey Newton, welcome to FRESH AIR. Give us a sense of what the content moderators have to look at in one typical day at work.

CASEY NEWTON: Sure. So they are sitting at a computer workstation. They click a button that indicates they're ready to review content, and then they see a steady stream of text posts, images, videos. And they can be absolutely anything. Much of it will be benign - just words on a screen that someone didn't like. But some of it will be incredibly disturbing, some of the worst things imaginable. So moderators have described to me seeing, on a regular basis, beheadings, murders, animal abuse, child exploitation. And often, it arrives without warning. And so over the course of a long day, you sort of never know when it might be that you see the thing that is just going to ruin your day and might haunt you for a long time after.

GROSS: So you spoke to a lot of content moderators. Some of them broke their nondisclosure agreements to speak with you. What did they tell you about the psychological and emotional impact of watching some of the brutality and pornography and reading the hate that they have to look at on a daily level to do their job?

NEWTON: Well, as you might imagine, it's just really hard for them. The folks who I've spoken to, the ones who have reached out to me, have told me that this is work that has continued to haunt them even months or years after they have left the job. There is something traumatizing about viewing these images. We're empathetic people. We see other people or we see animals suffering - we're going to absorb some of that.

And I've talked to some moderators in particular who were placed in queues of content where they were seeing much more graphic violence than other folks were. So, you know, maybe the average moderator might only see a handful of things a day that an average person would find really upsetting. I've talked to some people who are seeing dozens or even more than a hundred of those images. And as a result, I've talked to folks who will wake up in the middle of the night in a cold sweat. They will have nightmares about the content that they saw. And eventually, many of them get diagnosed with post-traumatic stress disorder, you know, almost a diagnosis that you might expect from somebody who had, you know, been in a war zone.

There are - there's one other effect that I would say caught me by surprise in my reporting, which is I've talked to moderators who said that once they looked at so many fringe views and conspiracy theories, they started to believe those views themselves. And so I've talked to moderators who worked in workplaces where managers will stalk the floor, kind of promoting the idea that the earth is flat. I talked to somebody who said that he no longer believed that 9/11 was a terrorist attack. And so this content - just kind of that repeated daily exposure - it can change the way that these moderators see the world in really unexpected ways.

GROSS: I'm really surprised to hear that because, you know, you'd almost think it would be the other way around because they're supposed to be tutored in how to spot a conspiracy theory. And you'd think that they'd want to debunk it and not start subscribing to it.

NEWTON: Sure. Although, in the training that these moderators received, they don't get trained on resiliency to fringe ideas, right? Like, what they get trained on is, should this stay up? Or should this come down? And they are generally not prepared for the idea that they're going to be subjected to a lot of fringe and conspiracy views.

I remember one sort of chilling story from the immediate aftermath of the Parkland shooting - which is right when Facebook was starting to kind of ramp up its use of moderators in America. Moderators were very upset by all of the videos that people were uploading of the violence. But then conspiracy sites started to create videos saying, well, that shooting never happened. These kids are crisis actors. And the discussion on the floor became, oh, well, gosh. I guess it didn't happen. I guess it was all a big hoax. And I actually am empathetic to those moderators because that was the only information they were getting about this shooting. They were spending eight hours a day at work, and every time they clicked, you know, to a new post, they were seeing something saying that the Parkland shooting was all made up.

So, you know, I think in ways that we're only beginning to understand, we really are shaped by our media environment and repeated exposure to the same idea. And so if all you're seeing is fringe views, I think, eventually, some of that is going to seep into your beliefs.

GROSS: Is there no attempt to put some of these posts in a larger social, political, gender, racial, ethnic, cultural context for the content moderators?

NEWTON: Well - so one of the ideas behind bringing these moderators to America was exactly that - that they would have a kind of natural context for understanding posts that would make them more adept at moderating content that had been created by Americans. You'd sort of know who the politicians are around here. Or what are the big social issues that people are discussing? But in practice, that happens less than you might expect. A lot of the folks who got hired to do these jobs don't have a secondary education. A lot of them struggle with even basic questions of citizenship, I would argue. So moderators will tell me that their colleagues don't know who Senator Ted Cruz is, for example. Or they don't know what a refugee is.

And so even though the idea of bringing these moderators to America was that they would have a better natural understanding of these issues, it turns out that they really don't. And also, Facebook doesn't provide a kind of ongoing cultural education for them to bring them up to speed, with one exception, and that's when a particular issue kind of goes viral on Facebook, and there's a sudden need to catch everybody up in real time. But, you know, trying to explain the entire culture to a global army of 15,000 people is a really tall order. And honestly, I just think Facebook hasn't even started to figure out how it might try to do that.

GROSS: So getting back to the, like, emotional and psychological impact of watching all these, like, violent and pornographic and hate-filled posts, you heard several stories about people who ended up buying guns for various reasons but for reasons that seemed related to the work. Can you give us an example?

NEWTON: Yeah. I remember talking to one man who himself had started to embrace some of these conspiracy views. He was the one who said that he no longer believed that 9/11 had happened. But he had a job at one of these sites that is known internally as a QA or a quality analyst. And it's the job of that person to be a check on the moderators. They'll sort of look at a subset of those moderators' decisions and kind of check through them and see if they're right or wrong.

And at these sites, accuracy is the single most important way that the moderators are evaluated. And if they miss even a handful of questions in a week, they can be put on a performance improvement plan. And they can eventually lose their job. So the moderators are under a great deal of stress to get everything right. And so this person, who I call Randy in my story, was a QA and would sometimes mark people wrong because he thought that they had made the wrong decision. And those folks would then come up to him in the parking lot, and they would threaten him and try to intimidate him into changing his ruling. And so it was after that that he started bringing a gun to work because he was afraid for his life.

Outside of that, moderators have told me at both of the sites that I visited that when moderators do get fired or sometimes before they get fired, they will create posts either on Facebook itself or in an internal forum in which they threaten the lives of their co-workers. And they say, you know, I'm thinking about coming back and shooting this place up. So there's just always this undercurrent of anxiety in these offices. And it does inspire a small number of people to start bringing guns to work to protect themselves.

GROSS: You know, when you mentioned the pressure to be accurate - and people are supposed to be 98% accurate - accuracy is what? It's making the right decision on whether to delete a post or not?

NEWTON: Yes. The literal way that Facebook defines accuracy is, did you make the right decision? Should you - you know, did you take down the things that should be taken down? And did you leave up what ought to be left up? But in practice, the question is really more about, did the QA agree with the moderator when they did the weekly audit? And this is where some moderators I've spoken with have said that the system itself is somewhat suspect because, you know, as you can imagine, many, many of these decisions involve making subjective judgments. There is no policy that can account for every imaginable variation.

And so you'll have a lot of really hard-working, well-meaning moderators making the decision that they think is right based on their, you know, mostly objective view. And then their boss will essentially come in and say, well, I don't think you're right. And it just leads to all kinds of anxiety. But what a lot of the moderators I've spoken with really want to get across is how subjective some of these decisions can be.

GROSS: Let's take a short break here. And then we'll talk some more. If you're just joining us, my guest is Casey Newton. And he covers Silicon Valley for the tech site The Verge. We'll be right back after a break. This is FRESH AIR.

(SOUNDBITE OF JOAN JEANRENAUD'S "AXIS")

GROSS: This is FRESH AIR. And if you're just joining us, my guest is Casey Newton. He covers Silicon Valley for the tech site The Verge and writes the daily newsletter The Interface about the intersection of social media and democracy. He's written two recent articles for The Verge, reporting on the working conditions of Facebook content moderators who purge the violent, pornographic and hate-speech posts from Facebook.

So we've been talking about, like, the emotional and psychological effects on the content moderators of seeing all these, like, you know, hateful and violent and pornographic videos. But the physical working conditions are difficult, too. Tell us about, like, what they're allowed in terms of, you know, breaks and lunches and things like that.

NEWTON: Yeah. To me, it is a huge part of this story because, as hard as the work itself is, I believe it's made exponentially harder by the physical environments of the job and the - sort of the way that it's structured. So, you know, in terms of breaks, they get two 15-minute breaks, a 30-minute lunch and nine minutes of something called wellness time.

And the idea of wellness time is that if you see something that really overwhelms you, you're going to want to probably get up and walk away from your desk. And so Facebook will have these contractors offer various services. Like, maybe there's a yoga class or a ping-pong table. They also have counselors who are onsite for at least some of the shifts that workers can talk to. But as you might imagine, nine minutes isn't often enough to discuss the full weight of the issues that you're dealing with. And so that nine-minute limit has confused a lot of the moderators that I've spoken with.

Maybe of equal concern, at least for some of the folks I've spoken with, is just the physical environment itself. And at least at these two sites I visited - one in Phoenix, one in Tampa - I've been told that they are very dirty, that the bathrooms are disgusting, that there's sort of human waste everywhere on a regular basis. People are finding bodily waste at their workstations because they sort of share workstations, and they're not always kept very clean. And so people coming to work will just kind of say that they don't like being in these offices because they don't feel like professional work environments.

People will tell me that they see fights, both physical and verbal, between associates. People will be smoking weed in the parking lot. People at every office I've talked to say that their colleagues have sex in the break rooms, in the rooms that are reserved for mothers to lactate. And so it's just a very chaotic environment. And, you know, you're trying to make hundreds of these often very nuanced decisions a day about content moderation. But just something as simple as, am I going to have a clean bathroom to use? - really weighs on these moderators' minds. And so all of it adds up to a situation that feels untenable for some of the moderators I've spoken with.

GROSS: What about the pay people receive?

NEWTON: So in the United States, a moderator will be paid $15 an hour, which is $28,800 a year. That compares with a median pay package at Facebook that last year was $240,000 if you include salary, stock and bonuses. So there is a huge discrepancy there. But I think that tells you the value that Facebook places on this work, relatively speaking.

GROSS: Now, the conditions that you've described and the salary you've described - really, it doesn't sound like what people expect Facebook employees to face. But these people, the content moderators, aren't literally Facebook employees because Facebook contracts this work to other companies. Can you describe the arrangement Facebook has with the companies that actually hire and house the content moderators?

NEWTON: Yes. So the model for this work is a call center, right? So let's say maybe you, you know, buy an appliance somewhere, and you're having a delivery issue, and you need to call someone to get it straightened out; the call goes to a call center. The exact same model is used for Facebook content moderation. Call center work in the United States is typically low-wage work. You get a bunch of bodies. You put them in a big office. And your goal is to just keep them as busy as possible, answering as many tickets as come in as possible.

One of the things I've tried to highlight in my stories is that while we pay these folks as if the work is low-skill labor, in many cases, in my opinion, it is very high-skilled labor because they're making these very nuanced judgments about the boundaries of speech on the Internet. So if you accept that Facebook and Instagram, these big services that these folks are moderating, represent an essential component of political speech in this day and age, my guess is you might want them to be paid more than $28,000 a year.

GROSS: And are they full-time employees or are they, like, part-time?

NEWTON: They are full-time employees, and Facebook will take pains to tell you that it treats them far better than it treats the average employee of the call center industry. So they get full health benefits, for example. They get mental health benefits that go beyond what you would see at a call center. So there is, like, a 24-hour hotline that folks can call. So it definitely does provide more resources to employees than you would find in the average call center. But one of the points that I've tried to raise in my reporting is just maybe we should not be using a call center model for this kind of work.

GROSS: What did Facebook and the subcontractor who runs two of the sites in America, the two that you visited - and that company's name is Cognizant - what did they have to say in response to your report on working conditions for the content moderators?

NEWTON: So when I wrote the first story, Facebook did a big internal blog post where they sort of outlined changes that they were going to make in the future. And I want to highlight a few of them because I actually think they're positive and important, and some of them are somewhat theoretical (laughter). But the first one is, Facebook has said it's going to give a $3 an hour raise to these contractors, so they would make a minimum of $18 an hour, which I think comes out to more like $36,000 a year. That's a really good start. I don't think it's enough. I think it's a good start.

Two - they said they were going to do a better job screening their candidates going forward. This gets tricky with employment law. But Facebook believes there is probably some way to inquire about people's mental health or willingness to do a job that might prevent them from hiring folks who are more likely to get PTSD from doing this kind of work.

And then the third and final thing - which, you know, is sort of in the earliest discussions but I think would be so important - is after people leave this job, either because they're fired or because they quit, somehow making available for them some kind of counseling so that there will be someone to talk to, and Facebook would cover the cost. You know, if you get PTSD looking at Facebook, I think there is, you know, some thought that maybe Facebook should be the one paying for your counselor. So that has kind of been the word from Facebook on how they would like to handle these things in the future.

At the earliest, though, that pay increase, it's not expected to go into effect until next summer. So we have a long way to go, right? There is sort of a year to go before that one major change is going to take place. When it comes to Cognizant, I think they've sort of presented the stories that I've found as, well, like, what can you do? We've got a lot of people working in an office. Things are going to break. Bad things are going to happen. And we try to take care of them, you know, when we find out about them. But the workers that, you know, continue to text me and email me every day tell me that they have not found that response sufficient.

GROSS: My guest is Casey Newton. He covers Silicon Valley for the tech site The Verge. After a break, we'll talk about the so-called Supreme Court that Facebook is trying to create for appealing controversial content decisions, and we'll discuss what Facebook has been doing to prepare for the 2020 election. And Ken Tucker will review country hits with a hip-hop sensibility by Lil Nas X and Blanco Brown. I'm Terry Gross, and this is FRESH AIR.

(SOUNDBITE OF DOMINIC MILLER'S "CHAOS THEORY")

GROSS: This is FRESH AIR. I'm Terry Gross. Let's get back to my interview with Casey Newton about Facebook and its content moderators. Those are the people who view posts that have been flagged as objectionable, including violent videos, pornography and hate speech, and decide which posts to remove. Newton has written two articles reporting on the content moderators' working conditions and the psychological impact of being repeatedly exposed to traumatizing content. Facebook contracts with other companies to employ, supervise and provide the office space for the moderators. These sites are located in several other countries as well as in the U.S. Casey Newton covers Silicon Valley for the tech site The Verge and writes the newsletter The Interface about the intersection of social media and democracy.

Facebook has had, I think, a big internal debate, and there's been a big national debate or international debate around Facebook about whether it's more like, say, the phone company that is a communications network that doesn't monitor the signals that are sent across it - it's not the job of any of the phone companies to listen into what you're saying and decide whether it passes certain standards or not - or whether Facebook is more like a newspaper or a magazine, a content provider where Facebook has some responsibility for what is posted on the site that it owns and profits from. So has Facebook been shifting in its perception of what it is and what its responsibility is to its users in terms of the content that Facebook is exposing them to?

NEWTON: I think so. You know, there's a - there are a lot of journalists over the years who have pressed Facebook to admit it's a media company rather than a technology company, which I think kind of gets to the heart of the debate that you just described. And Mark Zuckerberg, I think, kind of ended that argument when he said that he actually believes Facebook has more responsibilities than a media company, or it has more responsibilities than, you know, a newspaper or a magazine precisely because of its reach.

And so that's why, starting in 2017, they announced that they would hire tens of thousands more people to work on safety and security. So there are now 30,000 people at Facebook who work on safety and security, and about half of those are content moderators. So I do think that Facebook is taking its role in public affairs much more seriously than it was, say, five years ago.

GROSS: Was there a turning point, like a specific post or posts, that led them to decide, no, we need more content moderators; we need to take more responsibility for what is on Facebook?

NEWTON: Absolutely. It was the - the aftermath of the 2016 election completely changed public perception of Facebook, and in time, it changed Facebook's conception of itself. So in the 2016 election, Russians attacked us, right? They attacked our democracy. They used Facebook to spread various inflammatory ideas, and Facebook was caught flat-footed. They didn't realize there were so many people creating fake accounts. They didn't realize that those fake accounts were creating fake events that real Americans were actually showing up to. And they didn't realize how fake news and hoaxes were spreading on the site and, in many cases, receiving more views than legitimate, credible information.

And so you just had this maelstrom, and Mark Zuckerberg and Facebook were under enormous pressure to do something about it. And so the story of Facebook since the end of 2016 has been making up for all of the failures that led up to the 2016 presidential election and trying to ensure that a similar set of circumstances doesn't unfold again.

GROSS: So Facebook is in the process of creating a content oversight board that's being described by a lot of people as, like, a Facebook Supreme Court for final judgment calls on content decisions and whether content should be purged or left on the site. And last Thursday, Facebook released a report summarizing the input that it got from a series of public listening meetings around the world.

So describe what we know about what Facebook thinks the role of the content oversight board will be once it's created. And I know this is a work in progress, and Facebook doesn't really know yet. And it's still processing all the feedback and input it's gotten, but what do we know so far?

NEWTON: Yeah, I mean, I think the first thing to say is just what an absolutely wild idea this is, right? So you have this private company that hosts a bunch of user posts. And they have decided that they're going to create some kind of body that is independent from the private company, and that body will make the decision about what can stay on the site and what needs to come down.

And so to kind of imagine what this might look like, you may recall a few weeks back there was a big controversy because a video had gone viral on Facebook that appeared to show House Speaker Nancy Pelosi slurring her words. She looks sort of drunk. And it turned out that somebody had just altered the video to make her look drunk when she was not. And so then there was this ferocious debate about whether Facebook should leave the video up on the site, whether they should take it down or whether they should, you know, take some other measures. Ultimately they decided that they would leave it on the site but sort of label it as false.

In the future, this oversight board that Facebook is now creating would be the one that would make that decision. And so the question has been, how do you set up such a board? And the result has been kind of this pseudo-constitutional convention that has been happening around the world for the past several months. And I think it's really fascinating.

GROSS: And last week at the Aspen Ideas Festival, Mark Zuckerberg was interviewed by Cass Sunstein. And Zuckerberg said in the interview that he was hoping more of the industry would join in this project.

NEWTON: Yeah. And I'm actually sympathetic to that because one problem with our tech companies being so large now is that while they have increasing amounts of power over our lives, they are not really accountable to us in any way. And so think about a similar situation on YouTube, where I post a video; it gets taken down; maybe my channel gets removed, and maybe that channel was actually my livelihood. There are thousands of people around the world, if not tens of thousands, for whom this is the case.

Well, if your channel gets deleted, you have no Supreme Court that you can appeal to. There is no system of jurisprudence. There is no case law. There is no precedent. And so everything just winds up being this ad hoc decision. And so as a result, you have this enormous frustration in that community. There are, you know, constant death threats against YouTube executives.

My hope, if I want to be an optimist about all of this - it's that by devolving some of the power that these tech companies have back to the people, that these systems will be more accountable to their everyday users, that they'll sort of seem legitimate and that we can bring some semblance of democracy to what, you know, will always remain private companies.

GROSS: So Facebook does have a booklet of community standards that's available for the public to read. I have never read it. I imagine most Facebook users haven't read it, but I imagine you have. Can you give a sense of, like, what some of the standards are in guiding people on, like, what is too violent? Like, what is too sexual? What is too hateful or crosses into hate speech and has to be taken down?

NEWTON: Well, I think nudity offers a useful place to start. Often these guidelines that get written by companies start as a single page, and somebody at the company - typically not the most senior executive - will just write down a list of things that they do not think should be on their network, right? And it'll usually start with, like, no nudity. And then as the network goes along, a mother will post a photo of herself breastfeeding, and then the company will have to decide whether it wants to continue to host breastfeeding content. And then somebody will say, well, you know, we decided to permit breastfeeding but maybe not in this particular case, right? Or maybe we don't think that a nipple should be visible at this distance.

And so over time, these guidelines just grow and grow and grow. And also over time, as a society, our expectations change, and things that we once thought should never be public all of a sudden we become OK with. So it's a never-ending series of infinitely branching debates, and I think it speaks to the - again, to the difficulty of the job and moderating it but also of writing the rules in the first place, right?

You know, traditionally we have had systems of law to make these decisions about what is allowed and not allowed. You know, in the United States, obviously we have free speech protections. Facebook's, I think, content moderation guidelines were written in the spirit of the First Amendment to maximize the amount of speech. But it's a global network, and most countries do not have the First Amendment. And so it's just been a really chaotic time for the company as it tries to apply this one global standard in places that are not always very receptive to it.

GROSS: Can you think of an example when the rules changed, like, in the course of one day about what was acceptable and what wasn't for posting?

NEWTON: So there was this phenomenon that Facebook was seeing of young girls saying to each other something like, I love this ho, short for hooker. And Facebook's guidelines have traditionally said that you can't call somebody a hooker, but then they realized after getting a lot of angry feedback that these girls were actually speaking positively of their friends and that somehow what had been an insult was now a compliment. And so one day a switch was flipped, and all of a sudden, it was now OK to say, I love you, ho.

GROSS: Was it context - like, did you have to figure out what the context was if you were a content moderator, whether this was a term of affection or whether it was intended as an insult?

NEWTON: That's exactly right. And so, you know, think about this. You're a moderator. You don't know these people. You're looking at one sentence and maybe a couple of comments underneath it, and now it's your job to decide whether, you know, Suzy (ph) likes Jenny (ph) or not.

GROSS: Right (laughter).

NEWTON: It's very difficult.

GROSS: (Laughter) OK. So an example that I read about in terms of content moderation is that the - a post that said autistic people should be sterilized was not taken down because autistic people aren't a protected group. What does that mean?

NEWTON: So Facebook uses a standard similar to a U.S. legal standard where certain classes of people are protected. So, you know, race is a great example - a classic example of a protected class. You're not allowed to say, you know, all members of this race are bad. Autism, though, was not considered a protected class, and so if you wanted to say something that most of us would find really offensive like all autistic people should be sterilized, Facebook would just leave that up in the name of free speech. Interestingly enough, a moderator texted me about that two days ago and said that autism is now considered a protected characteristic, and so you can no longer say that.

GROSS: Let's take a short break here, and then we'll talk some more. If you're just joining us, my guest is Casey Newton. He covers Silicon Valley for the tech site The Verge. We'll be right back. This is FRESH AIR.

(SOUNDBITE OF MUSIC)

GROSS: This is FRESH AIR, and if you're just joining us, my guest is Casey Newton. He covers Silicon Valley for the tech site The Verge, and he writes the daily newsletter The Interface about the intersection of social media and democracy. He's written two recent articles for The Verge reporting on the working conditions of Facebook's content moderators. These are the people who purge the violent, pornographic and hate-speech posts.

Social media, including Facebook, was used as a tool of the Russians to subvert our 2016 presidential election, and many Americans are very concerned about what the Russians or the Chinese or the Iranians or other actors might do to try to use social media to subvert the next presidential election. What is Facebook doing to prepare for the election?

NEWTON: So it turns out that one pretty good, easy thing that you can do to reduce the risk of that kind of thing happening is to catch fake accounts. So Facebook has this ironclad rule that every account - you know, you're only allowed to have one account, and it has to be your real name, and that's that - no exceptions.

And so if you're, say, a Russian agent, you probably are not going to be able to create a bunch of posts that go viral in America criticizing an American political candidate because it will just sort of be clear that, you know, you don't belong in this political discussion. But if you can create an account that says, you know, Joe Smith and you live on Main Street, USA, maybe you can create a page that gets a million followers and all of a sudden is able to start posting a lot of inflammatory memes and rhetoric. So that's a major area where Facebook has invested - is just cracking down on fake accounts. They now say that they detect millions of them every week. And of course it's a cat-and-mouse game where the Russians are always getting better, but that is one place where they've been able to focus.

They've also placed a lot of new requirements around ads. So for example, there is now an archive that you can visit to see every single ad that a political candidate bought or even just a political ad that was created by a political action committee. You know, in the 2016 election, we didn't really know what ads people were seeing because the ads could be targeted to, you know, down to the level of one person, and it would just sort of never be publicly searchable. That's not the case anymore. Now you can actually dive in. Researchers are, you know, putting together collections of these ads and studying them in real time. So those are kind of two things that I would highlight as ways that Facebook is trying to detect these, you know, networks of bad actors and disrupt them.

GROSS: So, you know, one of the things social media is trying to contend with now is extremists. For example, Facebook initially banned Alex Jones' network from the site, and then it banned Alex Jones himself. What are the guidelines that Facebook uses to decide who's too extreme to post?

NEWTON: It's a great question and one where the answer has changed pretty significantly over the past year. Last summer was when attention really started to turn on this question of Alex Jones and how he was using social networks not just to reach an audience but to build an audience because all of these networks have recommendation algorithms and search algorithms that can sort of point more and more people at you.

And I think Facebook started to look at the effect that Alex Jones was having on the world. Sort of most famously, there was a family of a victim of the Sandy Hook shooting who had moved seven times because Alex Jones' followers had been moved by him to harass the family of the dead little boy because Alex Jones had released a number of videos in which he suggested that the Sandy Hook shooting had all been faked and was a pretext for the government to take away people's guns.

And so at a certain point, Facebook has to ask itself, why do these rules exist, and why is it that the rules right now seem to be doing a better job of protecting Alex Jones' right to speak his mind than they do the victim of a shooting, you know, whose family can no longer regularly visit their son's grave? And I think as that thinking started to reverberate through the social network, they made the decision that Alex Jones was what they call a hate figure, which means that, you know, not only is he not allowed on the site, but no one can post a statement or a video in which they say, Alex Jones is great, or, you know, sort of express support for his ideas, you know?

But what I think you're touching on is that these are really subjective calls. These are political calls. These are calls that a lot of folks are going to disagree with, and so I think we're really kind of having one of the great reckonings over free speech globally that we've had in a long time. And there isn't one great answer. It's always a question of, how are you going to manage all of the tradeoffs?

GROSS: Since you cover Silicon Valley and social media on a regular basis and you have a daily newsletter, what are some of the things you're looking to in the near future, like issues that are emerging now or changes that you expect will emerge very soon?

NEWTON: I think we are having a reckoning over the Internet globally because tech companies have gotten so big and so consequential. And so we are reckoning with this power, and you see it in a hundred different ways, whether it's the debates that we're having about facial recognition technology or deepfakes or independent oversight boards. I think you're just seeing this sense that there is not a system of justice that can adequately protect us from how rapid technology is changing the world around us.

And so what I am trying to do is just understand that reckoning a little bit better, every day try to anticipate some of the changes, try to help to frame some of those debates. You know, I don't think you can really overstate how dramatically the Internet has reshaped our lives already but how much it's going to reshape them in the future. And so, you know, I will never have a shortage of things to write about.

GROSS: Casey Newton, thank you so much for your reporting, and thank you so much for talking with us.

NEWTON: Oh, Terry, it was an absolute pleasure.

GROSS: Casey Newton covers Silicon Valley for the tech site The Verge and writes the newsletter The Interface about the intersection of social media and democracy. After we take a short break, Ken Tucker will review country hits with a hip-hop sensibility by Lil Nas X and Blanco Brown. This is FRESH AIR.

(SOUNDBITE OF CALEXICO'S "PRASKOVIA")

Transcript provided by NPR, Copyright NPR.
