Are social media giants doing enough to prevent the spread of misinformation?

JUDY WOODRUFF: On an average day, over 125
million people use Twitter. An estimated 2.3 billion use Facebook. We know these remarkable communication tools
are also used by a growing number of people as their main sources for news and information. But, as William Brangham reports, a new book
shows us how social media platforms and apps can be harnessed to spread some very dark
ideas very quickly. It’s the latest in our “NewsHour” Bookshelf.

WILLIAM BRANGHAM: The creators of online platforms
like Facebook and Twitter and Reddit all described themselves at first as having one overarching
goal, creating a space for freewheeling, open connections to friends and ideas from all
over the Internet. And in the process, these Silicon Valley entrepreneurs
built some of the most powerful tools for spreading information that the world has ever
seen. But in his new book, “New Yorker” writer Andrew
Marantz shows us how these techno-utopians, as he calls them, built these platforms full
of unforeseen vulnerabilities, and how a group of racists and vandals have used those vulnerabilities
to — quote — “throw the whole information ecosystem into chaos.” The book is called “Antisocial: Online Extremists,
Techno-Utopians, and the Hijacking of the American Conversation.” And Andrew Marantz joins me now. Welcome to the “NewsHour.”

ANDREW MARANTZ, Author, “Antisocial: Online Extremists, Techno-Utopians, and the Hijacking of the American Conversation”: Thanks. Thanks for having me.

WILLIAM BRANGHAM: Let’s talk about these platforms
at first, the Twitters and Facebook and Reddits of the world. What are those vulnerabilities that you document,
that you say have been hijacked by these other groups?

ANDREW MARANTZ: Well, the biggest vulnerability
is also one of the biggest strengths of these platforms, which is their openness, their
tolerance of all points of view. I make an analogy to a big party. When you throw open the doors to a party that
you’re hosting, one way to keep it fun and exciting and novel is to not be overly micromanaging,
not police everything, just sort of let it fly.

WILLIAM BRANGHAM: Turn the music down. Don’t drink that. Don’t smoke here.

ANDREW MARANTZ: Yes, that’s not a good party. You want to sort of let everything fly. And you also want to be pure and ideologically
consistent as the host of a party. And the easiest way to be consistent is to
basically do nothing. And so a lot of these platforms started out
as techno-libertarian, techno-utopian, sort of just saying, we’re not going to police
anything anybody does. If we get told of specific lawbreaking, then
maybe we will take that under control, but anything else, we’re just sort of going to
let it ride. And the reason that I call them techno-utopians
is, there was this built-in assumption, sometimes implicit, sometimes explicit, that that would
ultimately redound to the good, that the arc of history would naturally automatically bend
toward justice, the more speech, the better. And in some cases, that was true. There were lots of useful social movements
that were sparked and helped along by social media. But there was also an antisocial side, to
quote the title, along with the pro-social side. And there was just this halo effect where,
for the first 10 years or so, people didn’t seem to talk about the antisocial side of
the social media atmosphere very much.

WILLIAM BRANGHAM: Or even acknowledge that it existed.

ANDREW MARANTZ: Right. And, all of a sudden, our blindness to that just sort of came crashing down upon us.

WILLIAM BRANGHAM: You spend a good deal of
the book, the bulk of the book, really, with the members of the so-called alt-right, this
loose conglomeration of racists and anti-Semites and misogynists, some more so, some less so. How is it and what is it that they did with these platforms that is so troubling to you?

ANDREW MARANTZ: Well, so I didn’t go into
this looking for the worst people on the Internet. I ended up finding them. But I didn’t go there looking for that. I was looking for an example of, what’s the
worst that could happen? And it started getting non-hypothetical really
quickly when I started looking for open racists, open misogynists, people who were expert propagandists. I pretty much found whatever I was looking
for. And we’re focused, as I think we should be,
on what the Russians did to meddle in the 2016 election, what they and the Iranians
and the Chinese and others might do in the 2020 election. I’m — I think, as I say, it’s to the good
that we focus on that. But, in the 2016 election, there were Americans
who were not anonymous, who you didn’t need a subpoena to go find, who were meddling in
our election way more than the Russians were. And when I went to go ask them how they did
it, they showed me. They let me just sit in their living rooms
and watch as they did it. For instance, there was one guy in Orange
County, California, who just sort of invited me in and said, OK, pull up a chair. Today, we’re going to start a rumor about Hillary Clinton.

WILLIAM BRANGHAM: This is Mike Cernovich.

ANDREW MARANTZ: Right. And so, multiple times a day, he would say,
I want people to think Hillary Clinton has some mysterious disease that she’s not talking
about, or I want to talk about her e-mails, or whatever the case may be. And he could just inject that into the news
stream by starting a Periscope, getting a hashtag trending on Twitter. He had broken down the step-by-step way that
you infiltrate the news cycle, basically, to the point that I could then pick up the
newspaper the next day and go, that story is in the newspaper because of what I watched
this one guy do by rallying his fans on Twitter the day before. And that’s freedom. That’s democracy. But it’s also — I mean, as you will see in
the book, that is not a guy whose fingerprints you want on the national discourse.

WILLIAM BRANGHAM: We, as you document, have
always had fringe characters in American politics. Is it your sense that these platforms amplify
those voices and simply give us a better look at them? Or is it actually creating more of them? Is it enlisting new soldiers in their fight?

ANDREW MARANTZ: Yes, so the platforms do change
things. I think, sometimes, the platforms take refuge
in this idea that — the true idea that there has always been racism and bigotry and misogyny. But what they’re leaving out is that when
you incentivize shock and fear and disgust and all these emotions, when you…

WILLIAM BRANGHAM: Quite literally in the algorithm.

ANDREW MARANTZ: Yes. When you incentivize it, when you create literal
points, as if you’re playing a video game, and the more salacious words you say, the
more points you get, the playing field has been tilted by these algorithms. There’s no pure neutrality when you build
a tool, especially when you build a tool that then becomes so hugely, revolutionarily important
to how people communicate and how people think. And I think the informational crisis is kind
of as big a deal as the climate crisis or the city infrastructure crisis, because if
we don’t know how to think and talk and learn how to arm ourselves with information, we
can’t then address any of those other crises that we’re facing.

WILLIAM BRANGHAM: Facebook in particular, but also Twitter and Reddit and many of these other platforms, have said, OK, we get it. We get it that there’s a problem. We’re trying to moderate. We’re trying to police this better. What do you make of their efforts? And do you think they’re doing enough?

ANDREW MARANTZ: I think it’s better that they’re
doing something than nothing. For a long time, they were essentially not
doing any of this. They’re not doing enough yet. And they — they need to be pushed to do a
lot more. When you’re someone like Mark Zuckerberg,
who has built your entire adult life and career and fortune on the idea that, just by virtue
of doing more of what you’re doing, you will make the world a better place, it’s an article
of faith at this point. It seems like there’s almost nothing that
can dislodge that belief, which is really, really dangerous, because these tools, there
are massive, massive harms that are being propagated on these tools every day, I mean,
sparking genocides in various parts of the world, I mean, real, tangible harms. And if we can’t even acknowledge those harms
without being told that we don’t respect freedom…

WILLIAM BRANGHAM: You’re some kind of a Luddite.

ANDREW MARANTZ: Yes, you’re a Luddite, or
you don’t respect freedom of speech. It’s just not a good argument. I think it’s — freedom of speech is very
important. I’m a journalist. I love the First Amendment. But Facebook has a lot of responsibilities
and rights to curb this stuff. They have the resources to do it. And at this point, they’re sort of just using
it as an excuse to not do it.

WILLIAM BRANGHAM: The book is “Antisocial: Online Extremists, Techno-Utopians, and the Hijacking of the American Conversation.” Andrew Marantz, thank you very much.

ANDREW MARANTZ: Thank you.
