Zuckerberg's views on fake news and Trump's election, annotated
Facebook's News Feed took fire throughout the presidential election for surfacing fake news stories. Since the election, critics have even suggested that fake news on Facebook led to Trump's victory.
During a q-and-a at the Techonomy conference with David Kirkpatrick, the conference's host and the author of a book on Facebook, Zuckerberg addressed this question specifically.
Zuckerberg rebutted the notion that Facebook influenced the election, calling it "pretty out there" and "a pretty crazy idea."
The full one-hour q-and-a is available at the bottom, with most of the first part dedicated to News Feed and what Zuckerberg feels it means for the media and democracy. Below we have transcribed and annotated Zuckerberg's answers on fake news.
You know, one of the things post-election: you have been getting a lot of pushback from people who feel that you distorted the way people perceived information during the course of the campaign, either because of a filter bubble effect or because you didn't filter out enough fake stories, stories that might have been published simply to gain traction and sell advertising, and that might have distorted the information people perceived.
And one of the things that I always wondered — I'd love for you to talk about both of those issues — but you once said to me that Facebook might turn out to be the most transparent company in history, which I don't know if you remember that. But I was so excited when you said it because I thought that was such a great idea because I always believed that your company would have inordinate influence in the world partly because you are such a good leader and because it was such a good idea. And obviously it has continued to be successful.
But what goes on behind the covers is not very well understood, so could we come to understand it better so that people would not be so suspicious about things like those I mentioned?
So when it comes to News Feed ranking, I actually think we're very transparent. Every time we add a new signal or make a change, we publish that. And we explain why we're doing it and what signal we're adding and we bring people in to talk to them about it. So that stuff is out there — and we'll continue to do that. That's a big part of what we do and we take that seriously.
I've seen some of the stories that you are talking about around this election. Personally, I think the idea that fake news on Facebook, which is a very small amount of the content, influenced the election in any way is a pretty crazy idea.
You know, voters make decisions based on their lived experiences. One part of this that I think is important is: We really believe in people. And that they can... You don't generally go wrong when you trust that people understand what they care about and what's important to them, and you build systems that reflect that.
Part of what I think is going on here is that people are trying to understand the result of the election. But I do think that there is a certain profound lack of empathy in asserting that the only reason someone could have voted the way they did is because they saw some fake news.
Right, if you believe that, then I don't think you have internalized the message that Trump supporters are trying to send in this election.
Which is what, what would you say that...
Well, let me finish on this for a second.
The quickest way to refute the claim that this had an impact is: Why would you think there would be fake news on one side but not the other? We know, we study this, we know that it's a very small volume of anything. Hoaxes aren't new on Facebook. There were hoaxes on the internet, and there were hoaxes before that.
And we do our best to make sure that people can report that, so that we can, as I said before, show people the most meaningful content we can. But you know, I think the idea that that had any impact on the election is pretty out there.
I actually don't disagree with that, although it's interesting that several of the tech journalists who happen to be here today specifically mentioned it as something that's of concern to them, so I think they'll be glad you commented on it.
But what about the idea of a filter bubble, which I think is a bigger concept, and one people talk about all the time. I mean, I am something of a Facebook expert, as you know, and I am often talking to people about Facebook, and I can't tell you how often they want to talk about this idea they believe passionately: that somehow it is distorting the way the world works.
Yeah. We've studied this a lot. As you can imagine, I really care about this. I want what we do to have a good impact on the world, and I want people to have a diversity of information. This is why we study this stuff, to make sure we're having that positive impact.
All the research that we have suggests that this isn't really a problem — and I can go into that in a second. For whatever reason, we've had a really hard time getting that out.
Here is the historical analogy that I think is useful for this. If you go back 20 years and look at the media landscape, there were a few major TV networks, and in any given local area there were a few major newspapers, each with an editorial opinion. And you basically got all your news filtered through those.
The opinions that you received from the media...
Yeah, through that. Now, regardless of what leaning you have on Facebook politically, or what your background is, all the research would show that almost everyone has some friends who are on the other side. So even if 90 percent of your friends — even if you're a Democrat and 90 percent of your friends are Democrats, you probably have 10 percent of your friends who are Republicans —
That's what your research has found?
Yes, absolutely. Even if you live in some state or some country, you're going to know some people who live in another state or in another city or in another country...
That's encouraging, because a lot of people I know say "I don't know anybody who supports the other person." I happen to, so it's nice. But, I'm glad you said that. Keep going.
I mean I think they probably do. Whether that person is talking about it is a different situation.
So what we found, and you can go through everything, you can go through religion, you can go through ethnic background and all of these different things, is that that kind of diversity holds. Even if in a lot of cases the majority of someone's friends fit their beliefs, there are always some who don't.
That means that the mix of information you are getting through a social system like Facebook is going to be inherently more diverse than what you would have gotten from watching, you know, one of the three news stations and sticking with that, having that be your newspaper or your TV station 20 years ago.
Now the research also shows something a little bit less inspiring. We studied not only people's exposure in News Feed to content from different points of view but also what people actually click on and engage with. And by far the biggest filter in the system is not that the content isn't there, or that you don't have friends who support the other candidate or are of another religion, but that you just don't click on it. You actually tune it out when you see it.
You have your worldview and you go through and I think that we would all be surprised how many things that just don't conform to our worldview that we just tune out.
They just floated down the Feed.
Yeah, we just don't click on them. And I don't know what to do about that. You know, I think we should work on that. Presenting people with a diversity of information is, I do think, an important problem in the world, and one I hope we can make more progress on.
But right now the problem isn't that the diverse information isn't there. By any study, it is more present than it was in traditional media over the last generation. The problem is that we haven't gotten people to engage with it in higher proportions.