May 9, 2016

Gizmodo ignited a social media firestorm today when it reported that curators for Facebook’s “trending” section regularly put the kibosh on right-leaning news stories.

The story, attributed to anonymous sources, was notable in part because of Facebook’s status as a media behemoth that directs a firehose of traffic at news organizations every day. But it also made waves because Facebook has taken pains to cast itself as a nonpartisan intermediary for news, a company that serves up stories without a slant. In a response to Gizmodo’s story, Facebook said it was taking allegations of bias “very seriously” and reaffirmed its commitment to providing views “across the political spectrum.”

In little more than a decade, Facebook has gone from a dorm room startup to one of the largest distributors of news and information worldwide. So how should we react when one of the most important middlemen for journalism is revealed to be composed of human beings with human biases? Below is a question-and-answer session with Kelly McBride, vice president of The Poynter Institute and its media ethicist, on what today’s report means for the news industry.

If Gizmodo’s claim is accurate and Facebook’s news curators were deliberately avoiding conservative news, does this pose a problem? If so, why?

This is a problem because it demonstrates the ability to manipulate the marketplace of ideas from behind the scenes. That’s nothing new, but it undercuts social media proponents who say Facebook is a more democratic news distribution forum.

We’ve known for a long time that algorithms have biases programmed into their selection process. But this claim that humans were overriding the selection process undermines the whole idea of algorithmic news lists and feeds. It suggests to me that Facebook didn’t have complete faith in its user base. This is especially true when you consider the anonymous source’s claim that the team was instructed to find a more neutral source for information that was trending based on initial reporting from a conservative site.

I suspect there is more information about this practice, described by unnamed sources, that will help us understand what Facebook was trying to accomplish, because it doesn’t make a lot of business sense for Facebook to secretly manipulate trending topics based on political viewpoint.

Wouldn’t that alienate a large portion of their users? There have to be more details that would help us understand this. Maybe, as one source speculated, they were refining the algorithm?

News organizations obviously have biases about what they choose to feature. Is Facebook’s preference here really all that different from a newspaper, TV station or digital startup?

This is a little harder for the audience to discern, so it’s less transparent and therefore different. It’s easier to spot a human bias when you know to look for one. Until now, we thought the trending topics team was just writing headlines and descriptions, not actually influencing our understanding of what is trending.

This story reveals that Facebook’s recommendations aren’t entirely algorithmically driven — they’re made in part by humans, with human preferences. Are there steps Facebook can take to reduce the appearance of bias in its selections?

If you want the audience to trust you, transparency is always the answer whether you are talking about human bias or machine bias. That’s easier said than done, in both cases. It’s hard for humans to accurately describe their biases. And algorithms need a bit of opacity if they are going to be resistant to manipulation.

If Facebook could tell us more about what the algorithm privileges and what the algorithm dismisses, that would be helpful. If Facebook could also tell us what the humans do and don’t do, we would have more information than we do now and more of a reason to trust what we see.

Facebook is a huge determiner of the news we see online. Does it have a responsibility to be transparent about how it’s serving us stories?

As the No. 1 driver of audience to news sites, Facebook has become the biggest force in the marketplace of ideas. With that influence comes a significant responsibility. There are many ways that Facebook could live up to that responsibility. In the past, Facebook has affirmed that it recognizes its democratic obligations. But it can always do more.

Are there any other steps Facebook could take?

Here’s a crazy idea: What if Facebook (and other companies with a clear ability to influence the marketplace of ideas) had a public editor, like The New York Times does? That person would be able to research and write about the company from the public’s point of view, answering questions and explaining the values that drive certain decisions.

Poynter’s managing editor, Benjamin Mullin, contributed to this report.
