May 3, 2016

Facebook poses two key challenges for the transmission of factual information.

The first and most obvious is the lack of editorial oversight and the seamless distribution of what is popular regardless of whether it is accurate. This has meant fakes can reach millions of eyeballs, while the wording of Facebook posts can make months-old articles resurface as current news.

Facebook has been testing solutions to address this issue, if for no other reason than that users don't seem to place much trust in the news they see on the network.

Since January 2015, for example, the company has reduced the News Feed distribution of posts that many users flag as a “false news story.” Stories downgraded in this way are also tagged with a warning for subsequent readers.

Still, Facebook will act only when an (unspecified) number of readers recognize a fake. Moreover, a BuzzFeed analysis has shown that the results have been underwhelming.
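In rough pseudocode terms, the policy amounts to threshold-based downranking. The sketch below is purely illustrative; the threshold, demotion factor and field names are assumptions, since Facebook has not disclosed how the mechanism is implemented.

```python
# Hypothetical illustration only: Facebook has not published this logic.
# A minimal sketch of threshold-based downranking for reader-flagged posts.

from dataclasses import dataclass

FLAG_THRESHOLD = 25      # assumed value; the real (unspecified) threshold is not public
DEMOTION_FACTOR = 0.2    # assumed multiplier applied to a demoted post's score


@dataclass
class Post:
    post_id: str
    base_score: float          # whatever ranking score the feed would otherwise assign
    false_news_flags: int = 0  # number of "false news story" reports
    warning_label: bool = False


def apply_flag_policy(post: Post) -> float:
    """Return the post's adjusted feed score after the flagging policy."""
    if post.false_news_flags >= FLAG_THRESHOLD:
        post.warning_label = True                 # warning shown to subsequent readers
        return post.base_score * DEMOTION_FACTOR  # reduced distribution in News Feed
    return post.base_score


if __name__ == "__main__":
    hoax = Post("hoax-123", base_score=8.0, false_news_flags=40)
    print(apply_flag_policy(hoax), hoax.warning_label)  # 1.6 True
```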

One next step may be to turn the annotation of fake news on Facebook into a more elaborate tool that allows users to specify what is fake or misleading in a post, as Tom Trewinnard of Meedan noted last year.

This tool could serve not just to abate the spread of fake news but to prevent it. If a user is about to post a link that has been annotated as fake, they could be alerted to the community’s doubts about it before posting. The user would be free to ignore the warning — but this would likely result in reduced sharing.

This warning sign, a “check this first” alert, was suggested by Brad Scriber of National Geographic at the Tech & Check conference in March.
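A minimal sketch of how such a pre-posting alert could work, with an invented annotation store and invented wording (nothing here reflects an actual Facebook or Meedan tool):

```python
# Hypothetical sketch of the "check this first" idea: intercept a share of a
# link the community has annotated as fake and warn the poster before posting.

from typing import Optional

COMMUNITY_ANNOTATIONS = {
    # url -> short summary of the community's doubts (illustrative data)
    "http://example.com/miracle-cure": "Readers flagged the central claim as unsourced.",
}


def check_before_posting(url: str) -> Optional[str]:
    """Return a warning message if the link has been annotated as fake or misleading."""
    doubt = COMMUNITY_ANNOTATIONS.get(url)
    if doubt is None:
        return None
    return f"Check this first: other readers have questioned this link. {doubt}"


if __name__ == "__main__":
    warning = check_before_posting("http://example.com/miracle-cure")
    if warning:
        print(warning)  # the user can still post, but sharing would likely drop
```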

In a statement, Facebook said it’s “always working to improve” the experience on the social network.

“We have heard from people that they want to see fewer hoaxes in News Feed, so are actively working on more ways to reduce their prominence in feed,” a spokesperson told Poynter.

The second challenge is that Facebook’s News Feed isn’t built in a way that makes it easy to deliver corrections to audiences that have consumed inaccurate stories.

Brutally simplified, Facebook’s algorithm prioritizes delivery of content similar to what you have clicked on, liked and shared in the past.

This means News Feed turbocharges our natural tendency towards spotting and trusting information that confirms our existing beliefs — ultimately leading us into ever-more homogeneous echo chambers.
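To make that tendency concrete, here is an illustrative toy ranker, not Facebook's actual algorithm: simply ordering candidate posts by how much their topics overlap with a user's past engagement is enough to push "the other side" to the bottom of the feed.

```python
# Illustrative sketch (not Facebook's algorithm): ranking items by similarity
# to past engagement is enough to produce an echo-chamber effect.

from collections import Counter
from typing import Dict, List


def rank_feed(candidates: List[Dict], engagement_history: List[str]) -> List[Dict]:
    """Order candidate posts by overlap between their topics and past engagement."""
    interest = Counter(engagement_history)  # topics the user clicked, liked or shared

    def score(post: Dict) -> int:
        return sum(interest[topic] for topic in post["topics"])

    return sorted(candidates, key=score, reverse=True)


if __name__ == "__main__":
    history = ["candidate_a", "candidate_a", "sports"]
    posts = [
        {"id": 1, "topics": ["candidate_a"]},  # confirms existing interests
        {"id": 2, "topics": ["candidate_b"]},  # the "other side" sinks to the bottom
        {"id": 3, "topics": ["sports"]},
    ]
    print([p["id"] for p in rank_feed(posts, history)])  # [1, 3, 2]
```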

These may be “fact deserts” like the conspiracy theory-addled echo chambers spotted in research by Walter Quattrociocchi, a computer scientist at the IMT School for Advanced Studies in Lucca, Italy, and others. In these echo chambers, users consistently share one of two qualitatively different types of content: either unsourced rumors or material with a verifiable author and source.

An increasingly complicated News Feed algorithm makes this hard to assess, but it is also probable that the social network is creating exclusive “fact clubs,” where other sides’ arguments are almost entirely shut out of your News Feed. As a foreign millennial living in the United States, I have yet to see a single post supporting Trump on my News Feed, for example.

(Less anecdotally, research on the “homophily” of our friend groups conducted by Facebook itself shows that about 23 percent of people’s friends claim an opposing political ideology. These numbers have been challenged by Zeynep Tufekci, a sociology professor at UNC.)

A fact check that rates a Hillary Clinton claim “false” is more likely to be shared and liked by opponents of the Democratic frontrunner than by her supporters. This raises its visibility on the News Feeds of their like-minded friends, reaching audiences who may not need a cue to verify Clinton’s claims while eluding those who most need to see the fact check.

A video Facebook published in April on the U.S. presidential election seems to implicitly recognize this problem. The video, which has almost 3.4 million views at the time of writing, invites users to “find the other side of the story” and to be open to “the .003 percent chance that we might be wrong.”

This is likely as much an attempt by Facebook to promote use of its search bar as a genuine effort to burst echo chambers.

Regardless of the company’s motives, the approach does seem unique: We don’t expect Mozilla to warn us when we’re on a fake news site, nor our local newsstand to advise us to pick up a newspaper with a different political leaning from the one we usually buy.

Ultimately, if users don’t care about diversifying their news diet, should Facebook?

Partisan consumption of news and propagation of fake information predated Facebook and will undoubtedly outlive it. The challenge of viral fakes and echo chambers is, however, one that Facebook has made quantifiable in ways that weren’t quite possible before.

Combating this challenge will require both a broad coalition and a different approach from the company.

Facebook currently changes features when they are not working for users and provides publishers with insights into what does and doesn’t resonate with their audiences.

What fact-checkers and debunkers actually need is information on what works with audiences that do not already follow them: audiences that are more likely to have consumed the inaccurate claim or fake news in the first place.

Quattrociocchi says an effective approach could be to evaluate how penetration of fact checks and debunks in different echo chambers varies with slight variations in the tone and content of the post. Greater investment in this space will be necessary, he adds. “There is a lot of buzz around fighting misinformation, but funding hasn’t always followed good intentions.”
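In practice, the evaluation Quattrociocchi describes could start as simply as counting how often each variant of a debunk is shared inside each echo chamber. The sketch below assumes invented share data and field names for illustration:

```python
# Illustrative sketch: compare how far each variant of a debunk (e.g. different
# tone or framing) penetrates different echo chambers. All data is invented.

from collections import defaultdict
from typing import Dict, Iterable


def penetration_by_variant(shares: Iterable[Dict]) -> Dict[str, Dict[str, int]]:
    """Count shares of each debunk variant within each echo chamber."""
    counts: Dict[str, Dict[str, int]] = defaultdict(lambda: defaultdict(int))
    for share in shares:
        counts[share["variant"]][share["echo_chamber"]] += 1
    return {variant: dict(chambers) for variant, chambers in counts.items()}


if __name__ == "__main__":
    observed = [
        {"variant": "neutral_tone", "echo_chamber": "conspiracy"},
        {"variant": "neutral_tone", "echo_chamber": "science"},
        {"variant": "sarcastic_tone", "echo_chamber": "science"},
        {"variant": "sarcastic_tone", "echo_chamber": "science"},
    ]
    print(penetration_by_variant(observed))
    # {'neutral_tone': {'conspiracy': 1, 'science': 1}, 'sarcastic_tone': {'science': 2}}
```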

Ultimately, changes to News Feed are user-driven, a company representative said. If enough of the users Facebook tracks and surveys on a semi-permanent basis decide that something is a problem, the company will look for ways to fix it.

It is not clear, however, that a solely user-driven process is up to the challenge.
