August 30, 2016

Media criticism of Facebook is pouring in after a lack of editorial oversight on Monday led to a new Trending Topics-related hiccup (in case you missed it, the social network highlighted a totally fake story about Fox News host Megyn Kelly).

While the previous brouhaha erupted over the tricky ethical issue of whether Facebook should balance political perspectives in its highlighted news stories, the current problem seems much more straightforward, with an obvious fix.

Facebook just needs to hire a team of fact-checkers. No need to splurge, either: A team of 10 or so would do the trick.

The fact-checkers would steer clear of personal posts and pages. Your crazy uncle’s tirades would be outside the remit of the Facebook Fact Team (that’s your job). But public pages and posts would be fair game. Priority would go to pages that categorize themselves in the “Media/News/Publishing” sector and to posts with high engagement or reach.

So what would the Facebook Fact Team do, specifically? Here are three possible tasks:

  1. Weed out sources of information that consistently peddle fake news. Facebook could introduce a “three strikes and you’re out” policy that would decrease the reach of pages that repeatedly share links to fake stories hosted on their own sites.
     
    Once you’ve shared enough fake stories, your reach automatically gets cut to one-tenth of its potential size (a rough sketch of this rule appears after this list). As BuzzFeed hoaxbuster Craig Silverman noted recently, fake news websites are experimenting with business models just like legitimate news outlets. Reducing the traffic Facebook grants them would dramatically weaken the incentive to publish fake stories.
  2. Aggressively reduce the reach of posts that are discovered to be false. Facebook is doing so already, but there are at least two problems with the current system.
     
    First, the signal originates from readers rather than professional fact-checkers, and readers may be imprecise or have other motives for flagging content. Second, Facebook isn’t transparent about how severe the reach penalty for fake news is, or about why de-emphasized stories have been taken down a peg. The current annotation, “many people on Facebook have reported that this story contains false information,” should be spelled out in shiny red letters and made far more prominent.
  3. Spot claims and articles that have been fact-checked elsewhere and offer a “related fact check” underneath the original article, just as Facebook currently surfaces related pages and stories. This could be deployed only in the most extreme cases, where established fact-checkers have reported the story to be entirely false.
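
To make the “three strikes” mechanic from the first item concrete, here is a minimal, purely illustrative sketch in Python. Only the three-strike threshold and the one-tenth reach cut come from the proposal above; the function names, data structures and example page are invented for illustration and bear no relation to how Facebook actually ranks content.

```python
# Hypothetical sketch of the "three strikes and you're out" reach penalty.
# The threshold (3 strikes) and the 1/10 multiplier come from the article;
# everything else is made up for the sake of the example.

STRIKE_LIMIT = 3          # "three strikes and you're out"
PENALTY_MULTIPLIER = 0.1  # reach cut to one-tenth of its potential size

def record_fake_story(strikes: dict, page_id: str) -> None:
    """Log one confirmed fake story shared by a page."""
    strikes[page_id] = strikes.get(page_id, 0) + 1

def adjusted_reach(strikes: dict, page_id: str, potential_reach: int) -> int:
    """Return the page's reach after applying the strike penalty, if any."""
    if strikes.get(page_id, 0) >= STRIKE_LIMIT:
        return int(potential_reach * PENALTY_MULTIPLIER)
    return potential_reach

# Example: a page that has shared three confirmed fakes sees its reach collapse.
strikes = {}
for _ in range(3):
    record_fake_story(strikes, "example-news-page")
print(adjusted_reach(strikes, "example-news-page", 1_000_000))  # -> 100000
```

The point of the sketch is simply that the rule is mechanical: once professional fact-checkers have flagged a page three times, the penalty can be applied automatically, with no further editorial judgment needed per post.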

Is Facebook game? In Rome earlier this week, Mark Zuckerberg repeated that he did not want Facebook to become a media organization. Regardless of whether that ship has sailed, hiring fact-checkers does not have to compromise his vision. It would, in fact, reinforce Facebook’s stated “News Feed Values,” which say the network’s experience should be “informative.”

And it would represent a private company responding to concerns from its users about the trustworthiness of the information available on the network. Most importantly, it would have the added benefit of helping reduce misinformation.
