February 5, 2018

Facebook and its fact-checking partners will meet in Silicon Valley Tuesday to work out their differences — and the latter have some pretty specific asks.

The meeting (brokered by the International Fact-Checking Network) will take place at the technology company’s Menlo Park, California, headquarters and involve fact-checkers from organizations like Libération, Snopes and Factcheck.org. It comes more than a year after Facebook launched the project, which enables independent fact-checking outlets to review viral stories on Facebook and, if false, append a related fact check.

Angie Holan, editor of PolitiFact (a project of the Poynter-owned Tampa Bay Times), told Poynter that, overall, the outlet has benefitted from the Facebook partnership mainly in its ability to more rapidly detect viral hoaxes. But at Tuesday’s meeting, she hopes the tech company will share how the flagging mechanism is affecting news consumption.

“We’ve been really pleased with the partnership because it’s helped us identify the fake news really easily,” said Holan, whose outlet will be represented at the meeting by executive director Aaron Sharockman. “I hope to hear from them how they think it’s going as far as reducing the spread of fake news and informing people on Facebook.”

The timing of the meeting at Facebook is key. Last week, the tech company announced that it was expanding its project to Italy ahead of next month’s election — the fifth country to join the initiative. The platform has come under fire in recent weeks for several changes it’s made to the algorithm powering the News Feed, including decreasing the reach of publishers and pages in favor of personal interactions, as well as boosting the visibility of outlets readers say they trust most.

The four fact-checkers that Poynter spoke to mostly agreed they would like more transparency from Facebook, which, as Politico pointed out last month, seems set to open up its notoriously walled gardens a little at the meeting. Eugene Kiely, director of Factcheck.org, echoed Holan’s call for more information about how the flagging mechanism is working.

“I’d like to get a better understanding of the impact that the fact-checkers are having on reducing the dissemination of fake news, and how Facebook measures that,” he told Poynter in an email. “And I’d like to improve communications with Facebook. Too often Facebook makes major changes that directly affect us without any discussion or advance notice.”

In October, BuzzFeed News published a report based on a leaked email from Facebook to one of its fact-checking partners, which contains the only data released about the project to date. According to the email, future impressions of stories that fact-checking partners label as false decrease by 80 percent in the News Feed, but it takes an average of three days for those stories to be flagged.

In a Poynter article published in December, Facebook’s fact-checking partners weighed in on how they thought the project was going one year after its launch. PolitiFact and Factcheck.org said that, while the methodology and timeline for flagging fake news stories on Facebook aren’t perfect, the endeavor had been largely successful. But others doubted the effort’s premise entirely.

One of those fact-checkers is Brooke Binkowski — and she’s not hopeful for tomorrow’s meeting.

“Given that it has been more than a year and we are still struggling with just as much fake news and disinformation, if not more, than we were a year ago, I remain skeptical that any algorithmic tool — even with humans behind it like us — is going to help anything in a meaningful way,” said the managing editor of Snopes, who won’t be attending the event. “I think that literally the only thing that’s going to help is an army of moderators.”

David Mikkelson, founder and editor of Snopes, who is attending the Facebook meeting, told Poynter in an email that, while there are real challenges to ensuring the tool is effective, he’s a little more optimistic about the project going forward.

“I’d say we’re all looking for ways to make the Facebook fact-checking partnership work better and more efficiently for both sides,” he said. “It’s a rather large and complicated problem to tackle, and it makes sense for all of us to be talking with each other and with Facebook as a group rather than one-on-one.”

Beyond transparency, another key area of improvement that fact-checkers identified is the ability to apply fact checks to copycat fake news stories. Oftentimes one viral hoax will be republished across several fake news sites, accruing more reach than the original flagged story — but the fact checks don’t always follow them, both Mikkelson and Kiely said.

And, going forward, Holan is optimistic.

“They’ve made some significant improvements to the tool this year,” she said. “Because of those improvements, it allows us to identify material that needs to be fact-checked more quickly and to connect our fact checks to inaccurate information.”

“I think it’s going to be even better this year.”

Correction: This article has been corrected to more accurately convey the content of the leaked Facebook email obtained by BuzzFeed about the effect on reach of false news stories identified by fact-checkers. The email stated that impressions drop “once we receive a false rating from one of our fact checking partners”; we originally wrote that two fact-checkers had to weigh in.

While Facebook has not disclosed in detail the actions it takes on links rated as false news, it only moved from requiring two fact-checkers for a disputed flag to one fact-checker for a related article in December, after the email in question was sent.
