June 21, 2018

ROME — On Thursday, the second day of the fifth annual Global Fact-Checking Summit, the world’s biggest social network shared some news about its most visible effort to counter misinformation.

During a one-hour question-and-answer session in the morning, Tessa Lyons, a product manager at Facebook, gave an overview of the company’s fact-checking program — which allows fact-checkers to debunk hoaxes on the platform, decreasing their reach in News Feed by about 80 percent. The program has grown to 25 fact-checking outlets in 14 countries. (Disclosure: Being a signatory of the IFCN’s code of principles is a necessary condition for participation in the project.)

Near the middle of the talk, Lyons announced several updates to Facebook’s efforts to weed out misinformation on its platform, where hoaxes regularly outpace fact checks.

First, Lyons said that Facebook is now using natural language processing to detect duplicate fake news stories on the platform that fact-checkers have already debunked. That’s expected to cut down on the volume of hoaxes, which are often copied and pasted from previous fake news stories that have been down-ranked in News Feed.

“What we’re doing now is taking the false rating we get from fact-checkers, using natural language processing to identify all of the duplicate articles. And we’re being precise about this so we’re trying to avoid errors that are false positives, and therefore we’re probably missing some, but this is where we’re starting,” she said. “And once we find those duplicate stories we’re able to reduce all of their spread.”
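Facebook did not describe its detection system in technical detail. Purely as a rough illustration, near-duplicate matching of this kind can be sketched with character shingles and Jaccard similarity; the shingle size, threshold and helper functions below are assumptions made for the example, not Facebook’s actual pipeline.

```python
# Rough sketch of near-duplicate detection for already-debunked hoaxes.
# Illustrative only: the shingle size, threshold and helpers are assumptions,
# not Facebook's actual system.

def shingles(text: str, k: int = 5) -> set:
    """Break normalized text into overlapping character k-grams."""
    text = " ".join(text.lower().split())
    return {text[i:i + k] for i in range(max(len(text) - k + 1, 1))}

def jaccard(a: set, b: set) -> float:
    """Jaccard similarity between two shingle sets."""
    return len(a & b) / len(a | b) if a and b else 0.0

def find_duplicates(debunked, candidate, threshold=0.8):
    """Return indices of debunked stories the candidate closely copies."""
    cand = shingles(candidate)
    return [i for i, story in enumerate(debunked)
            if jaccard(shingles(story), cand) >= threshold]

# A copied-and-pasted hoax matches the story fact-checkers already rated false.
original = "Pricking a stroke victim's fingers with a needle can save their life."
repost = "Pricking a stroke victim's fingers with a needle can save their life!!!"
print(find_duplicates([original], repost))  # -> [0]
```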

Alexios Mantzarlis, director of the International Fact-Checking Network, questions Tessa Lyons, product manager at Facebook, during the Global Fact-Checking Summit in Rome on Thursday, June 21, 2018. (Photo/Giulio Riotta)

To test that new system, Lyons said Facebook looked at a false rating that a fact-checking project in France assigned to a story that claimed stroke victims could be helped by using a needle to prick their fingers. In that case, her team found a single hoax had been shared from more than 1,000 different URLs and 20 different domains to trick Facebook’s detection systems.

“You all know that it takes more time to fact-check false stories than it does to create them,” she said. “So we need to think about how we can scale the impact of fact-checking by expanding the actual actions that we’re able to take.”

In a related announcement, Lyons said Facebook will start taking more aggressive action against pages that consistently share misinformation. The company already reduces those pages’ distribution and removes their ability to advertise and monetize; now it will also use machine learning to predict which pages are likely to spread financially motivated misinformation, based on similar pages that fact-checkers have already identified.

That action, which a concurrent Facebook press release describes as an effort to “help curb foreign interference in public discourse,” targets the types of false pages that were created in countries like Macedonia to influence the 2016 U.S. presidential election.
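Lyons did not explain how that prediction works under the hood. As a hedged sketch only, one way to picture it is a toy classifier trained on pages fact-checkers have already flagged and used to score new, similar-looking pages; the features, labels and model choice below are invented for illustration and are not Facebook’s method.

```python
# Toy sketch only: score a new page by its resemblance to pages fact-checkers
# have already flagged. Features, labels and model choice are assumptions.
import numpy as np
from sklearn.linear_model import LogisticRegression

# Hypothetical per-page features: [share of posts rated false by fact-checkers,
# clickbait-ad density, fraction of posts copied from other domains]
X_train = np.array([
    [0.60, 0.90, 0.80],  # page already identified as spreading misinformation
    [0.45, 0.70, 0.90],  # page already identified as spreading misinformation
    [0.02, 0.10, 0.00],  # ordinary page
    [0.00, 0.20, 0.10],  # ordinary page
])
y_train = np.array([1, 1, 0, 0])  # 1 = flagged by fact-checkers

model = LogisticRegression().fit(X_train, y_train)

# A new page that resembles the already-flagged ones gets a high score.
new_page = np.array([[0.50, 0.80, 0.85]])
print(model.predict_proba(new_page)[0, 1])  # estimated probability of misinformation
```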

Third, Lyons announced that Facebook would start using the Schema.org ClaimReview markup, which allows Google to surface fact checks high up in search results. She said that is expected to cut down on the amount of manual labor that fact-checkers must do to get their ratings onto Facebook, while also getting ratings into the system faster, thereby decreasing the amount of time it takes to limit a hoax’s spread.

“We want to reduce the amount of administrative overhead that you have,” she said. “You having to put a ClaimReview markup on your reference articles and then also put them into our tool is bureaucratic and silly for us as a technology company.”
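For context, ClaimReview is a schema.org vocabulary that fact-checkers embed in their articles as structured data (typically JSON-LD), so platforms can read the claim, the rating and the debunked URL automatically. A minimal sketch of such a markup follows; the URLs, organization name and claim text are placeholders, not a real fact check.

```python
# Minimal sketch of a schema.org ClaimReview record, emitted as JSON-LD.
# The URLs, organization name and claim text are illustrative placeholders.
import json

claim_review = {
    "@context": "https://schema.org",
    "@type": "ClaimReview",
    "url": "https://example-factchecker.org/needle-stroke-hoax",  # the fact check
    "claimReviewed": "Pricking a stroke victim's fingers with a needle saves lives",
    "itemReviewed": {
        "@type": "CreativeWork",
        "url": "https://example-hoax-site.com/needle-trick"  # the debunked story
    },
    "author": {"@type": "Organization", "name": "Example Fact-Checker"},
    "reviewRating": {
        "@type": "Rating",
        "ratingValue": 1,
        "bestRating": 5,
        "worstRating": 1,
        "alternateName": "False"
    },
}

# Fact-checkers typically embed this in the article's HTML inside a
# <script type="application/ld+json"> tag so crawlers can pick up the rating.
print(json.dumps(claim_review, indent=2))
```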

In one example, Lyons said a fact-checker used ClaimReview markup to rate as false a rumor that Reese’s candy was going off the market several hours, if not days, before the rating was finally entered into Facebook’s system.

“That’s a failure on us to fail to predict and prioritize, but also an example of — if we were just better using ClaimReview, which we’re going to do going forward — we’d reduce that amount of time,” she said. “Things go viral so quickly that any hour we can save is critical.”

Finally, Lyons announced that Facebook has expanded its test allowing fact-checkers to debunk both manipulated and out-of-context photos and videos on the platform to France, Ireland, Mexico, and India. Before February, when the company gave Agence France-Presse the ability to debunk those kinds of hoaxes, fact-checking partners could only debunk hoaxes shared as article links.

The flurry of announcements at Global Fact comprises the most notable news about Facebook’s fact-checking program in more than six months.

In December, BuzzFeed News published a story based on a leaked email to Facebook’s fact-checking partners. It was the first to report that, once fake news stories are flagged by fact-checkers, their reach is decreased in News Feed by around 80 percent after three days. Between March and April, Facebook expanded its fact-checking project to 10 countries — all outside the West.

But Thursday’s announcements weren’t the first pieces of news that the platform shared at Global Fact.

On Wednesday, Facebook user experience researcher Grace Jackson gave a presentation on the company’s research into how users respond to fact checks on the platform, aimed at figuring out what was working and what wasn’t.

Jackson said she found that, because people often scroll quickly and get distracted while using Facebook, they frequently make incorrect assumptions about posts, particularly those labeled as fact checks. So Facebook tweaked the feature to include ratings in headlines and a "fact-checker" badge.

Overall, fact-checkers at Global Fact had pretty positive views of Facebook’s new announcements.

“Using machine learning to identify hoaxes and conspiracy theories to reduce their spread is groundbreaking. It’s a new level of sophistication for slowing the spread of false information,” said Angie Holan, editor of (Poynter-owned) PolitiFact, in a message. “The Facebook partnership has been very powerful and innovative in identifying potentially false information that needs fact-checking. I’d like to see more platforms launch similar programs.”

The addition of ClaimReview in particular could really help some fact-checkers in the long run. Still, some felt that, despite its transparency during Global Fact, Facebook didn’t give enough insight into how the fact-checking program is working overall.

“ClaimReview is a great thing if they are plugging. They seem to be pretty proactive,” said Govindraj Ethiraj, editor of Boom Live — an Indian fact-checking project that Facebook temporarily partnered with for last month’s election — in a message. “The part about duplicates, etc. was useful. Broadly, lots of small but important changes and improvements, but no big-picture insights.”

And other fact-checkers from countries outside the West were skeptical that Facebook could actually address the problems that they run into on a daily basis.

"It can help us not spend too much time flagging like 40 contents inside Claim Check," said Tai Nalon, director of Aos Fatos — a Facebook fact-checking partner in Brazil — about ClaimReview in a message. "However, we are not that sure if these changes are really happening or what changes they are actually making."

“I understand that they have the best intentions by improving Facebook’s environment, but each country has its own issues. I think they still are too Menlo Park-centered.”

Editor's note: This story has been updated with the names of the four countries where Facebook is now allowing fact-checkers to debunk images and videos.

