Op-eds and editorials may be fact-checked, says Facebook

Third-party fact-checkers will also have two new ratings at their disposal: “altered” and “missing context”


Facebook wants its fact-checking partners and users to know that opinion pieces are not exempt from being fact-checked. Not even if they are framed as op-eds or editorials.

In a statement sent to the International Fact-Checking Network on Monday night, the company said it wanted to “clarify some confusion,” adding that “presenting something as opinion isn’t meant to give a free pass to content that spreads false information.”

For many fact-checkers, opinion had always been outside the scope of the Third-Party Fact-Checking Program — until today. (Full disclosure: Facebook requires that its fact-checking partners be verified signatories to the IFCN’s Code of Principles.)

In its latest announcement, however, Facebook clarified that policy, saying “content presented as opinion but based on underlying false information may still be eligible for a rating.” The company also updated the guidelines in its Publisher Help Center last month.

Chico Marés, a reporter for Brazilian fact-checking organization Agência Lupa, said this announcement clears up what he called a “gray area” in Facebook’s policy.

“Although we, fact-checkers at Lupa, have always understood that this sort of content was eligible for fact-checking, sometimes publishers would see the words ‘op-ed’ or ‘editorial’ as a free pass to misinform. Facebook rules were not clear about that, at least to the general public. So a clearer statement from the company in this regard is certainly a positive move.”

Summer Chen, editor-in-chief of Taiwan Fact-Check Center, in Taiwan, agreed.

“My understanding (until today) was that we could not fact-check politicians’ opinions even if they brought false information. We surely would have liked to have flagged some.” A spokesperson for Facebook clarified that statements from politicians are still exempt from the third-party fact-checking program.

Pablo Medina Uribe, editor at ColombiaCheck, in Colombia, wished this had been clearer a few months ago, when the COVID-19 quarantine started in his country.

“In one of the biggest newspapers in Colombia, they gave a column to a writer, a novelist who is very famous for being contrarian. He used it to spread misinformation about coronavirus and there was nothing we could do about it.”

Jency Jacob, managing editor of BOOM, in India, was surprised by Facebook’s announcement and cautioned fact-checkers to be extra careful with their approach to opinions.

“This is an interesting move from Facebook because of course an op-ed can be written with forged data, but I suggest extra caution. We should only fact-check opinion when data is being misused to make a false point. Don’t raise anger against fact-checkers,” he said.

Two new ratings

Facebook is also adding two new ratings to its Third-Party Fact-Checking Program in an effort to increase clarity about misinformation flagged by partners. “Altered” will signify content that contains manipulated images and videos.

For example, in January, amid escalating tensions between the United States and Iran over the U.S. airstrike that killed Iranian general Qassem Soleimani in Baghdad, a photo of former U.S. President Barack Obama shaking hands with former Indian Prime Minister Manmohan Singh was manipulated to give the appearance Obama was shaking hands with Iranian president Hassan Rouhani. There were thousands of versions of this photograph floating around Facebook to which fact-checkers could have applied this “altered” rating.

“Missing context” is the second new rating, which the announcement suggests should be used when fact-checkers encounter content that would be misleading without additional information. Facebook’s announcement gave the example of a video clip that has been selectively edited to change a political official’s statement from “I would support this initiative if …” to “I support this initiative.”

Chen, from Taiwan, emphasized these two new ratings will bring more accuracy to the fact-checking service.

“Before the ‘altered’ rating was available, it was hard to say what was false in a Facebook post: the image or the caption that followed it. Or if both of them were false.”

In Colombia, Medina Uribe also welcomed the greater precision.

“When people contact us to complain about a fact-check we published, sometimes they do so based on other things that are not exactly the content we analyzed and rated. So, I think these two new ratings will be good to differentiate that, too.”

Jacob, from India, said he sees this move from a technology perspective.

“The more we, fact-checkers, break down the information and tell Facebook and publishers where the falsehood is, the better Facebook’s artificial intelligence becomes. While having more ratings is good for fact-checkers to be more accurate, it is also good for Facebook’s AI tools.”

In Brazil, Tai Nalon, director of Aos Fatos, said she likes the new features, noting that “most pervasive misinformation is not blatantly false anyway.” She did, however, raise two concerns:

“There is still some training to be done (with fact-checking partners), and I don’t know when it will be launched globally.”

Keren Goldshlager, who manages Integrity Partnerships at Facebook, said these new changes came from feedback the company got from its partners and community.

“Since we launched our fact-checking program in 2016, we’ve grown our partnerships to include more than 70 organizations around the world. In our ongoing conversations with fact-checkers, many have said it would be useful to have ratings that more explicitly reflect the types of content they’re seeing on our platforms. We’ve added these new ratings to help make sure people have more precise information to judge what to read, trust, and share. We’ve built an unmatched program and are committed to continuously improving it, with feedback from our partners and our community.”

Facebook said that posts rated “altered” will have their distribution dramatically reduced, as is already the case with those flagged “false.”

For content that’s “missing context,” Facebook will focus on surfacing more information from its fact-checking partners.

Cristina Tardáguila is the associate director of the International Fact-Checking Network and the founder of Agência Lupa. She can be reached at ctardaguila@poynter.org.

Harrison Mantas is a reporter for the International Fact-Checking Network covering fact-checking and misinformation. Reach him at hmantas@poynter.org or on Twitter at @HarrisonMantas.