September 4, 2020

After Facebook announced plans Thursday to scale back on political ads and increase voter information ahead of the 2020 elections, fact-checkers offered some additional suggestions for how the tech platform might handle potential misinformation.

In a post, Facebook CEO Mark Zuckerberg wrote the company would start labeling “content that seeks to delegitimize the outcome of the election or discuss the legitimacy of voting methods.” This comes in the wake of a similar post by Zuckerberg in June announcing the company would begin labeling some political content on the platform. FactCheck.org director Eugene Kiely advocated Facebook go a step further.

“I think Facebook should provide ‘related articles’ side by side with political ads and include fact-checking articles, if any are available,” Kiely said in an email to the International Fact-Checking Network. “Facebook doesn’t need to deny access to political ads, but it should provide more context and give users the benefit of any additional evidence that supports or refutes the ad.”

Currently, Facebook labels content evaluated by members of its Third-Party Fact-Checking Program, but largely exempts politicians from the scrutiny out of a belief that “by limiting political speech we would leave people less informed about what their elected officials are saying and leave politicians less accountable for their words,” according to Facebook’s Business Help Center page. (Full disclosure: Facebook requires that its fact-checking partners are verified signatories to IFCN’s Code of Principles).

Kiely spoke approvingly of the company’s move to label attempts by politicians to prematurely declare victory on election night, but added, “I’d like to know more about how these posts will be labeled — because, as we know from the third-party fact-checking initiative, labels are important.”

Lead Stories co-founder Maarten Schenk echoed Kiely’s sentiment about Facebook’s labeling. Speaking as part of a panel on labeling at Global Fact 7, Schenk discussed how Lead Stories overhauled its labeling to more clearly communicate its fact-checks.

“Instead of just saying this is false or misleading, we explain why in one or two or three words, and make it stand out,” Schenk said. “And that’s not the case with ads.”

PolitiFact executive director Aaron Sharockman said he understood Facebook’s intent, but couldn’t speak to its effectiveness.

“It remains clear to me that Facebook is trying to create a balance between allowing people and politicians to share their views and values, and to stop those same people from spreading harmful, false information,” Sharockman said. “Are they doing it exactly the way I would? Probably not. Would the way I want to do it be better? I don’t know.”

Both Kiely and Schenk agreed that Facebook has been more active in its steps to prevent the platform from unduly impacting the 2020 election. Kiely praised the tech company’s efforts to thwart coordinated inauthentic behavior.

“That’s the most important thing it can do: find and remove fake pages that spread misinformation and/or suppress the vote,” he wrote.

Schenk said more could be done to thwart some of these groups, but advised patience. He compared it to the early days of the Third-Party Fact-Checking Program.

“They steadily added more resources, more fact-checkers, better tools, and so right now it’s vastly bigger and working better than ever before,” he said. “And I suspect the same will happen with these other policies.”

Harrison Mantas is a reporter for the International Fact-Checking Network covering fact-checking and misinformation. Reach him on Twitter at @HarrisonMantas.
