October 21, 2015

External fact-checking is growing worldwide, but what do we know about its effects?

There are several anecdotes: Australian members of parliament (MPs) regularly cite ABC Fact Check; the Democratic Alliance (South Africa’s largest opposition party) responds promptly and in detail to Africa Check enquiries, providing the sources its politicians used for their claims; Pagella Politica* forced the Vice President of Italy’s Chamber of Deputies to retract a statement on immigrants spreading tuberculosis. In the US, fact-checking sites reach hundreds of thousands of readers, and political elites are distinctly aware that their claims will receive this scrutiny.

Yet concretely measuring the impact of fact-checking is “one of the biggest challenges in this field”, according to Brendan Nyhan, Assistant Professor in the Department of Government at Dartmouth and author of several studies on the impact of fact-checking. “People see deception in politics and think fact-checking has failed.”

Thanks in part to the impetus of the American Press Institute (API), academic attention to the impact of fact-checking has helped bring the discussion beyond the anecdotal. A standard approach to categorizing the impact of fact-checkers focuses on three groups: voters, politicians and the media.

Voters
Nyhan and Jason Reifler, Senior Lecturer at the University of Exeter, found that readers in the US welcome fact-checking both in principle and when randomly exposed to the format, though not uniformly. Fact-checks also seem to work: Emily Thorson, Assistant Professor at GWU, showed that when faced with a fact-check of a previously held misconception, readers do change their minds. The efficacy of the instrument in correcting factual misperceptions does depend, however, as Michelle Amazeen, Assistant Professor at Rider University, and others have written, on the way fact-checks are presented (and indeed corrections can backfire).

Politicians
In a 2013 study, Nyhan and Reifler showed that state legislators who were warned that they would be fact-checked were significantly less likely to receive a negative rating from PolitiFact. On the flip side, we know from a report by Mark Stencel for API that politicians ‘weaponize’ fact-checking by distorting fact-checkers’ findings.

Media
The effect on the media is perhaps harder to pinpoint. According to Amazeen, mentions of the term ‘fact-checking’ in US newspapers rose by 900% between 2001 and 2012; Nyhan, Reifler and Lucas Graves, Assistant Professor at the University of Wisconsin, note that the practice can be spread by appealing to journalists’ professional values.

More remains to be done, particularly in two directions. The first is to expand the study of fact-checkers’ impact beyond its current concentration on the USA. To this end, the IFCN and the Duke Reporters’ Lab will collect a systematic database of anecdotes from fact-checkers around the world that we hope will provide a good starting point for further discussion. Moreover, Africa Check, Chequeado and Pagella Politica each hope to have an independent impact assessment conducted by the end of the year.

The second is for fact-checkers to become more creative and test potential new metrics of impact. They should consider what they could learn by exploring the data gathered by platforms like Disqus, Twitter and Facebook. Work by Kate Starbird, Assistant Professor at the University of Washington, and by the RumorLens team looks at how misinformation and the related corrections spread on Twitter. Adapting this framework to fact-checking should be possible, though Nyhan warns that “this can be very tricky”: fact-checks often tackle complex statements that are difficult to track automatically once paraphrased, and fact-checkers are more likely to address a lie once it has already spread widely.

Still, fact-checkers need to start by gathering data on the reach of their work that goes beyond mere traffic figures and making it available to external researchers. They could also run A/B tests on their headlines, putting to practical use the academic evidence that corrections are taken up differently depending on how they are phrased. The effect of fact-checking on other media should be judged, to the extent that this is possible, not merely by the coverage fact-checkers receive but also by the uptake of practices specific to fact-checking, starting with the meticulous citation of sources.
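One concrete way to act on that evidence is to compare how two phrasings of the same correction perform with readers. The sketch below is purely illustrative, assuming a fact-checker can export impression and click counts for each headline variant from its CMS or analytics tool; the headlines and numbers are invented, and the two-proportion z-test is just one standard way to check whether a difference between variants is likely to be real rather than noise.

```python
# Illustrative sketch of an A/B test on two phrasings of the same correction.
# The headlines, impression counts and click counts below are invented;
# replace them with figures exported from your own analytics tool.
from math import sqrt
from statistics import NormalDist

# Hypothetical variants: (headline, clicks, impressions)
variant_a = ("No, MPs did not triple their own pensions", 480, 12000)
variant_b = ("Fact-check: the claim about MPs' pensions", 395, 12100)

def two_proportion_z_test(clicks_a, n_a, clicks_b, n_b):
    """Two-sided z-test for the difference between two click-through rates."""
    p_a, p_b = clicks_a / n_a, clicks_b / n_b
    p_pool = (clicks_a + clicks_b) / (n_a + n_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))
    return p_a, p_b, p_value

rate_a, rate_b, p_value = two_proportion_z_test(variant_a[1], variant_a[2],
                                                variant_b[1], variant_b[2])
print(f"Variant A click-through rate: {rate_a:.2%}")
print(f"Variant B click-through rate: {rate_b:.2%}")
print(f"p-value for the difference:   {p_value:.3f}")
```

A result like this would only tell a fact-checker which phrasing attracts more readers, not which one corrects beliefs more effectively; the latter still requires the kind of survey experiments the researchers cited above have run.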

Seventy-five fact-checking organizations are currently active, according to the Duke Reporters’ Lab database. Though often swamped by their “day job” of fact-checking public figures, they should also be actively studying ways to measure their impact.


*Full disclosure: I was Managing Editor of Pagella Politica before joining Poynter and I remain on the board of that editorial project. I don’t intend for this fact to affect my coverage in any way, but I will note it, for transparency, any time I cover Italian fact-checkers on this site.
