Should journalists outsource fact-checking to academics?
Fact-checking, with its reliance on statistics and experts, longer publishing times and occasionally unsexy findings, is in several ways more akin to academia than it is to other parts of journalism.
Some fact-checkers even physically reside within a university: Factcheck.org at the University of Pennsylvania and Africa Check at Johannesburg's Wits University.
Given the broad range of claims politicians make, it is impossible for short-staffed fact-checking operations to employ subject matter experts on every single topic. They usually rely on curious and rigorous generalists who are able to find the key data and studies and then ask experts for a second look. So why not have the academics themselves do the actual work?
In a nutshell, that is what The Conversation does. Born in Australia in 2011, it expanded to the UK in 2013 and now has three additional pilot sites covering Africa, France and the United States. Its model is succinctly summarized in its motto "Academic rigor, journalistic flair." The Conversation has been quoted in the Australian Parliament and is one of the three main experiments in fact-checking born in Australia since 2013 (read about a second one, ABC Fact Check, here).
This approach - academics write, journalists edit - applies not just to its fact-checking work but to all of its content, and it has already been written about.
However, as The Conversation US prepares to publish fact checks of its own, following in the footsteps of the fact-checking sections of the Australian and UK editions, I return to the model to study its pros and cons.
How it works
"The Conversation has quite a lengthy and structured process," Sunanda Creagh, FactCheck editor for The Conversation Australia, told me in an email. Having seen it spelled out, I would rate that claim "Entirely True."
The journalists initiate the fact check by finding a claim worth checking and commissioning a relevant academic. This requires finding "one who can be dispassionate in their approach and chat about how they'd tackle it, how quickly they could do it and what data sets they'd use to test the assertion." Once the academic is chosen and agrees, they receive a detailed brief which includes an invitation to "keep it tight -- preferably around 800 words."
Academics are also asked to avoid jargon or emotive language: "This is about educating the audience, not trying to win a 'gotcha' moment." I would add this is probably an invitation academics need less than journalists - and another reason why fact-checking is closer to academia than other parts of the industry.
While the academic writes, the journalists also contact the person who made the statement to ask for any data to support the assertion being fact-checked.
Crucially, the journalists look for a second academic with a similar background to serve as the blind reviewer. To keep the author's identity unknown to the reviewer, the journalists won't "disclose the gender of the main author and try to find a blind reviewer from a different university or state from the main author."
The piece is sent to the blind reviewer. As this passage is the most distinctive part of The Conversation's model, I prefer to quote Creagh in full on how it works:
They read over the draft and check the author 'got it right'. Have they sourced their assertions properly? Did they cherry pick the data? Did they accurately convey the evidence-based consensus among experts or have they gone out into fringe territory? Is there a way to communicate these ideas more clearly? If the reviewer sees room for improvement, they convey that to me and I workshop those improvements with the original academic author. If the blind reviewer is happy with the FactCheck, they write a short one-par review at the end saying something like "This is a sound analysis" and adding any further points they feel are needed.
The fact check is then edited, which "usually means putting a nice lead on it, editing out any opinion or jargon that's crept in. I also put on my 'Devil's Advocate' hat." The final draft goes to a senior colleague for "copytasting," i.e. a fresh look at the whole thing, and then to the academics for a final review.
"If there's anything they're not rock-solid, stand-up-in-court-and-say-it certain about, I ask them to delete it: when in doubt, take it out."
Academics are great - but journalists aren't half bad either
So what do academics bring to fact-checking?
- Expertise. While experienced fact-checkers are well-versed in what can be fact-checked and how, they are unlikely to beat a subject matter expert. As Creagh told me, "I have had academics gently suggest to me that, due to their expertise, they can tell me that a certain statement is not really checkable."
- Accuracy. Fact-checkers can take all the necessary precautions, but may not always have the technical skills that are required to double-check certain claims.
As the process detailed above makes clear, though, there is still plenty left for journalists to do - and good reason to keep them in the picture.
- Speed. Fact-checking is by nature a slower process than other forms of journalism. Introducing an outside author with many other commitments will inevitably slow it down further compared to running the whole process in-house. Journalists can (a) reduce the work required from the academics in order to get to publication faster and (b) develop expertise in the topics that are most often the subject of fact-checkable claims by politicians (e.g. jobs, immigration) and "farm out" to academics the fact checks outside this remit.
- Relevance to the news cycle. A journalist tends to be more attuned to what readers are interested in - or should be, in any case. Journalists can also more easily spot collaborations like the one The Conversation has established with the TV show "Q&A," which has made the fact-checking more relevant and entertaining. Ultimately this means the fact checks will reach a wider audience.
Where does this leave us in determining the best structure for a fact-checking organization? It is probably too early to tell: The Conversation is currently the only outlet publishing fact checks with this approach - and The Conversation itself does a lot of things beyond fact-checking.
It is nonetheless an experiment to monitor as it prepares to deploy in the United States, too. I asked Emily Costello, politics and society editor for The Conversation US, whether she intended to adapt the model at all to a more crowded fact-checking scenario than the Australian one.
She doesn't expect to: "No changes to the model. The model is genius."