As we learned in journalism school, it’s important to use only the best possible sources, and studies rank high among them.
So it’s no surprise that politicians, social media influencers and anyone else with something to sell have long relied upon “proven by studies” to convince their audiences.
The first problem with this is that, as an article published this month in Nature by an alliance of 25 researchers declares, “Not all evidence is created equal.” (Full disclosure: the author of this article was one of the co-authors of the report.)
The second is that too few news journalists and pundits (and perhaps even fact-checkers) fully understand how to synthesize and use “evidence,” let alone report on it.
Starter questions that journalists often fail to ask include the source of the claims, the sample size, the methodology and the funding.
In March, one of the most senior officers in the UK’s Metropolitan Police urged politicians to approve an increase in so-called “stop and search” powers, telling MPs: “We know that stop and search is an effective measure in trying to increase safety, reduce the likelihood that young people will carry knives.”
Rather than check this claim, the Daily Mail newspaper reported it verbatim. However, the UK’s leading fact-checking organization, Full Fact, found otherwise by using research based on 10 years of Metropolitan Police data.
“Higher rates of stop and search (under any power) were associated with very slightly lower than expected rates of crime in the following week or month,” the College of Policing research report found. “The inconsistent nature and weakness of these associations, however, provide only limited evidence of stop and search having acted as a deterrent at a borough level.” (Another full disclosure here: Nerys Thomas of the College of Policing is one of the co-authors of the report published in Nature.)
Research published in 2016 by the Home Office, which oversees policing, found that stop and search carried out under so-called section 60 powers (stops without reasonable suspicion) had no effect on reducing knife crime.
Earlier this month, new UK Prime Minister Boris Johnson announced an expansion in the use of the controversial powers.
This is, of course, just one among many examples of media failing to ask basic questions of those making claims. As this 2006 report into years of credulous UK media reporting of false claims by disgraced anti-vaxxer Andrew Wakefield showed, such failures can have terrible real-world consequences.
Reporting single studies causes flip-flopping headlines – and a lack of trust in findings
As our article in Nature says, scientists understand that the most reliable evidence comes from what are sometimes called “meta-studies” — comprehensive studies that bring together the evidence from all relevant studies.
Yet, as anyone who shares my taste for chocolate knows, we in the media often prefer to report single studies, one at a time, resulting in flip-flopping headlines such as 7 reasons why chocolate is bad for you (Ask Health News), followed soon after by 10 convincing health reasons you should eat more chocolate (Daily Telegraph).
This flip-flopping naturally leads the public to start distrusting findings, even when the consensus is that the key evidence is reliable.
New guidelines for assessing evidence – that journalists (fact-checkers included) could follow
Andy Oxman, research director at the Norway-based Centre for Informed Health Choices and lead author of the report in Nature, brought together experts from a wide range of fields to review “a set of principles for assessing the trustworthiness of claims about what works, and for making informed choices.”
As Andy says: “If you want to inform your audience and avoid misinforming them, think slowly when you report claims about effects. Help your audience to understand how trustworthy the claims are and the size of the purported effects. Understanding and applying these key concepts can help you to do this.”
The principles are not a fixed checklist, but they pose important questions all journalists should ask before choosing to report any claim about effects. Not all evidence is created equal, after all.