March 2, 2016

“You can be 100 percent factually correct and still be almost useless to your readers.”

Gary Schwitzer’s second sentence in our phone conversation was not exactly a ringing endorsement for fact-checking.

In reality, it points to a challenge fact-checkers face daily. Claims made by elected officials or media figures can be accurate and still mislead through omission or a lack of context. These claims are sometimes tougher to debunk in a way that is clear to readers, but no less important than ones where the data is plainly wrong.

Schwitzer is the publisher of HealthNewsReview.org and an adjunct associate professor at the University of Minnesota School of Public Health. The site rates stories on the health beat at top media outlets in America and, as of last year, press releases too. It first launched a decade ago and was restarted with new funding in 2015 after a short hiatus. Similar websites have operated in Australia, Canada, Germany and Japan, among other countries.

Stories are rated against 10 criteria, including the extent to which the story flagged the costs and risks of the intervention and the quality of the evidence.

Developments in health are more likely than those in other fields to inspire huge hope or disproportionate dread. When your life or the lives of your loved ones may be on the line, you are more likely to take a leap of faith and believe a poorly sourced claim about the benefits of a new drug or the risks of a little-known ailment.

(The domain of “wellness”, often conflated with health in media coverage, is also ripe for wishful thinking and sloppy researching, as last year’s chocolate hoax painfully showed.)

Over the years, HealthNewsReview.org has claimed some victories in correcting the record.

“Gary is the real deal,” Mary Shedden, news director at WUSF and former editor of Health News Florida, told me on the phone. The site is a particularly useful resource for reporters new to the field trying to learn the ropes, she thinks.

Health beat reporters I spoke to at The Philadelphia Inquirer told me they do consult the site occasionally and have been more conscious of including details about the costs of procedures in their articles because of it.

Still, fact-checkers can't usually rely on a contact book filled with health experts, let alone hope to develop the expertise in-house. I asked Schwitzer what some red flags could be for fact-checkers who want to verify a claim about health made by a politician or public figure.

As with other articles in this series of tips for fact-checkers, this list is not meant to be exhaustive, but a starting point.

1. Of mice and men
A lot of stories about new drugs neglect to point out that tests were conducted only on mice. And so a lot of claims that repeat these stories may not bring it up. Finding an effect in mice doesn't mean the same effect will be seen in humans, and it may be years before those tests are conducted. Schwitzer thinks a mere passing reference to the lack of human testing is bad practice: This is a crucial factor and should be stressed with the same prominence as the study's results.

2. Watch out for conflicts of interest
This is true of expertise across all fields. One partial solution is The Conversation's approach of a prominent and detailed disclosure of authors' financial ties. Schwitzer warns, however, that "conflicts of interest are around every corner" in health care. A useful place to start for unattached medical experts is HealthNewsReview.org's list of industry-independent experts, which is due for an update in the near future.

3. Don’t be tempted by the press release and read beyond the abstract
As HealthNewsReview.org's dedicated section shows, press releases can be very misleading — even when they originate from highly reputable academic institutions. While you might expect a press release about a potato-industry-supported study on the health benefits of potatoes to score poorly, perhaps you would be readier to trust one from Oxford University. Yet the site rated the latter just as poorly as the former. As Schwitzer wrote recently, "at each step in the food chain of the dissemination of research results to the public, different actors in that food chain [allow] the message to be framed in a more positive light than what the evidence would support." If you're fact-checking a claim, go as far up the food chain as possible: Skip the press release and read much more than just the abstract to get the full picture.

4. Beware causal language about observational studies
This too will be familiar to fact-checkers from work in other fields. An association of two phenomena does not imply any form of causation, as Tyler Vigen conclusively and hilariously shows in his blog. Schwitzer notes several stories with this flaw were published between 2010 and 2013 on the effect of coffee, ranging from “coffee can kill you” to “coffee fights skin cancer.”

5. Learn the difference between relative and absolute risk reduction
A 50 percent relative reduction in the risk of contracting a certain disease sounds pretty impressive. But if that reduction means that patients in the control group had a 2 in 100 chance of contracting the malady while patients being studied had a 1 in 100 likelihood, the absolute risk reduction is 1 percentage point. That is an improvement, but it is less impressive than the first figure alone and may make it less reasonable to accept the side effects of a treatment. (More on absolute vs. relative risk reduction here).
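For readers who want to check such claims themselves, the distinction above can be sketched in a few lines of Python. The trial numbers are the hypothetical ones from the example (a 2 in 100 risk in the control group, 1 in 100 in the treated group), not data from any real study:

```python
def relative_risk_reduction(control_risk, treated_risk):
    """Reduction expressed as a fraction of the control group's risk."""
    return (control_risk - treated_risk) / control_risk

def absolute_risk_reduction(control_risk, treated_risk):
    """Reduction expressed in percentage points of overall risk."""
    return control_risk - treated_risk

control = 2 / 100  # 2 in 100 control patients contract the disease
treated = 1 / 100  # 1 in 100 treated patients do

print(f"Relative risk reduction: {relative_risk_reduction(control, treated):.0%}")  # 50%
print(f"Absolute risk reduction: {absolute_risk_reduction(control, treated):.0%}")  # 1%
```

The same halving of risk reads as "50 percent" or "1 percentage point" depending on which denominator the claim quietly uses, which is exactly why the two figures should be reported together.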

Alexios Mantzarlis joined Poynter to lead the International Fact-Checking Network in September of 2015. In this capacity he writes about and advocates for fact-checking. He…