Researchers have 3 tips to help journalists debunk misinformation

Having the truth on your side is necessary when trying to debunk misinformation.

But it’s far from enough.

The truth alone does not change minds or create belief. Convincing people of your argument, or correcting someone else’s lies, requires more than unearthing the truth and reciting the facts.

So what’s a journalist to do?

Brendan Nyhan and Jason Reifler continue to produce work aimed at identifying the best and most effective ways to combat political misinformation. (I also recently wrote about their research into whether politicians fear fact checkers.)

Nyhan, a professor at Dartmouth, and Reifler, a lecturer at the University of Exeter, today added a bit more to their body of debunking work. They published a research paper, “Which Corrections Work,” with the New America Foundation that contains three specific pieces of advice for how journalists can best correct misinformation. That advice is coupled with related experiments they conducted to reinforce the tips.

Their tips for journalists are not new. As Nyhan told me by email, “This project tests three promising approaches that we previously recommended based on past research in our 2012 New America Foundation report.”

I wrote about that report, and you can download and read the report here. They also summarized the nine tips for Columbia Journalism Review.

Overall, journalists need to understand that how you present your facts is a major factor in effective debunking.

“The key challenge in countering misperceptions is to understand the psychology of belief — why people might believe something that is not true and reject or ignore corrective information contradicting that belief,” they write in the paper.

Part of the difficulty with debunking is the natural human tendency to resist correction. We are wired to continue believing the things we think we already know. It’s hard to change someone’s mind, especially if a given piece of (mis)information is in line with their core beliefs.

This results in things such as the Backfire Effect, whereby people who are presented with information that goes against their closely held beliefs actually double down on their views and resist correction. Tell them why they are wrong, and they believe in it even more.

“As humans, we instinctively attempt to explain the events and outcomes we observe,” write Nyhan and Reifler in their paper. “When these inferences are mistaken, they can be very difficult to correct.”

This is why how you phrase your debunking, which sources you choose to help make the case, and how quickly your debunking arrives (the earlier the better) all play a role in determining whether it will be effective at countering a rumor or misinformation. Nyhan and Reifler also conducted previous research that found the visual presentation of information can even have an effect on belief.

The new experiments conducted for this paper utilized an online survey of 1,000 respondents. They’re aimed at providing further evidence of the effectiveness of certain corrective techniques. Here’s a look at each experiment, with the related recommendation for journalists.

Tip 1: Choose your sources carefully

For this experiment, they tested whether self-identified liberal and conservative respondents would view corrective information as more or less credible depending on which source it came from.

For example, would a conservative be more likely to believe a news report that goes against their beliefs if it came from a news outlet and third party they identify as conservative?

In their survey, they did in fact find that “Among conservatives, the combination of a liberal news outlet (MSNBC) and a liberal think tank (CAP) was significantly less persuasive than all other combinations of experts and sources.” (There wasn’t a significant effect for liberal respondents.)

Recommendation: “Journalists should seek out experts who are speaking out against a misperception held by their ideological or partisan allies. Corrections by such sources should be more effective than ones from outlets and experts who may be seen as having ideological or partisan motives.”

Tip 2: Offer a “causal alternative” when countering misinformation

Nyhan and Reifler tested whether people are more likely to believe a correction to misinformation if that correction offers an explanation for why the given event occurred.

For this experiment, they told respondents about the (fictional) resignation of a state senator. Different respondents were given different details about the resignation in order to test whether innuendo about a bribery scandal could be countered with additional information about why he resigned.

Recommendation: “When possible, reporters should not just state that a claim purporting to explain an event or outcome is false. They should instead offer an alternative causal explanation that will help readers understand why the event occurred and hopefully displace the previous, mistaken explanation.”

Tip 3: Be positive

For this experiment, respondents were given background about “Joseph McDade, a former Congressman from Pennsylvania who was tried for bribery but ultimately acquitted by a jury.”

Then they were either told that he was “exonerated” or found “not guilty” to see if the phrasing made a difference in the perception of McDade. In this experiment, they did not find a significant difference in how the two phrases affected respondents’ views of the Congressman.

“Yes, the results for that study were surprising. Our paper on correcting the Obama Muslim myth took a similar approach and also found limited differences between ‘Christian’ and ‘not Muslim.’,” Nyhan told me (more on that paper from me here). “The negations research that we are drawing on from psychology tested fictional, non-political stimuli; it’s possible that the effects of these sorts of phrasing differences in a political context are smaller than we anticipated.”

Still, the researchers believe that choosing positive assertions is a best practice.

Recommendation: “Stating a correction in the form of a negation may reinforce the misperception in question. Using language that affirms the correct fact is a safer approach.” 

Bottom line: Facts are not enough

When you put this advice together, it begins to form the building blocks for creating convincing counter-narratives to misinformation.

If you want someone to believe something, don’t shower them with facts and stats; craft that supporting material into a story, a counter-narrative that makes sense as a story and makes it easy for the person to remember your main point. Seek out sources who help push that narrative but won’t harm it with partisan baggage.

So, yes, do your job to dig up the correct facts and discover the truth. But know that that’s only one part of the debunking process — you also have to work hard to present this information in a way that will affect people.

Finally, if you want examples of what good (and bad) debunking articles look like, Nyhan previously outlined some options, with background, at Columbia Journalism Review.

Nyhan will teach a course on Poynter’s NewsU Nov. 21 called “How to Keep Misinformation from Spreading.”

Related training: Engaging News Audiences for Commercial and Democratic Gains


