Researchers find politicians may fear fact-checkers

In the months before the 2012 election, state legislators in nine states received letters from two political scientists.

“We are writing to let you know about an important research project,” the letters began.

It wasn’t just a letter letting them know about the project — the letters were a core piece of the research, as were the politicians themselves.

Some of the letters informed legislators that PolitiFact had set up shop in their state, and that the researchers were conducting work related to “how elected officials in your state are responding to the presence of this fact-checking organization during this campaign season.” The letters also told them that “Politicians who lie put their reputations and careers at risk, but only when those lies are exposed.”

Other state legislators received a less detailed, cautionary letter; it merely informed them about a study of “the accuracy of the political statements made by legislators in” their state.

The correspondence was part of a field experiment published today as a working paper, and as a report for the New America Foundation called “The Effects of Fact-Checking Threat: Results from a field experiment in the states.”

Both are the work of political scientists Brendan Nyhan of Dartmouth College and Jason Reifler of the University of Exeter. (Nyhan also writes for Columbia Journalism Review.)

I’ve previously written (1,2) about the pair’s research into the effects of fact-checking, misinformation and corrections in politics and public perception.

This latest work offers an interesting look at whether politicians will be less likely to engage in false and/or misleading communications when made aware of the presence of a local PolitiFact affiliate, and the reputational fallout of lying. (PolitiFact is owned by the Tampa Bay Times, which Poynter owns.)

Based on their findings, Nyhan and Reifler say fact-checking could be an effective way to improve the accuracy of political communications. To put it bluntly, fact-checking — or the threat of it — appears to be an effective way to put the fear of lying into politicians.

“We found that legislators who were sent reminders that they are vulnerable to fact-checking were less likely to receive a negative PolitiFact rating or have the accuracy of their statements questioned publicly than legislators who were not sent reminders,” they write in the New America Foundation report. “These results suggest that the electoral and reputational threat posed by fact-checking can affect the behavior of elected officials. In this way, fact-checking could play an important role in improving political discourse and strengthening democratic accountability.”

So, if you warn a politician that professional fact-checkers are operating in their area, and that lies could damage their campaign or career, they are less likely to fib.

Put another way, alerting politicians to the presence of a fact-check operation, and its potential impact on them, could help raise the bar on truth.

It’s PolitiFact as a deterrent.

“Our goal at PolitiFact is to inform people, not to change the politicians’ behavior,” PolitiFact Editor Angie Drobnic Holan wrote in an email to Poynter. “Having said that, we have noticed that politicians are often more careful in their statements if they think they’re going to be fact-checked. I think it’s human nature to be more responsible when you know you’ll be held accountable.”

PolitiFact, Holan wrote, was “not aware of the study when it was going on. We’re happy to see our work having an impact, and we support more local fact-checking. It makes for great accountability journalism, and readers love it.”

Putting the fear into pols

Nyhan and Reifler write that their experiment with sending letters fits with previous research that found “elected officials tend to be risk-averse and concerned about threats to re-election, including critical media coverage.” So the warning letters play into an established fear for politicians.

For their experiment, they divided just under 1,200 state politicians in nine states into three mostly equal groups:

  1. A “treatment group” that received three different warning-type letters prior to the election. (See a sample below.)
  2. A placebo group that received “a series of letters during the same period noting that we were conducting a study evaluating the accuracy of politicians’ statements that excluded any language about fact-checking or the consequences of inaccurate statements.” (See a sample below as well).
  3. A control group that received no direct communication but was tracked by the researchers.

In the end, 23 of the tracked legislators received ratings from state PolitiFact operations during the relevant period. The researchers also searched the LexisNexis news database for articles that addressed the accuracy of any claims made by the politicians. The goal was to see if the treatment group was less likely to make questionable statements during the campaign.

Even though state legislators were, overall, not a frequent target of state PolitiFacts — national politicians get far more attention — Nyhan and Reifler conclude that their warning letters had a measurable effect:

Thirteen legislators who were not sent reminder letters about fact-checking received a negative rating from PolitiFact (1.7%) compared with only three legislators in the treatment group (0.8%). This difference represents a 55% reduction in the relative risk of receiving a negative PolitiFact rating. Similarly, eight legislators who were not assigned to the treatment group had the accuracy of their statements questioned publicly in content indexed in LexisNexis (1.0%) compared with one in the treatment group (0.3%) — a relative risk reduction of 75%.
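The relative risk reduction figures above follow from simple arithmetic: one minus the ratio of the treatment group's rate to the comparison group's rate. A minimal sketch, using the counts quoted above and assumed denominators of roughly 400 and 800 (the article says just under 1,200 legislators were split into three roughly equal groups; the exact group sizes are in the working paper):

```python
def rrr(treated_events, treated_n, control_events, control_n):
    """Relative risk reduction: 1 - (treatment rate / comparison rate)."""
    treated_rate = treated_events / treated_n
    control_rate = control_events / control_n
    return 1 - treated_rate / control_rate

# Assumed denominators: ~400 in the treatment group, ~800 in the
# combined comparison groups (not stated exactly in the article).
TREATMENT_N, COMPARISON_N = 400, 800

# Negative PolitiFact ratings: 3 in treatment vs. 13 elsewhere
politifact = rrr(3, TREATMENT_N, 13, COMPARISON_N)
# Accuracy questioned publicly (LexisNexis): 1 vs. 8
lexisnexis = rrr(1, TREATMENT_N, 8, COMPARISON_N)

print(f"PolitiFact RRR: {politifact:.0%}")   # close to the reported 55%
print(f"LexisNexis RRR: {lexisnexis:.0%}")   # matches the reported 75%
```

With these approximate denominators the first figure comes out near 54 percent rather than exactly 55; the paper's precise group sizes account for the difference.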

I asked Nyhan about the relatively small sample size of the PolitiFact ratings, and whether it was large enough to support their conclusions.

“We think it’s a feature of this research that we found an effect despite fact-checking being so rare,” he told me by email. “We believe that the real-world prevalence of misleading statements is much higher. If our relative risk reduction estimates are accurate, then the magnitude of the real-world effects should be large.”

They’re confident that the research shows the effectiveness of warning politicians. They also believe this potentially expands the value — and target audience — of an operation like PolitiFact.

“These results suggest that the effects of fact-checking extend beyond providing information to motivated citizens who seek out these websites,” they write. “Given the difficulties of changing the minds of voters or the behavior of candidates at the presidential level, fact-checking other state and federal officials may be a better approach for the movement going forward.”

To make that happen, perhaps PolitiFact’s strategy for entering new markets will be to send letters to all local, state and national politicians to announce their arrival and how they do their work. And to keep sending letters during election time.

Then again, if Nyhan and Reifler’s work is proved out on a large scale, maybe that strategy would mean less work for PolitiFact’s checkers…

Sample letters

Treatment letter example

The Honorable Rodney Ellis
P.O. Box 12068
Capitol Station
Austin, TX 78711

August 20, 2012

Dear Senator Ellis:

We are writing to let you know about an important research project.

As you may know, the national fact-checking organization PolitiFact has created an affiliate in Texas. Our research project examines how elected officials in your state are responding to the presence of this fact-checking organization during this campaign season. PolitiFact examines statements made by politicians and then rates their accuracy and truthfulness on a scale that ranges from “true” to “pants on fire.” (Note: We are independent researchers who are not affiliated with PolitiFact in any way.)

In particular, we are writing to notify you that we are studying how elected officials react to the presence of a PolitiFact affiliate in their state. We have enclosed two recent fact-check articles from PolitiFact Georgia as examples of the type of coverage that you might expect to receive if you make a false or unsupported claim.

Politicians who lie put their reputations and careers at risk, but only when those lies are exposed. That’s why we are especially interested in the consequences of PolitiFact verdicts and other fact-checking efforts in your state. Here are examples of the types of questions we are interested in:

  • Are “false” or “pants on fire” verdicts damaging to the reputation or political support of political candidates?
  • Do election campaigns use “false” or “pants on fire” verdicts in their advertising to attack their opponents?
  • Will state legislators lose their seats as a result of fact-checkers revealing that they made a false statement?

Because the legislature is out of session, we are sending this letter to your capitol and district addresses. Over the next two months, we will send you two additional reminder letters about our project so that you will keep thinking about these issues over the course of the campaign. We will seek to contact legislators in your state to discuss these issues further after the 2012 election.

It is essential for the validity of the study to determine whether this letter has reached you successfully. We have therefore enclosed a postage-paid acknowledgment postcard. We would be extremely grateful if you could sign and return the postcard once you have read this letter.

If you have any questions about this study, please contact Susan M. Adams in the Committee for the Protection of Human Subjects at Dartmouth College (cphs.tasks@dartmouth.edu) or Susan Vogtner in the Georgia State University Office of Research Integrity at 404-413-3513.

Placebo letter example

The Honorable Floyd Prozanski
900 Court St
S-417
Salem, OR 97301

August 20, 2012

Dear Senator Prozanski:

We are writing to let you know about an important research project.

We are studying the accuracy of the political statements made by legislators in Oregon.

Because the legislature is out of session, we are sending this letter to your capitol and district addresses. Over the next two months, we will send you two additional reminder letters about our project. We will seek to contact legislators in your state to discuss these issues further after the 2012 election.

It is essential for the validity of the study to determine whether this letter has reached you successfully. We have therefore enclosed a postage-paid acknowledgment postcard. We would be extremely grateful if you could sign and return the postcard once you have read this letter.

If you have any questions about this study, please contact Susan M. Adams in the Committee for the Protection of Human Subjects at Dartmouth College (cphs.tasks@dartmouth.edu) or Susan Vogtner in the Georgia State University Office of Research Integrity at 404-413-3513.


  • Bryan

    “While the treatment effect falls just short of significance at the p<.05 (one-tailed) for the negative PolitiFact rating, the effect is in the expected direction.”

    Ouch.

  • Bryan

    Another observation: Nyhan and Reifler grouped PolitiFact’s four lowest ratings into one category, calling it a negative rating. It might have been instructive to track whether letter recipients received any rating at all from PolitiFact at a higher rate than those who did not receive a letter.

    The text of the study suggests the researchers didn’t bother to track that stat.

  • Bryan

    Apparently it wasn’t essential to the success of the study for Nyhan’s letter recipients to confirm they had read the letter (contrary to the claim in the letters). The study reports that 21 percent of the experimental group sent a reply, while 34 percent of the placebo group sent a reply.

    Nyhan and Reifler don’t really know whether the letters were read, and so treat whether the letters were sent as the experimental condition.

    Why was the reply rate so much lower when the “PolitiFact” name was invoked?

    Discuss.

  • King-Stanley-Krauter

    Fact checking could become both more effective and more efficient if the remaining newspapers would use a divide and conquer strategy.
    —–
    The largest newspaper in each state should publish an annual review of their state’s senior Senator to Washington and his committee assignments. The second largest newspaper would publish an annual review of their state’s junior Senator and his committee assignments. The largest newspaper in every Congressional district that is not investigating a Senator will publish an annual review of its Congressman and his committee assignments. (The other newspapers can investigate a local state Senator.) With this division of labor, there will always be a team of newspapers investigating every committee in Congress.
    ——
    More important, there will also be a large enough group of news junkies and swing voters in every state and Congressional district who will pay close attention to their newspaper’s fact checking. This coordination of reporters and independent voters could overcome the reasons why newspapers are failing to communicate. For example, the pre-crisis journalism on the housing bubble and subprime mortgages was ignored by politicians and regulators because it was forgotten by voters as white noise. And there have been many news reports on our tax laws since the 1986 reforms but nothing was done to stop Congress from creating a new tax deduction for every lobbyist with a campaign contribution.
    ——
    It would be easy to make these reviews both profitable and effective but reporters are not interested in communicating more effectively if it means losing their freedom to do what they want to do whenever they want to do it. Reporters have gotten so used to thinking of themselves as the people in the white hat that they have forgotten that white is the color of toilet paper.