Study: PolitiFact finds Republicans ‘less trustworthy than Democrats’

Center for Media and Public Affairs

George Mason University’s Center for Media and Public Affairs studied 100 PolitiFact fact-checks during President Obama’s second term. PolitiFact “rated Republican claims as false three times as often as Democratic claims,” a CMPA press release says.

PolitiFact rated 32% of Republican claims as “false” or “pants on fire,” compared to 11% of Democratic claims – a 3 to 1 margin. Conversely, PolitiFact rated 22% of Democratic claims as “entirely true” compared to 11% of Republican claims – a 2 to 1 margin.

A majority of Democratic statements (54%) were rated as mostly or entirely true, compared to only 18% of Republican statements. Conversely, a majority of Republican statements (52%) were rated as mostly or entirely false, compared to only 24% of Democratic statements.

“PolitiFact rates the factual accuracy of specific claims; we do not seek to measure which party tells more falsehoods,” PolitiFact Editor Bill Adair wrote in an email to Poynter. “The authors of this press release seem to have counted up a small number of our Truth-O-Meter ratings over a few months, and then drew their own conclusions.” Adair, who is leaving the organization to become the Knight Professor of the Practice of Journalism and Public Policy at Duke University, added:

We’ve rated more than 7,000 statements since we started in 2007. We are journalists, not social scientists. We select statements to fact-check based on our news judgment — whether a statement is timely, provocative, whether it’s been repeated and whether readers would wonder if it is true.


I’ve asked CMPA for a copy of the study. Last fall, CMPA said PolitiFact rated “statements by Mitt Romney and other Republicans as false twice as often as statements by President Obama and other Democrats.”

Poynter owns the Tampa Bay Times, which owns PolitiFact. PolitiFact has proven to be a reliable punching bag for critics on the political left and right. Rachel Maddow has repeatedly lobbed criticism at the fact-checking organization. Last summer, Virginia GOP strategists said the site’s Virginia operation “has ruled disproportionately against Republicans and in favor of Democrats.” PolitiFact Virginia Editor Warren Fiske wrote in response that Republicans’ dominance of Virginia politics would explain any disparity.

Eric Deggans wrote earlier this month that PolitiFact’s trademark “Truth-O-Meter can provide an easy source of criticism,” while the site’s method of showing how it reached a conclusion “lets the reader make his or her own decision.”

Previously: GOP strategists review PolitiFact findings, say they ‘back up our argument about media bias’ | Adair: “we’re early in the factchecking revolution” (CJR)


  • http://www.storyworldwide.com/ Kirk Cheyfitz

    Randy, I appear to share your ideological bias about who’s lying about more important stuff. But the selection bias (as Bryan has reminded me to call it) of PolitiFact may simply reflect our bias in judging that right-wing lies are more “newsworthy” or, as you say, more important than left-wing lies. I am personally convinced that an unbiased study looking at all left- and right-wing factual assertions would show that the right is, in fact, less truthful than the left. But that’s not this study.

  • Bryan

    How many pseudonyms do you have, Karen?

  • http://twitter.com/SherlockianOne Candace Drimmer

    Maybe Rush can get more Viagra commercials to perk him up?

  • http://www.storyworldwide.com/ Kirk Cheyfitz

    Thanks for the terminology. I wasn’t concerned, actually, so much with what questions the researchers were “trying to answer” as with what questions they could possibly answer based on the data available. I agree that Poynter’s headline appears to be unfortunate.

  • http://twitter.com/ZebraFactCheck Bryan W. White

    Peter Roff (U.S. News & World Report) understands what the George Mason study’s about.

    Read it.

    http://www.usnews.com/opinion/blogs/peter-roff/2013/05/28/study-finds-fact-checkers-biased-against-republicans

  • http://twitter.com/ZebraFactCheck Bryan W. White

    Kirk,

    You’re sort of on the right track, but we should be careful about jumping to conclusions about what question the researchers tried to answer.

    Adair makes a good point about selection bias (the name for what you’re talking about). But on other occasions Adair doesn’t appear to have a clue about selection bias. Take PolitiFact’s candidate “report cards,” for example. Does PolitiFact offer warnings about drawing conclusions about candidates based on those biased samples?

    It’s more likely the researchers are looking for clues about the impact of selection bias on PolitiFact’s findings by party. That makes it hilarious when Poynter publishes a headline like the one above, which gently nudges readers to draw conclusions about the supposedly dishonest Republicans instead of about PolitiFact’s selection bias.

    The press release makes it clear that it finds the results somewhat counterintuitive with the Obama administration embroiled in allegations of giving inaccurate statements to Congress and the like. Poynter buries that angle, and the Huffington Post followed suit not long ago.

  • http://www.storyworldwide.com/ Kirk Cheyfitz

    Bill Adair makes a very important point that may not have been explored sufficiently in this post: The George Mason study seems to conflate two very different kinds of inquiries. On one hand, you could set out to rate every assertion of fact made by persons or groups in order to reach a conclusion about their relative truthfulness. On the other hand, you could set out to select certain kinds of assertions of fact (for example, those that you think are newsworthy) by persons or groups in order to determine solely whether the selected assertions are true or false. In the latter case, it would be impossible to draw any conclusions comparing the truthfulness of the persons or groups being studied.

    So George Mason, in effect, has done the impossible: they have imputed conclusions about something (relative truthfulness) that PolitiFact says it has no data about.

    I hope that made sense. It’s a huge distinction in my mind.