June 4, 2013

PolitiFact Editor Bill Adair was careful to call a study that claimed his shop “rates Republicans as less trustworthy than Democrats” a “press release” when I asked him for comment about it last week.

“The authors of this press release seem to have counted up a small number of our Truth-O-Meter ratings over a few months, and then drew their own conclusions,” Adair wrote. (Poynter owns the Tampa Bay Times, which owns PolitiFact.) I asked the spokesperson for George Mason University’s Center for Media and Public Affairs for a copy of the full study, about which I had indeed received a press release. In return, CMPA spokesperson Kathryn Davis sent me the following tables:

[CMPA tables of PolitiFact Truth-O-Meter ratings by party]

CMPA combined “Mostly False,” “False” and “Pants on Fire!” ratings “into a single ‘dishonesty’ rating,” Davis told me.

The press release, she said, “is the study and announcement combined.”

CMPA President Robert Lichter told me in an email that CMPA “does publish more extensive research reports as warranted.”

The short-term studies, which emphasize brevity and timeliness, usually appear first on the CMPA website. Eventually they are combined/collected into larger data sets and subjected to additional data analysis, producing results that are published in scholarly journal articles and books.

Here’s a page with CMPA studies listed.

Dr. James Wright of the University of Central Florida’s sociology department is also the editor-in-chief of the journal Social Science Research. He told me via email that press-release studies are “frowned upon in academic circles.”

Wright said that for this study, he’d like to see “a much longer time series of data.” Looking at CMPA’s tables, he noted that in March “Dems lied more than Republicans.”

Is the overall trend for these 5 months indicative of a long-term pattern? Or is it a temporary aberration? Would other five month series show the same thing? Does the pattern vary by which party controls the White House?

He also said he’d prefer a “larger sample size. 100 cases is not much. 500 would be better. How were these 100 cases chosen?”

I attempted to duplicate CMPA’s results myself: I counted all of PolitiFact’s national rulings between Jan. 20 and May 22, 2013, arriving at 113, 17 of which clearly or arguably did not concern statements by politicians. Three were by National Rifle Association Executive Vice President and CEO Wayne LaPierre.

In another email, Lichter said the study “did include assertions from groups aligned with one of the parties, which are picked up by party supporters in current policy debates. In this study period, which of course featured a debate over gun control legislation, that means statements by…Wayne LaPierre were included.”

CMPA says it will next turn its attention to “the Washington Post Fact-Checker’s ratings.” That feature is written by Glenn Kessler, whose work CMPA has previously compared to PolitiFact’s. “My basic principle is that politicians in both political parties will stretch the truth if they think it gives them an edge,” Kessler said in an emailed statement.

I always strive to be fair-minded and nonpartisan in evaluating claims, and aim to be consistent in applying the Pinocchio ratings. Depending on the time period studied, the results may favor one party or another, simply because I decide what claims to study based on newsworthiness and overall reader interest.

Related: Why Fact-Checkers Find More GOP Lies (The Atlantic Wire) | Is Politifact Biased Against Republicans? (The Monkey Cage) | Who’s Checking the Fact Checkers? (U.S. News & World Report)
