August 10, 2018

It’s a critique that PolitiFact has long been accustomed to hearing.

“PolitiFact is engaging in a great deal of selection bias,” The Weekly Standard wrote in 2011. “'Fact Checkers' Overwhelmingly Target Right-Wing Pols and Pundits” reads an April 2017 headline from NewsBusters, a site whose goal is to expose and combat “liberal media bias.” There’s even an entire blog dedicated to showing the ways in which PolitiFact is biased.

The fact-checking project, which Poynter owns, has rebuffed those accusations, pointing to its transparent methodology and funding (as well as its membership in the International Fact-Checking Network) as proof that it doesn’t have a political persuasion. And now, PolitiFact has an academic study to back it up.

The report, published by researchers at the University of Washington and Carnegie Mellon University in March, analyzes language that PolitiFact has published about Democrats and Republicans to see how the organization has framed speakers from both parties. It also includes an external analysis that compares how PolitiFact frames issues to how the media at large does.

The report, which PolitiFact first sent to Poynter, used automated text analysis on approximately 10,000 articles dating back to 2007, with the bulk of them published between 2010 and 2017. Divided by party, there are about 1.4 times as many articles about Republicans as Democrats. Former President Barack Obama was the most-covered speaker in the sample, with about 600 articles.

Across PolitiFact’s six-point scale, which ranges from “True” to “Pants on Fire!,” the ratings are fairly evenly distributed. Divided by party, the fact-checking project typically assigns more true ratings to Democrats and more false ratings to Republicans, once the difference in the number of articles published about each group is taken into account.
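The report doesn’t publish the tallying behind that adjustment, but the idea is straightforward: compare rating proportions within each party rather than raw counts, so the larger number of articles about Republicans doesn’t skew the picture. A minimal sketch of that kind of normalization, using a hypothetical handful of rows rather than the study’s actual data, might look like this:

```python
# Sketch of the per-party adjustment described above, on hypothetical data.
# Comparing proportions within each party (rather than raw counts) keeps the
# larger number of Republican articles from dominating the comparison.
import pandas as pd

checks = pd.DataFrame({
    "party":  ["R", "R", "R", "R", "D", "D", "D"],
    "rating": ["False", "Mostly False", "True", "Pants on Fire!",
               "Mostly True", "True", "Half True"],
})

# Raw counts: skewed by how many articles each party gets.
print(pd.crosstab(checks["party"], checks["rating"]))

# Row-normalized: each party's distribution over the ratings sums to 1.
print(pd.crosstab(checks["party"], checks["rating"], normalize="index").round(2))
```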

For specific politicians, the most commonly used rating for Obama is “Mostly True,” while Donald Trump’s is “False,” according to the report. However, when broken up further by the top subject tags on each article, PolitiFact’s coverage appears to be relatively balanced.

PolitiFact’s top three most-covered topics are the economy, healthcare and taxes — all of which are about evenly split between Republican and Democratic speakers. And the outliers, such as immigration and guns, could have more to do with how much each party talks about those issues rather than how PolitiFact frames them, according to the report.

“The within-subject split is similar to the overall division between parties,” it reads. “As such, we don’t expect that any differences in language we find between the two parties will be strongly based in the content of the articles.”

So the researchers then broke each article down further, identifying which words best predicted whether a story was about a Republican or a Democrat. To do that, they used statistical techniques and natural language processing models, first removing quotations and any other text the fact-checker didn’t write.
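The report doesn’t spell out the exact model, so the following is only a sketch of one common way to surface party-predictive words: fit a bag-of-words logistic regression on the article text (with quoted material already stripped) and read off the largest coefficients in each direction. The texts, labels and stop words here are placeholders, not the study’s data or settings.

```python
# Sketch only: one standard bag-of-words approach to finding the words most
# predictive of each party, not necessarily the study's exact pipeline.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression

# Placeholder articles; in the study these would be ~10,000 fact checks with
# quotations and other non-staff text already removed.
texts = [
    "the senator said the program helped workers, we think the claim holds up",
    "according to the congressman, speaking on Fox, taxes doubled last year",
]
labels = ["D", "R"]  # party of the speaker each article is about

# Obvious giveaway terms ("republican", "democrat") can be added to the stop
# list, as the researchers did in their second pass.
vectorizer = CountVectorizer(stop_words=["republican", "democrat"])
X = vectorizer.fit_transform(texts)

model = LogisticRegression().fit(X, labels)

# For a binary model, large negative coefficients push toward the first class
# ("D") and large positive ones toward the second ("R").
ranked = sorted(zip(vectorizer.get_feature_names_out(), model.coef_[0]),
                key=lambda pair: pair[1])
print("most Democratic-leaning words:", ranked[:5])
print("most Republican-leaning words:", ranked[-5:])
```

With two placeholder documents this is trivial, but the same recipe run over thousands of articles yields ranked word lists of the kind the researchers describe.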

Given the report’s prior analyses, what they found was fairly unsurprising — articles about Republicans were associated with words like “Republican” and “false,” while articles about Democrats were associated with “Democrat” and “true.”

When the researchers removed any words obviously associated with either party, they found that stories with Democratic speakers were commonly predicted by words like “helped” and “think,” while stories about Republicans were predicted by “Fox” and “according.”

So what does that say about PolitiFact’s alleged bias? Not much, according to the researchers.

“This part of our analysis finds no obvious differences in the language that is used to describe individuals of each party in a way that shows any indication of bias or differential treatment,” they wrote.

The researchers’ external analysis reached no different conclusion. They ran a sentiment classification based on predefined lists of positive and negative terms and found that PolitiFact uses more of both negative and positive words in articles about Republicans, which also tended to be longer on average.
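The study’s lexicon isn’t reproduced in the report, but a lexicon-based count of this kind is easy to sketch, here with hypothetical word lists standing in for whatever the researchers actually used:

```python
# Sketch of a lexicon-based sentiment count: tally matches against predefined
# positive and negative word lists. The lists below are hypothetical stand-ins.
import re

POSITIVE = {"accurate", "correct", "true", "supported"}
NEGATIVE = {"false", "wrong", "misleading", "exaggerated"}

def sentiment_counts(text: str) -> dict:
    tokens = re.findall(r"[a-z']+", text.lower())
    return {
        "positive": sum(t in POSITIVE for t in tokens),
        "negative": sum(t in NEGATIVE for t in tokens),
        "tokens": len(tokens),  # longer articles accumulate more of both
    }

print(sentiment_counts("The claim is false and misleading, not accurate."))
# -> {'positive': 1, 'negative': 2, 'tokens': 8}
```

A raw count like this naturally rises with article length, which fits the researchers’ observation that the longer Republican-focused articles contained more of both kinds of words.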

“From this simple analysis, we conclude that, if there is a difference in sentiment conveyed between PolitiFact analysis of Democrats vs. Republicans, it is expressed subtly and/or in domain-specific ways,” the report reads.

There are some limitations. Because of the imbalance between articles about Republicans and Democrats, the text classifier could predict a speaker’s party from an article’s words with only 58 percent accuracy. The researchers also did not run a more in-depth sentiment analysis on PolitiFact’s content, partly due to the high cost and partly because human annotators might respond subjectively to fact-check ratings.

But PolitiFact feels good about what the report says.

“It certainly reinforces how we try to act and carry ourselves. So of course we're pleased to see no red flags,” said executive director Aaron Sharockman in a message. “But as a fact-checker who has scrutinized plenty of research, we also shouldn't give this study more weight than it’s worth. This is one look at the language and words we use to write our fact checks.”

“We can't stop thinking of ways to improve our credibility for all audiences.”

The report is part of a $50,000 grant that the Knight Foundation awarded to PolitiFact last June to improve trust in fact-checking and reach out to skeptical audiences. Last fall, members of the fact-checking organization traveled to three states dominated by conservative voters and held forums, one-on-one meetings and community events to reach out and answer questions. PolitiFact also hired two former politicians (one from each party) to critique its work.

Those efforts were partly inspired by a Rasmussen Reports survey from September, which found that 88 percent of Trump supporters do not trust media fact-checking. And PolitiFact’s language analysis seems like a good rebuttal to those findings.

"Our analyses were not able to detect any systematic differences in the treatment of Democrats and Republicans in articles by PolitiFact,” reads the study’s conclusion.

But in a follow-up email to Poynter, Noah Smith, one of the report’s co-authors, added a caveat to the findings.

“This could be because there's really nothing to find, or because our tools aren't powerful enough to find what's there,” he said.

Daniel Funke is a staff writer covering online misinformation for PolitiFact. He previously reported for Poynter as a fact-checking reporter and a Google News Lab…
