In a report published on Poynter, student researchers at the Duke Reporters' Lab reviewed the work of 37 regional media outlets that fact-checked political claims during the election cycle that ended in November 2016. The most surprising finding was the significant differences in how those news organizations presented and organized their fact checks. At least 21 of the states the Reporters' Lab examined had a plentiful supply of homegrown, multimedia fact-checking produced by local news organizations, which scrutinized more than 1,800 claims by candidates, policymakers and other influential voices in the political process. But some state and local fact-checkers did not create even a basic landing page to collect all of their reporting in one place. And those that did build such pages missed other opportunities to make the most of their fact checks' unusually long shelf life.
Drawing upon a Twitter dataset from the 2012 United States presidential election, this study examines the motives partisan social media users have for sharing fact checks. Researchers analyzed messages and comments related to fact checks posted on the Twitter accounts of PolitiFact, Factcheck.org and The Washington Post Fact Checker in October 2012, when several debates took place. Specifically, they looked at 93,578 comments from 55,869 unique Twitter users on 194 original fact checks, coded how each party would perceive the ratings, and used a computational analysis of users' past tweets to evaluate their political leanings. The study found that fact checks favorable to the ingroup party were shared more by ingroup members than by outgroup members. In short, people shared the fact checks that played into their partisan biases.
Researchers surveyed French individuals online in four regions where the far-right Front National party (FN) had done best in the 2016 regional elections. Respondents were put into one of four groups: the first received false claims on immigration made by Marine Le Pen, the FN's presidential candidate, and the second received official statistics on the same issues. The other two groups were given both or neither, respectively. Across all groups, the researchers tested respondents' understanding of the facts, their support for Le Pen on immigration and their voting intentions. Overall, knowledge of the facts declined when respondents read only Le Pen's claims but improved when they were offered the facts alone or the facts together with Le Pen's claims. More surprisingly, the intention to vote for Le Pen rose not just among respondents exposed to her claims but also among respondents who were offered the facts alone.