Poynter Results

  • On Twitter, stories rated 'false' spread faster and wider than those rated 'true'

In “The spread of true and false news online,” Soroush Vosoughi, Deb Roy and Sinan Aral, all at the Massachusetts Institute of Technology, studied a huge sample of tweets about fact-checked claims published over the course of more than a decade. They found that stories rated “False” spread faster and wider than those rated “True.” The researchers shied away from making their own determinations of the veracity of online content, leaning instead on the findings of six fact-checking and debunking websites (some well-known, others less so). Stories that no one has fact-checked — presumably because their truthfulness is not up for speculation — were not part of the study’s sample.

    Study Title
    The spread of true and false news online
    Study Publication Date
    Study Authors
    Soroush Vosoughi, Deb Roy, Sinan Aral
    Journal
    Science
    Peer Reviewed
    Yes
    Sample
    Representative
    Inferential approach
    Experimental
    Number of studies citing
    1
  • Fact-checking in Europe is growing, diverse and fragile

    The report found 34 permanent sources of political fact-checking active in 20 different European countries. Through interviews with the founders and leaders of these projects, the Reuters Institute for the Study of Journalism dissects the main challenges that fact-checkers face in Europe. Most of the projects are based in NGOs, not traditional media, the report finds — and funding is a key concern for most.

    Study Title
    The Rise of Fact-Checking Sites in Europe
    Study Publication Date
    Study Authors
    Lucas Graves, Federica Cherubini
    Journal
    Reuters Institute for the Study of Journalism
    Peer Reviewed
    No
  • Video or text? Evaluating the relative efficacy of different formats for fact-checking

    The research team, which included Factcheck.org co-founder Kathleen Hall Jamieson, presented a sample of 525 online respondents with a deceptive claim on the Keystone XL pipeline included in a political flyer. Respondents were then presented with either (a) a textual fact check of the claim, (b) a humorous video fact-checking that claim, (c) a non-humorous video fact check, (d) an unrelated humorous video of a baby singing, or (e) nothing at all. Belief in the deceptive claim fell more significantly among participants who viewed either fact-checking video than among those who read the textual fact check.

    Study Title
    Fact-Checking Effectiveness as a Function of Format and Tone: Evaluating FactCheck.org and FlackCheck.org
    Study Publication Date
    Study Authors
    Dannagal G. Young, Kathleen Hall Jamieson, Shannon Poulsen, and Abigail Goldring
    Journal
    Journalism & Mass Communication Quarterly
    Peer Reviewed
    Yes
  • American fact-checking suffers from inconsistent presentation

    In a report published on Poynter, student researchers at the Duke Reporters' Lab reviewed the work of 37 regional media outlets that fact-checked political claims during the election cycle that ended in November 2016. The most surprising finding was the significant differences in the ways those news organizations presented and organized their fact checks. At least 21 states the Reporters' Lab looked at had a plentiful supply of homegrown, multimedia fact-checking produced by local news organizations, with examinations of more than 1,800 claims by candidates, policymakers and other influential voices in the political process. But some state and local fact-checkers did not create the most basic of landing pages to collect all of their reporting in one place. And those that did build those pages missed other opportunities to make the most of their fact checks’ unusually long shelf life.

    Study Title
    Plenty of fact-checking is taking place, but finding it is another issue
    Study Publication Date
    Study Authors
    Mark Stencel, Rebecca Iannucci
    Journal
    n/a
    Peer Reviewed
    No
    Sample
    Representative
    Inferential approach
    Experimental
  • ‘Fact Check This’: How U.S. politics adapts to media scrutiny

    This report goes deep into the main ways American politicians have reacted to (or ignored) major fact-checkers' work, concentrating on the period 2010-2014. The author finds that U.S. politicians use fact checks to validate their arguments or undermine their opponents. Politicians also stand their ground or attack the fact-checkers directly to reduce the checkers' credibility. With examples from across the political spectrum, the study offers a relatively rare focus on the effect of fact checks on their subjects rather than on readers.

    Study Title
    Politicians pre-empt, weaponize and (rarely) accept fact checks
    Study Publication Date
    Study Authors
    Mark Stencel
    Journal
    American Press Institute
    Peer Reviewed
    No
  • Fake news and fact-checking websites both reach about a quarter of the population — but not the same quarter

    The study reviewed web traffic collected with consent from a national sample of 2,525 Americans between Oct. 4 and Nov. 7, 2016. Fake news websites were found to reach a relatively large audience, equivalent to 27.4 percent of the sample, with fact-checking websites close behind at 25.3 percent. These two groups overlap only in part, as 13.3 percent of the sample visited fake news websites but not fact-checking websites. Moreover, none of the users who saw a specific fake news story was then reached by its related fact check. The study also found that Facebook was a key channel for misinformation to spread, likely accounting for about one fifth of traffic to fake news websites.

    Study Title
    Selective Exposure to Misinformation: Evidence from the consumption of fake news during the 2016 U.S. presidential campaign
    Study Publication Date
    Study Authors
    Andrew Guess, Brendan Nyhan, Jason Reifler
    Peer Reviewed
    No
  • Social media sentiment analysis could offer lessons on building trust in fact-checking

    Sentences with the phrase "Factcheck.org is," "Snopes is" or "StopFake is" were collected from Facebook, Twitter and a selection of discussion forums in the six months from October 2014 to March 2015. Because Facebook crawling is limited to pages with more than 3,500 likes or groups with more than 500 members, the sample was stunted. In the end, 395 posts were coded for Snopes, 130 for StopFake and a mere 80 for Factcheck.org. Facebook pages for the two U.S. sites have hundreds of thousands of likes, so the finding that a majority of comments were negative ought to be read in light of the sample limitations. Still, the paper's coding of comments along themes of usefulness, ability, benevolence and integrity — and splitting across positive and negative sentiment — offers a template for future analysis.

    Study Title
    Trust and Distrust in Online Fact-Checking Services
    Study Publication Date
    Study Authors
    Petter Bae Brandtzaeg, Asbjørn Følstad
    Journal
    Communications of the ACM
    Peer Reviewed
    Yes
 