Misinformation

Poynter Results

  • Fake news and fact-checking websites both reach about a quarter of the population — but not the same quarter

    The study reviewed web traffic collected with consent from a national sample of 2,525 Americans between Oct. 4 and Nov. 7, 2016. Fake news websites were found to reach a relatively large audience, equivalent to 27.4 percent of the sample, with fact-checking websites close behind at 25.3 percent. The two groups overlap only in part: 13.3 percent of the sample visited fake news websites but not fact-checking websites. Moreover, none of the users who saw a specific fake news story were then reached by its related fact check. The study also found that Facebook was a key channel for misinformation to spread, likely accounting for about one fifth of traffic to fake news websites.

    Study Title
    Selective Exposure to Misinformation: Evidence from the consumption of fake news during the 2016 U.S. presidential campaign
    Study Publication Date
    Study Authors
    Andrew Guess, Brendan Nyhan, Jason Reifler
    Peer Reviewed
    No
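    The reported percentages also imply the size of the overlapping audience. As a quick back-of-envelope check (this calculation is ours, not the study's):

    ```python
    # Implied audience overlap from the study's reported figures:
    # 27.4% visited fake news sites, 25.3% visited fact-checking sites,
    # and 13.3% visited fake news sites but NOT fact-checking sites.
    fake_news = 27.4      # % of sample visiting fake news websites
    fact_checking = 25.3  # % of sample visiting fact-checking websites
    fake_only = 13.3      # % visiting fake news but not fact-checking sites

    both = fake_news - fake_only      # % visiting both kinds of sites
    fact_only = fact_checking - both  # % visiting only fact-checking sites

    print(f"both: {both:.1f}%")            # 14.1%
    print(f"fact-check only: {fact_only:.1f}%")  # 11.2%
    ```

    In other words, roughly half of the fake news audience never saw a fact-checking site at all.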
  • For debunking viral rumors, turn to Twitter users

    This paper, presented at the International Conference on Asian Digital Libraries, aims to uncover the types of rumors and "counter-rumors" (or debunks) that surfaced on Twitter following the falsely reported death of former Singaporean Prime Minister Lee Kuan Yew. Researchers analyzed 4,321 tweets about Lee's death and found six categories of rumors, four categories of counter-rumors and two categories belonging to neither. With more counter-rumors than rumors, the study's results suggest that Twitter users often attempt to stop the spread of false rumors online.

    Study Title
    An Analysis of Rumor and Counter-Rumor Messages in Social Media
    Study Publication Date
    Study Authors
    Dion Hoe-Lian Goh, Alton Y.K. Chua, Hanyu Shi, Wenju Wei, Haiyan Wang, Ee Peng Lim
    Journal
    Conference paper
    Peer Reviewed
    Yes
  • How likely you are to believe a rumor has nothing to do with your demographics

    The article’s findings are based on survey data collected from two insurgency-affected areas: southern Thailand and Mindanao, Philippines. The more respondents felt in danger, were repeatedly exposed to a rumor and/or saw one that coincided with their preconceived beliefs, the more likely they were to believe it. That goes against widely held notions that psychology is the be-all, end-all when it comes to whether or not someone believes unverified information. The study contains a useful cautionary note for those working to dispel rumors and misinformation around the world.

    Study Title
    Rumor Has It: The Adoption of Unverified Information in Conflict Zones
    Study Publication Date
    Study Authors
    Kelly M. Greenhill, Ben Oppenheim
    Journal
    International Studies Quarterly
    Peer Reviewed
    Yes
  • Social media comments are just as effective at correcting health misinformation as algorithms

    This study measures the extent to which algorithms and comments on Facebook that link to fact checks can effectively correct users' misconceptions about health news. Researchers tested this by exposing 613 survey participants to simulated news feeds under three conditions. Participants were shown misinformation about the Zika virus and different corrective news stories either surfaced by algorithm or posted by another Facebook user. The experimental results found that algorithmic and social distribution of fact checks were equally effective in limiting participants' misperceptions — even for people who are more inclined to believe conspiracy theories. Researchers conclude that this is likely because breaking health news events often deal with new phenomena, which allows for greater receptivity to comments and the possibility of opinion change among news consumers early on.

    Study Title
    See Something, Say Something: Correction of Global Health Misinformation on Social Media
    Study Publication Date
    Study Authors
    Leticia Bode, Emily K. Vraga
    Journal
    Health Communication
    Peer Reviewed
    Yes
  • People are less likely to fact-check when they're around other people

    This study of eight experiments aims to measure how social presence affects the way that people verify information online. It found that, when people think they're being judged by a large group of people online, they're less likely to fact-check claims than when they're alone. Inducing vigilance correlated with an increase in fact-checking among respondents, which could imply that, when they're in a group of people, social media users tend to let their guards down. That finding held across a variety of different conditions, including statements that were politically charged and neutral, simulated forums and social media, as well as small vs. large group sizes.

    Study Title
    Perceived social presence reduces fact-checking
    Study Publication Date
    Study Authors
    Youjung Jun, Rachel Meng, Gita Venkataramani Johar
    Journal
    Proceedings of the National Academy of Sciences
    Peer Reviewed
    Yes
  • Fact-checking corrects misperceptions but doesn't affect votes

    This study looks at the effect of partisanship on the likelihood of accepting a factual correction. In two separate studies, four true and four false claims by Donald Trump were presented to a sample of Democrats, non-Trump-supporting Republicans and Trump-supporting Republicans. The researchers found that (a) attributing a claim to Trump made his supporters believe it more, (b) correcting a Trump falsehood made *all* respondents believe it less, regardless of their political preferences, and (c) the corrections had no effect on voting preferences.

    Study Title
    Processing political misinformation: comprehending the Trump phenomenon
    Study Publication Date
    Study Authors
    Briony Swire, Adam J. Berinsky, Stephan Lewandowsky, Ullrich K. H. Ecker
    Journal
    Royal Society Open Science
    Peer Reviewed
    Yes
  • Related links on Facebook could help correct misinformation

    This study experiments with a feature that lists related stories underneath existing posts on Facebook in order to determine whether social media helps reinforce or correct users' misperceptions. In a web-based survey with 524 people recruited from a university, participants viewed separate screens of Facebook news feeds in which they were presented with posts whose related articles all confirmed the misperception, all refuted it, or presented a mix of both. Stories focused on attitudes toward GMOs and illness, as well as attitudes toward vaccination and autism. The experimental results suggest that attitudes based on misperceptions about GMOs can be changed via exposure to corrective information on social media. Interestingly, researchers also found that, while people lowered their evaluations of related news stories that contradicted their pre-existing beliefs, those stories still changed some attitudes among those who believed the misperception.

    Study Title
    In Related News, That Was Wrong: The Correction of Misinformation Through Related Stories Functionality in Social Media
    Study Publication Date
    Study Authors
    Leticia Bode, Emily K. Vraga
    Journal
    Journal of Communication
    Peer Reviewed
    Yes