Social Media

Poynter Results

  • For debunking viral rumors, turn to Twitter users

    This paper, presented at the International Conference on Asian Digital Libraries, aims to uncover the types of rumors and "counter-rumors" (or debunks) that surfaced on Twitter following the falsely reported death of former Singaporean Prime Minister Lee Kuan Yew. Researchers analyzed 4,321 tweets about Lee's death and found six categories of rumors, four categories of counter-rumors and two categories belonging to neither. With more counter-rumors than rumors, the study's results suggest that Twitter users often attempt to stop the spread of false rumors online.

    Study Title
    An Analysis of Rumor and Counter-Rumor Messages in Social Media
    Study Publication Date
    Study Authors
    Dion Hoe-Lian Goh, Alton Y.K. Chua, Hanyu Shi, Wenju Wei, Haiyan Wang, Ee Peng Lim
    Journal
    Conference paper
    Peer Reviewed
    Yes
  • Labeling some fake stories on social media increases the believability of untagged fake stories

This study examines the effects of adding disputed labels to fake news stories on social media outlets like Facebook, in line with the real partnership the social network launched in December 2016. While researchers found that adding warnings to fake content decreased those posts' perceived accuracy, they also found that the presence of fake news tags increased the perceived accuracy of untagged fake stories. This "implied truth" effect was stronger among subgroups who were more likely to believe online information (such as young adults and Trump supporters). Participants saw an equal mix of right- and left-wing headlines, both fake and real, and answered questions about their validity and shareability.

    Study Title
    The Implied Truth Effect: Attaching Warnings to a Subset of Fake News Stories Increases Perceived Accuracy of Stories Without Warnings
    Study Publication Date
    Study Authors
    Gordon Pennycook, David G. Rand
    Journal
    SSRN
    Peer Reviewed
    No
  • Low critical thinking may determine whether you believe in fake news

Respondents were shown "Facebook-like" posts carrying real or fake news. Across three different study designs, respondents with higher scores on a Cognitive Reflection Test were less likely to rate fake news headlines as accurate. Analytic thinking was associated with more accurate identification of both fake and real news, independent of respondents' political ideology. This suggests that building critical thinking skills could be an effective instrument against fake news.

    Study Title
    Who Falls for Fake News? The Roles of Analytic Thinking, Motivated Reasoning, Political Ideology, and Bullshit Receptivity
    Study Publication Date
    Study Authors
    Gordon Pennycook, David G. Rand
    Journal
    SSRN
    Peer Reviewed
    No
  • The most effective way to fact-check is to create counter-messages

Researchers examined a final selection of 20 experiments from 1994 to 2015 that address misinformation about social and political news in order to determine the most effective ways to combat beliefs based on misinformation. The headline finding is that correcting misinformation is possible, but the correction is often not as strong as the misinformation itself. The analysis has several takeaways for fact-checkers, most notably the importance of creating counter-messages and alternative narratives if they want to change their audiences' minds, and of issuing corrections as quickly as possible.

    Study Title
    Debunking: A Meta-Analysis of the Psychological Efficacy of Messages Countering Misinformation
    Study Publication Date
    Study Authors
    Man-pui Sally Chan, Christopher R. Jones, Kathleen Hall Jamieson, Dolores Albarracín
    Journal
    Psychological Science
    Peer Reviewed
    Yes
  • Twitter users are more likely to accept correction from people they know

The study looked at corrections made on Twitter between January 2012 and April 2014 to see how fact-checking is received by people with different social relationships. Researchers ultimately isolated 229 “triplets” in which the person sharing a falsehood responds to a correction from a second tweeter. Corrections made by “friends” resulted in the person sharing a falsehood accepting the fact 73 percent of the time. Corrections made by strangers were accepted only 39 percent of the time. Put simply: When we’re wrong on Twitter, we’re more likely to own up to it if someone we know corrects us.

    Study Title
    Political Fact-Checking on Twitter: When Do Corrections Have an Effect?
    Study Publication Date
    Study Authors
    Drew B. Margolin, Aniko Hannak, Ingmar Weber
    Journal
    Political Communication
    Peer Reviewed
    Yes
  • Social media sentiment analysis could offer lessons on building trust in fact-checking

Sentences with the phrase "Factcheck.org is," "Snopes is" or "StopFake is" were collected from Facebook, Twitter and a selection of discussion forums in the six months from October 2014 to March 2015. Because Facebook crawling was limited to pages with more than 3,500 likes or groups with more than 500 members, the sample was limited. In the end, 395 posts were coded for Snopes, 130 for StopFake and a mere 80 for Factcheck.org. Facebook pages for the two U.S. sites have hundreds of thousands of likes, so the finding that a majority of comments were negative ought to be read in light of the sample limitations. Still, the paper's coding of comments along themes of usefulness, ability, benevolence and integrity — and splitting across positive and negative sentiment — offers a template for future analysis.

    Study Title
    Trust and Distrust in Online Fact-Checking Services
    Study Publication Date
    Study Authors
    Petter Bae Brandtzaeg, Asbjørn Følstad
    Journal
    Communications of the ACM
    Peer Reviewed
    Yes
  • Social media comments are just as effective at correcting health misinformation as algorithms

This study measures the extent to which algorithms and comments on Facebook that link to fact checks can effectively correct users' misconceptions about health news. Researchers tested this by exposing 613 survey participants to simulated news feeds under three conditions. Participants were shown misinformation about the Zika virus and different corrective news stories, either surfaced by an algorithm or posted by another Facebook user. The experimental results found that algorithmic and social distribution of fact checks were equally effective in limiting participants' misperceptions — even for people who are more inclined to believe conspiracy theories. Researchers conclude that this is likely because breaking health news events often deal with new phenomena, which allows for greater receptivity to corrective comments and the possibility of opinion change among news consumers early on.

    Study Title
    See Something, Say Something: Correction of Global Health Misinformation on Social Media
    Study Publication Date
    Study Authors
    Leticia Bode, Emily K. Vraga
    Journal
    Health Communication
    Peer Reviewed
    Yes
  • People are less likely to fact-check when they're around other people

    This study of eight experiments aims to measure how social presence affects the way that people verify information online. It found that, when people think they're being judged by a large group of people online, they're less likely to fact-check claims than when they're alone. Inducing vigilance correlated with an increase in fact-checking among respondents, which could imply that, when they're in a group of people, social media users tend to let their guards down. That finding held across a variety of different conditions, including statements that were politically charged and neutral, simulated forums and social media, as well as small vs. large group sizes.

    Study Title
    Perceived social presence reduces fact-checking
    Study Publication Date
    Study Authors
Youjung Jun, Rachel Meng, Gita Venkataramani Johar
    Journal
    Proceedings of the National Academy of Sciences
    Peer Reviewed
    Yes
  • On social media, users believe corrections if they include sources

    This study attempts to determine the most effective way to correct misinformation on social media by testing both the content of corrections and how they're presented. In a survey with 613 valid responses, of which 271 were analyzed, participants saw either a simulated Facebook or Twitter feed and were assigned to one of three conditions with varying levels of misinformation and corrections, both with and without sources. Based on the experimental results, researchers found that, when everyday users share corrections on social media, linking to credible sources increases the probability that other users will believe the corrections. In the control condition, in which participants weren't shown corrections with sources, misperceptions were largely unaltered. On Facebook, linked sources in comments on articles led to increased perceptions of credibility, while the same effect was absent in Twitter replies.

    Study Title
    I do not believe you: how providing a source corrects health misperceptions across social media platforms
    Study Publication Date
    Study Authors
    Emily K. Vraga, Leticia Bode
    Journal
    Information, Communication & Society
    Peer Reviewed
    Yes
  • The more partisan your online news diet, the less likely you are to believe fact-checkers

This study was conducted ahead of the 2012 presidential election. Respondents were asked whether they were aware of experts' conclusions on four political misconceptions, whether they believed them and which online news outlets they consumed. Frequent conservative online news consumers had a 33 percent chance of being wrong about President Obama's birth certificate despite knowing what most journalists had concluded about it. Only 3 percent of those not reading conservative news held that same belief. Conversely, a frequent liberal online news user had a 10 percent chance of being wrong about Mitt Romney outsourcing jobs during his tenure at Bain, even though they correctly indicated what fact-checkers' findings were. Researchers concluded that there may be a relationship between partisan media use and political misconceptions.

    Study Title
    Driving a Wedge Between Evidence and Beliefs: How Online Ideological News Exposure Promotes Political Misperceptions
    Study Publication Date
    Study Authors
    R. Kelly Garrett, Brian E. Weeks, Rachel L. Neo
    Journal
    Journal of Computer-Mediated Communication
    Peer Reviewed
    Yes