Misperceptions

Poynter Results

  • Bullshitting is socially constructed

In this study, the author aims to figure out what makes people take up bullshitting, or "communications that result from little to no concern for truth, evidence and/or established semantic, logical, systemic, or empirical knowledge." To do that, he ran two separate experiments: one testing how social conditions affect a person's likelihood to bullshit, the other analyzing how being held accountable affects bullshitting. In the first, the author used data from a questionnaire filled out by 594 participants on Amazon's Mechanical Turk platform and found that bullshitting was largely fueled by social pressures. In the second, he drew upon questionnaire data from 234 undergraduate psychology students and found that the behavior increases when people feel they won't be held accountable for, or have to explain, their bullshit.


    Study Title
    Antecedents of bullshitting
    Study Publication Date
    Study Authors
    John V. Petrocelli
    Journal
    Journal of Experimental Social Psychology
    Peer Reviewed
    Yes
    Sample
    Non-representative
    Inferential approach
    Experimental
    Number of studies citing
    0
  • Video or text? Evaluating the relative efficacy of different formats for fact-checking.

    The research team, which included FactCheck.org co-founder Kathleen Hall Jamieson, presented a sample of 525 online respondents with a deceptive claim about the Keystone XL pipeline included in a political flyer. Respondents were then presented with either (a) a textual fact check of the claim, (b) a humorous video fact-checking that claim, (c) a non-humorous video fact check, (d) an unrelated humorous video of a baby singing, or (e) nothing at all. Belief in the deceptive claim fell more among participants who viewed either fact-checking video than among those who read the textual fact check.

    Study Title
    Fact-Checking Effectiveness as a Function of Format and Tone: Evaluating FactCheck.org and FlackCheck.org
    Study Publication Date
    Study Authors
    Dannagal G. Young, Kathleen Hall Jamieson, Shannon Poulsen, and Abigail Goldring
    Journal
    Journalism & Mass Communication Quarterly
    Peer Reviewed
    Yes
  • In surveys, respondents 'say what they mean and mean what they say'

    Through four different experiments, this study tries to separate genuine belief in two polarizing conspiracy theories — "Obama is Muslim" and "9/11 was an inside job" — from expressive responses, sometimes called "partisan cheerleading." In the first experiment, respondents were explicitly asked to respond regardless of how they felt about the people and policies mentioned. In the second, some respondents were told that sometimes people "say they do believe [false rumors] so they can say something bad about the people and policies mentioned." In the third, respondents who rejected the rumor could skip to the end of the survey. In the fourth and final experiment, the rumor was inserted into a list of items respondents could agree or disagree with. Across the board, Berinsky found very low rates of expressive responding, leading him to conclude that "it seems that when people answer survey questions, they say what they mean and they mean what they say."

    Study Title
    Telling the Truth about Believing the Lies? Evidence for the Limited Prevalence of Expressive Survey Responding
    Study Publication Date
    Study Authors
    Adam Berinsky
    Journal
    The Journal of Politics
    Peer Reviewed
    Yes
  • How likely you are to believe a rumor has nothing to do with your demographics

    The article’s findings are based on survey data collected from two insurgency-affected areas: southern Thailand and Mindanao, Philippines. Respondents were more likely to believe a rumor the more they felt in danger, the more often they were exposed to it, and the more it coincided with their preconceived beliefs. That goes against widely held notions that individual psychology is the be-all and end-all of whether someone believes unverified information. The study contains a useful cautionary note for those working to dispel rumors and misinformation around the world.

    Study Title
    Rumor Has It: The Adoption of Unverified Information in Conflict Zones
    Study Publication Date
    Study Authors
    Kelly M. Greenhill, Ben Oppenheim
    Journal
    International Studies Quarterly
    Peer Reviewed
    Yes
  • Correcting historical misperceptions works — but it's not magic

    This study seeks to explain whether corrective information affects the views Jewish Israelis hold about the conflict with Palestine. Researchers ran a randomized experiment in which an online sample of 2,170 Jewish Israelis ages 18 or older received either an extremist message alone, which denied Israeli wrongdoing in the 1948 Palestinian exodus, or that message plus corrective information about the conflict. They also randomly induced participants to feel either high or low control. While the proportion of Jewish Israelis who denied wrongdoing increased by 8 percent from baseline in the low-control, uncorrected condition, the prevalence of denialism decreased by between 5 and 11 percent in the other conditions. The findings suggest that when people are induced to feel a lack of control, they’re more vulnerable to a denialist message — but corrective information is still quite effective.

    Study Title
    Fighting the Past: Perceptions of Control, Historical Misperceptions, and Corrective Information in the Israeli-Palestinian Conflict
    Study Publication Date
    Study Authors
    Brendan Nyhan, Thomas Zeitzoff
    Journal
    Political Psychology
    Peer Reviewed
    Yes
  • Giving corrective economic information works, but it doesn't change views

    The study supplied people with past economic data, then asked them what they thought of the current state of the U.K. economy. The researchers found that while partisanship was a key part of how people viewed the economy, most people’s economic perceptions were rooted in real economic indicators, like job growth and unemployment. And — most importantly — people who held inaccurate views of the economy generally changed them when presented with corrective information. The essential results are similar to those reported by other work: corrections work, but only to a certain extent.

    Study Title
    Facing up to the facts: What causes economic perceptions?
    Study Publication Date
    Study Authors
    Catherine E. De Vries, Sara B. Hobolt, James Tilley
    Journal
    Electoral Studies
    Peer Reviewed
    Yes
  • Low critical thinking may determine whether you believe in fake news

    Respondents were shown "Facebook-like" posts carrying real or fake news. Across three different study designs, respondents with higher scores on a Cognitive Reflection Test were less likely to incorrectly rate a fake news headline as accurate. Analytic thinking was associated with more accurate identification of both fake and real news, independent of respondents' political ideology. This suggests that building critical thinking skills could be an effective instrument against fake news.

    Study Title
    Who Falls for Fake News? The Roles of Analytic Thinking, Motivated Reasoning, Political Ideology, and Bullshit Receptivity
    Study Publication Date
    Study Authors
    Gordon Pennycook, David G. Rand
    Journal
    SSRN
    Peer Reviewed
    No
  • Voters gradually change their opinions when presented the facts

    In this study, respondents were given a factual statement like "From 2009, when President Obama took office, to 2012, median household income adjusted for inflation in the United States fell by more than 4 percent" and asked to rate it as "True" or "False." Over the course of four subsequent rounds, they were given signals indicating whether the information was accurate and were told that these signals were right 75% of the time. The results indicate that respondents updated their beliefs toward the correct answer regardless of their partisan preference. The study's elaborate design makes it hard for fact-checkers to draw real-life lessons. However, it does seem to offer additional evidence that fact-checking doesn't fall on deaf ears.

    Study Title
    Learning Together Slowly: Bayesian Learning about Political Facts
    Study Publication Date
    Study Authors
    Seth J. Hill
    Journal
    The Journal of Politics
    Peer Reviewed
    Yes
  • Social media comments are just as effective at correcting health misinformation as algorithms

    This study measures the extent to which algorithms and comments on Facebook that link to fact checks can effectively correct users' misconceptions about health news. Researchers tested this by exposing 613 survey participants to simulated news feeds under three conditions. Participants were shown misinformation about the Zika virus along with different corrective news stories, either surfaced by an algorithm or posted by another Facebook user. The experimental results showed that algorithmic and social distribution of fact checks were equally effective in limiting participants' misperceptions — even for people who are more inclined to believe conspiracy theories. Researchers conclude that this is likely because breaking health news events often deal with new phenomena, which allows for greater receptivity to comments and the possibility of opinion change among news consumers early on.

    Study Title
    See Something, Say Something: Correction of Global Health Misinformation on Social Media
    Study Publication Date
    Study Authors
    Leticia Bode, Emily K. Vraga
    Journal
    Health Communication
    Peer Reviewed
    Yes
  • On social media, users believe corrections if they include sources

    This study attempts to determine the most effective way to correct misinformation on social media by testing both the content of corrections and how they're presented. In a survey with 613 valid responses, of which 271 were analyzed, participants saw either a simulated Facebook or Twitter feed and were assigned to one of three conditions with varying levels of misinformation and corrections, both with and without sources. Based on the experimental results, researchers found that when everyday users share corrections on social media, linking to credible sources increases the probability that other users will believe the corrections. In the control condition, in which corrections did not include sources, misperceptions were largely unaltered. On Facebook, linked sources in comments on articles led to increased perceptions of credibility, while the same effect was absent in Twitter replies.

    Study Title
    I do not believe you: how providing a source corrects health misperceptions across social media platforms
    Study Publication Date
    Study Authors
    Emily K. Vraga, Leticia Bode
    Journal
    Information, Communication & Society
    Peer Reviewed
    Yes