In this study, the author aims to identify what drives people to bullshit, defined as "communications that result from little to no concern for truth, evidence and/or established semantic, logical, systemic, or empirical knowledge." He ran two separate experiments: one testing how social conditions affect a person's likelihood to bullshit, the other analyzing how accountability affects bullshitting. In the first, the author used questionnaire data from 594 participants on Amazon's Mechanical Turk platform and found that bullshitting was largely fueled by social pressures. In the second, he drew on questionnaire data from 234 undergraduate psychology students and found that bullshitting increases when people feel they won't be held accountable for, or have to explain, their bullshit.
The research team, which included Factcheck.org co-founder Kathleen Hall Jamieson, presented a sample of 525 online respondents with a deceptive claim about the Keystone XL pipeline included in a political flyer. Respondents were then shown either (a) a textual fact check of the claim, (b) a humorous video fact-checking the claim, (c) a non-humorous video fact check, (d) an unrelated humorous video of a baby singing, or (e) nothing at all. Belief in the deceptive claim fell significantly more among participants who viewed either fact-checking video than among those who read the textual fact check.
Through four experiments, this study tries to separate genuine belief in two polarizing conspiracy theories — "Obama is Muslim" and "9/11 was an inside job" — from expressive responses, sometimes called "partisan cheerleading." In the first experiment, respondents were explicitly asked to respond regardless of how they felt about the people and policies mentioned. In the second, some respondents were told that people sometimes "say they do believe [false rumors] so they can say something bad about the people and policies mentioned." In the third, respondents who rejected the rumor could skip to the end of the survey. In the fourth and final experiment, the rumor was inserted into a list of items respondents could agree or disagree with. Across the board, Berinsky found very low rates of expressive responding, leading him to conclude that "it seems that when people answer survey questions, they say what they mean and they mean what they say."
The article’s findings are based on survey data collected from two insurgency-affected areas: southern Thailand and Mindanao in the Philippines. The more respondents felt in danger, were repeatedly exposed to a rumor and/or saw one that matched their preconceived beliefs, the more likely they were to believe it. That cuts against the widely held notion that individual psychology alone determines whether someone believes unverified information. The study offers a useful cautionary note for those working to dispel rumors and misinformation around the world.
This study examines whether corrective information affects the views Jewish Israelis hold about the conflict with Palestine. Researchers conducted a randomized experiment in which an online sample of 2,170 Jewish Israelis ages 18 or older received either an extremist message alone, which denied Israeli wrongdoing in the 1948 Palestinian exodus, or that message plus corrective information about the conflict. They also randomly induced participants to feel high or low levels of control. While the proportion of Jewish Israelis who denied wrongdoing in the conflict increased by 8 percent from baseline in the low-control, uncorrected condition, denialism decreased by between 5 and 11 percent in the other conditions. The findings suggest that when people are induced to feel a lack of control, they’re more vulnerable to a denialist message — but corrective information is still quite effective.
The study supplied people with past economic data, then asked what they thought of the current state of the U.K. economy. The researchers found that while partisanship shaped how people viewed the economy, most people’s economic perceptions were rooted in real economic indicators, like job growth and unemployment. And — most importantly — people who held inaccurate views of the economy generally revised them when presented with corrective information. The essential result echoes other work: corrections work, but only to a certain extent.
Respondents were shown "Facebook-like" posts carrying real or fake news. Across three study designs, respondents with higher scores on a Cognitive Reflection Test were less likely to incorrectly rate a fake news headline as accurate. Analytic thinking was associated with more accurate identification of both fake and real news, independent of respondents' political ideology. This suggests that building critical thinking skills could be an effective tool against fake news.
In this study, respondents were given a factual statement like "From 2009, when President Obama took office, to 2012, median household income adjusted for inflation in the United States fell by more than 4 percent" and asked to rate it as "True" or "False." Over four subsequent rounds, they received signals indicating whether the information was accurate and were told that these signals were right 75% of the time. The results indicate that respondents updated their beliefs toward the correct answer regardless of their partisan preference. The study's elaborate design makes it hard for fact-checkers to draw real-life lessons. However, it does seem to offer additional evidence that fact-checking doesn't fall on deaf ears.
This study measures the extent to which algorithms and comments on Facebook that link to fact checks can effectively correct users' misconceptions about health news. Researchers tested this by exposing 613 survey participants to simulated news feeds under three conditions. Participants were shown misinformation about the Zika virus along with different corrective news stories, either surfaced by an algorithm or posted by another Facebook user. The experiment found that algorithmic and social distribution of fact checks were equally effective in limiting participants' misperceptions — even for people more inclined to believe conspiracy theories. The researchers conclude that this is likely because breaking health news events often involve new phenomena, leaving news consumers more receptive to comments and open to opinion change early on.