February 27, 2015


Yesterday’s insane Internet debate over the color of a dress offers a critical lesson that every journalist must incorporate into their daily work.

This lesson has nothing to do with viral content, fashion, BuzzFeed, social media, the future of media, Tumblr, or audience engagement.

Many of us looked at a very simple photo of a dress and saw something different. This had nothing to do with intelligence, experience, fashion sense or any other personal characteristic.

We are all at the mercy of our brains and their cognitive processes. Our eyes took in the information in front of us, our brains processed it, and in many cases they gave us the wrong answer. But because that answer came from our own brains, it seemed like exactly the right one. People insisted on what they were seeing because it was what they were actually seeing.

We don’t go about our daily lives assuming that our own brains — and our eyes — can give us faulty information.

They do, all the time.

The simple truth is our brains process information in ways that can lead us astray. This is something every journalist needs to be aware of and account for in the work we do.

We have cognitive biases that affect how we gather, evaluate and retain information. We suffer from pareidolia, “the human tendency to read significance into random or vague stimuli (both visual and auditory).” We see patterns where there aren’t any.

Cambridge neuroscientist Daniel Bor’s book “The Ravenous Brain” explains our desire to find order amidst chaos. Here’s a relevant excerpt quoted by Brain Pickings:

Perhaps what most distinguishes us humans from the rest of the animal kingdom is our ravenous desire to find structure in the information we pick up in the world. We cannot help actively searching for patterns — any hook in the data that will aid our performance and understanding.

It sounds like a good thing, and it can be. But it can also lead us astray, Bor writes:

One problematic corollary of this passion for patterns is that we are the most advanced species in how elaborately and extensively we can get things wrong. We often jump to conclusions — for instance, with astrology or religion. We are so keen to search for patterns, and so satisfied when we’ve found them, that we do not typically perform sufficient checks on our apparent insights.

The dress is a reminder that we sometimes see things that aren’t there, misperceive what’s right in front of us, and otherwise fall victim to our own brains.

This is particularly true when it comes to the way we process information. Once we have made up our minds — or decided on an angle for our story — we assimilate information in accordance with that view.

“[W]e humans quickly develop an irrational loyalty to our beliefs, and work hard to find evidence that supports those opinions and to discredit, discount or avoid information that does not,” wrote Cordelia Fine, the author of “A Mind of Its Own: How Your Brain Distorts and Deceives,” in The New York Times.

Journalists are told to be aware of the biases of sources. But we must also be constantly aware of, and seeking to mitigate, our own cognitive biases.

My new Tow Centre research report about online rumors and how news organizations debunk misinformation offers a look at several cognitive biases that lead us and others astray, and that make debunking difficult.

Below is an edited excerpt from my report that outlines five phenomena and biases that every journalist needs to be aware of in our daily work.

So, from now on, when we’re gathering information, speaking with people, and deciding what to include and emphasize and what to exclude, let’s think of that dress.

Let it be a reminder that what we think we are seeing, hearing and understanding may have no connection to reality.

The Backfire Effect

In a post on the blog You Are Not So Smart, journalist David McRaney offered a helpful one-sentence definition of the backfire effect: “When your deepest convictions are challenged by contradictory evidence, your beliefs get stronger.”

McRaney delved further into the backfire effect in his book, “You Are Now Less Dumb: How to Conquer Mob Mentality, How to Buy Happiness, and All the Other Ways to Outsmart Yourself.” He offered this summary of how it manifests itself in our minds and actions:

Once something is added to your collection of beliefs, you protect it from harm. You do this instinctively and unconsciously when confronted with attitude-inconsistent information. Just as confirmation bias shields you when you actively seek information, the backfire effect defends you when the information seeks you, when it blindsides you. Coming or going, you stick to your beliefs instead of questioning them. When someone tries to correct you, tries to dilute your misconceptions, it backfires and strengthens those misconceptions instead.

Confirmation Bias

Confirmation bias is the process by which we cherry-pick data to support what we believe. If we are convinced of an outcome, we will pay more attention to the data points and information that support it. Our minds, in effect, are made up and everything we see and hear conforms to this idea. It’s tunnel vision.

A paper published in the “Review of General Psychology” defined it as “the seeking or interpreting of evidence in ways that are partial to existing beliefs, expectations, or a hypothesis in hand.”

Here’s how a Wall Street Journal article translated its effects for the business world: “In short, your own mind acts like a compulsive yes-man who echoes whatever you want to believe.”

Confirmation bias makes us blind to contradictory evidence and facts. For journalists, it often manifests itself as an unwillingness to pay attention to facts and information that go against our predetermined angle for a story.

Motivated Reasoning

Psychologist Leon Festinger wrote, “A man with a conviction is a hard man to change. Tell him you disagree and he turns away. Show him facts or figures and he questions your sources. Appeal to logic and he fails to see your point.”

We think of ourselves as rational beings who consider the evidence and information placed in front of us. This is often not the case. We are easily persuaded by information that fits with our beliefs and we harshly judge and dismiss contradictory details and evidence. Our ability to reason is therefore affected (motivated) by our preexisting beliefs.

“In particular, people are motivated to not only seek out information consistent with their prior attitudes, beliefs, and opinions, but also readily accept attitude-confirming evidence while critically counterarguing attitude-challenging information,” wrote Brian E. Weeks, in his paper “Feeling is Believing? The Influence of Emotions on Citizens’ False Political Beliefs.” “Information supporting one’s prior attitude is more likely to be deemed credible and strong, while attitude-discrepant information is often viewed as weak and ultimately dismissed.”

Motivated reasoning and confirmation bias are similar in many ways. In “Kluge: The Haphazard Evolution of the Human Mind,” psychologist Gary Marcus expressed the difference this way: “Whereas confirmation bias is an automatic tendency to notice data that fit with our beliefs, motivated reasoning is the complementary tendency to scrutinize ideas more carefully if we don’t like them than if we do.”

Biased Assimilation

Fitting well with motivated reasoning is the process of biased assimilation. In “True Enough,” [Farhad] Manjoo defined it as the tendency for people to “interpret and understand new information in a way that accords with their own views.” (He cited research by psychologists Charles Lord, Lee Ross, and Mark Lepper from their paper, “Biased Assimilation and Attitude Polarization: The Effect of Prior Theories on Subsequently Considered Evidence.”) Simply put, we interpret and understand new information in a way that fits with what we already know or believe.

Group Polarization

Group polarization is what happens to existing beliefs when we engage in a group discussion. If we’re speaking with people who share our view, the tendency is for all of us to become even more vehement about it. “Suppose that members of a certain group are inclined to accept a rumor about, say, the malevolent intentions of an apparently unfriendly group or nation,” wrote Cass Sunstein in “On Rumors.” “In all likelihood, they will become more committed to that rumor after they have spoken among themselves.”

If we start a conversation with a tentative belief about an issue, being in a room with people who strongly believe it will inevitably pull us further in that direction. This is important to keep in mind in the context of online communities. A 2010 study by researchers from the Georgia Institute of Technology and Microsoft Research examined group polarization on Twitter. They saw that “replies between like-minded individuals strengthen group identity,” reflecting this group dynamic. When it came to engaging with people and viewpoints that were outside of what they personally felt, Twitter users were “limited in their ability to engage in meaningful discussion.”
