Backfire Effect

“Let me never fall into the vulgar mistake of dreaming that I am persecuted whenever I am contradicted.” —Ralph Waldo Emerson

In an ideal world, when you’re presented with evidence that contradicts your current way of thinking, you’d correct your beliefs and then move forward with a better understanding of the world.

Unfortunately we don’t live in that world.

The Backfire Effect

There’s a particularly nasty cognitive bias called the Backfire Effect that says once a belief is integrated into your way of thinking, you will protect that belief more strongly when you feel it is under attack.

What that means for you is, every time you try to win an argument with logic, you’re actually making the other person believe even more strongly. (Ever tried explaining why a conspiracy theory believer is wrong? You know exactly what I’m talking about.)

This was clearly shown in an experiment run in 2006 by Brendan Nyhan and Jason Reifler. They showed participants fabricated news articles that appeared to support something demonstrably untrue, then presented them with the correcting facts. Surprisingly, rather than updating their beliefs, participants doubled down on their misconceptions.

The Backfire Effect is the other side of the coin from confirmation bias. Confirmation bias filters the information you seek out, while the Backfire Effect protects you from the information that finds you.

This is exactly why, no matter what kind of scandal is uncovered, candidates often gain support among their believers.

All your hard work of persuading someone will backfire on you with equal & opposite force. It’s Newton’s Third Law of Internet Discussion Dynamics.

One Event, Two Outcomes

In high school & college I was a competitive debater. I wasn’t naturally well-spoken or quick on my feet. I think almost entirely in pictures, so it was difficult for me to translate those images into coherent ideas that were easy to understand in words.

With that background, I absolutely love watching live debates, especially debates that matter.

Recently the two main party nominees squared off for the first presidential debate of the 2016 circus election cycle. At the end of the debate, there was a clear winner.

Who was it?

Turns out, it was the person you believed would win it before it ever began.

Aftermath

Like most interesting quirks of the mind, how things play out after an event is often more interesting than the event itself. Nowhere is this easier to see than in the fallout from the debate.

If you’ve talked with more than 10 people about the debate, you’ve seen firsthand how two people can go through the same experience and come out with completely different beliefs about what happened.

As soon as the nominees wrapped up, you knew beyond a shadow of a doubt that Hillary won the debate. The guy sitting next to you at the bar felt exactly the same way. . . about Trump.

And it’s interesting to note: he has the exact same level of certainty about Trump’s “undeniable” win as you do about Hillary’s.

How in the world can that be?

A little hiccup in reasoning called “belief bias.”

Belief Bias

Belief bias is what happens when someone’s beliefs, personal values, or prior knowledge and experience color the reasoning process, making them more accepting of invalid arguments or information.

Those beliefs act as a preventative filter against any kind of information that would disrupt a worldview that’s working just fine, thank you. Why would I do anything differently?

A completely rational person would be able to take in all points made, evaluate claims, and come to a conclusion based solely on that data.

But we’re not completely rational people. Our fuzzy logic & slippery preconscious brain processes get in the way.

We interpret experiences so they support what we believe already.

Show me one person who changed their mind after the debates. . .

Can’t do it?

You have the belief bias to thank!

(and be scared of.)