When we make mistakes

Dear colleagues,

Have you ever known people so stuck in their views that they won't change no matter what evidence you provide? They might have entrenched opinions about refugees, climate change or Uncle Fred. Arguments go nowhere: whatever you say, they come up with a new justification for their position.

This can be traced to a psychological process called cognitive dissonance. Basically, it means holding two incompatible thoughts in your mind: one of them has to give way so that the other can remain.

For example, many people believe justice always prevails. This implies that people get what they deserve. So imagine what happens when someone with this belief encounters rape, poverty or torture. To maintain their belief in a just world, they blame the victim. The woman must have done something to deserve to be raped; those poor people are just lazy; those claims of torture are wrong, or it wasn't really torture, or they deserved it.

Cognitive dissonance is incredibly powerful. There are instances in which someone has been in prison for a decade and then DNA or other evidence shows they are innocent. The original police and prosecutors are faced with the possibility that they did the wrong thing. So what do they do? Many of them continue to believe the person is guilty. They don't believe the evidence, refuse to release the person from prison, refuse to apologise and oppose any compensation.

These and other stories of cognitive dissonance are delightfully presented by Carol Tavris and Elliot Aronson in their book Mistakes were made (but not by me): why we justify foolish beliefs, bad decisions, and hurtful acts (Orlando, FL: Harcourt, 2007). They have taken cognitive dissonance out of psychology labs and shown its relevance in all sorts of arenas. They provide examples from medicine, law, marriage and wars, backing up their commentary with psychological research findings.

Everyone is subject to the same processes - including you and me. Cognitive dissonance exerts its effects so deeply within the psyche that people seldom notice it's occurring. They think they are being fair and balanced when actually they are biased.

One of the offshoots of cognitive dissonance is confirmation bias. If you have a belief, you are likely to pay more attention to supportive information and ignore or reject conflicting information. If there are two newspaper articles about some contentious issue, one for and one against - let's say on the dispute over Palestine - you are more likely to read the article presenting the view you agree with and not bother with the contrary one. And you will assess the articles with different criteria, lauding the one you agree with and picking holes in the other one. In this way you reinforce your initial opinion.

Cognitive dissonance helps explain the persistence of public controversies over climate change, euthanasia, pornography and many other topics. Partisans on each side do not want to admit they are wrong and are continually reinforced in their positions due to confirmation bias.

Academic researchers are supposed to be unbiased; otherwise, how can they claim to be making a contribution to knowledge? The evidence, though, is that most researchers are highly committed to their ideas, just like anyone else. You can ask yourself, "How open am I to completely revising my ideas on the basis of new information?"

Tavris and Aronson say that judges, scientists and physicians "are professionals whose training and culture promote the core value of impartiality, so most become indignant at the mere suggestion that financial or personal interests could contaminate their work. Their professional pride makes them see themselves as being above such matters." (p. 46)

Most teachers like to think of themselves as conscientious and as doing a good job. So when students don't do the assigned work or make complaints, it is tempting to blame the students. There is cognitive dissonance involved between "I am a good teacher" and "My class isn't going well". The easy resolution is to blame someone or something else.

Then there's the challenge of marking student essays that present a viewpoint the teacher dislikes. We like to think we look only at quality, not opinion. But how many of us have ever checked our marking with colleagues who hold quite different viewpoints?

Administrators make decisions and implement policies. What happens when evidence comes in that a policy is not working very well? To maintain a belief in yourself as competent, it is tempting to ignore the evidence or to blame others for the problems. Tavris and Aronson say that to get someone to become unethical, it's sufficient to lead them, bit by bit, into bad decisions and then let self-justification operate.

Tavris and Aronson give many examples of people who admitted they were wrong, for example President John Kennedy after the 1961 Bay of Pigs fiasco. This is courageous, because it means dealing with one's own cognitive dissonance. It is also surprisingly well received by others. Doctors who admit to patients that they made a mistake are less likely to be sued. Maybe admitting to students about getting something wrong or not knowing the answer wouldn't be so bad after all.

Cognitive dissonance is powerful, but it can be countered, at least sometimes. Mistakes were made (but not by me) shows both psychological pitfalls and ways to become aware of one's own biases. One implication is that all academics should receive training in cognitive biases and how to manage dissonance when confronted with challenging evidence. But imagine the resistance!

Brian
20 February 2011

 

I thank Rae Campbell and Nicola Marks for valuable feedback on drafts.

