Imagine you're doing some routine part of your job, maybe advising a student, collecting and entering data, or planning an expenditure. Surely you can do this in a fair, unbiased way.
Think again! Psychologists have been studying the complexities of thinking for decades, coming up with findings about cognitive biases and illusions that are surprising because people are unaware of them.
Now it's possible to read all about these matters in an up-to-date, comprehensive, accessible treatment: Daniel Kahneman's book Thinking, Fast and Slow. Kahneman, a psychologist, won the Nobel Prize in economics for his contributions to the study of decision-making. Thinking, Fast and Slow is an accessible account of work in the field, much of it Kahneman's own research in collaboration with Amos Tversky, who did not live to share the Nobel Prize.
The title of the book refers to two mental systems, one fast and intuitive, the other slow and cautious. The two systems, called by Kahneman Systems 1 and 2, provide a convenient metaphor for how the mind works. Academics like to imagine they use the slow system for research and pedagogy and university decision-makers like to imagine they use it for policy formulation and implementation.
The two systems usually work efficiently, but there is much to learn from their failures, and how to become aware of them and overcome them. Here are a few sample points.
** "... when people believe a conclusion is true, they are also very likely to believe arguments that appear to support it, even when these arguments are unsound." (p. 45)
** "A reliable way to make people believe in falsehoods is frequent repetition, because familiarity is not easily distinguished from truth. Authoritarian institutions and marketers have always known this fact." Even familiarity with a phrase can lead to acceptance of statements that incorporate the phrase. (p. 62)
** To use the "principle of independent judgments", get everyone at a meeting to write down their views before an issue is discussed, thereby taking advantage of diversity of opinion and knowledge. "The standard practice of open discussion gives too much weight to the opinions of those who speak early and assertively, causing others to line up behind them." (p. 85)
** "When people were favorably disposed toward a technology, they rated it as offering large benefits and imposing little risk; when they disliked a technology, they could think only of its disadvantages, and few advantages came to mind." (p. 139)
** When Kahneman informed executives that success in building portfolios was due to luck, not skill, they listened but didn't change anything, because the message challenged their basic assumptions, and hence was not absorbed. (p. 216)
** "... to maximize predictive accuracy, final decisions should be left to formulas, especially in low-validity environments". For example, formulas are better than interviews for selecting candidates for jobs or students for medical school. (p. 225)
** When a group converges to a decision, doubts are suppressed, especially when leaders reveal their preferences. (p. 265)
** "I call it theory-induced blindness: once you have accepted a theory and used it as a tool in your thinking, it is extraordinarily difficult to notice its flaws." When flaws are stumbled across, they are dismissed as anomalies that must have some explanation. (p. 277) (Researchers who can overcome this sort of blindness can make pioneering contributions.)
** Public-health professionals were just as influenced by framing effects as others: their decisions depended on which of two equivalent formulations was posed, highlighting either lives saved or lives lost. (p. 369)
Thinking, Fast and Slow is a large book. It is straightforward to follow, but there is such a lot to digest that it is best taken in small doses. We are so used to our modes of thinking, and so oblivious to their flaws, that it can take years to become accustomed to a different approach. It is worth the effort. There are lessons for everyone.
20 August 2012
Thanks to Scott Burrows, Xiaoping Gao and Frank Huang for helpful comments.