
Thinking, Fast and Slow

(Originally published June 2012)

Daniel Kahneman’s Thinking, Fast and Slow (2011) is about the psychology of judgment and decision making. Kahneman has been described by Steven Pinker as “among the most influential psychologists in history and certainly the most important psychologist alive today.”

Kahneman (along with his late friend and colleague, Amos Tversky) has done research on cognitive processes for decades, and is surely one of the world’s leading experts on biases. Thinking, Fast and Slow is a summary of his (and Tversky’s) research on heuristics, i.e., the shortcuts and strategies human beings use to make difficult judgments quickly and confidently, and of a large body of research done by other scholars on the same subject. The study of heuristics and biases has important practical applications for any profession (like child welfare) in which high-stakes decision making, based on limited knowledge and often under severe time pressure, is unavoidable.

 

Kahneman’s approach in Thinking, Fast and Slow is to describe cognitive processing “by the metaphor of two agents, called System 1 and System 2, which respectively produce fast and slow thinking”; he adds, “I speak of the features of intuitive and deliberative thought as if they were traits of two characters in your mind.” Describing mental operations as distinct characters in the human drama raises questions about the relationship between these Systems, and theoretical questions about how they are integrated in unified mental functioning. Kahneman has interesting but limited things to say about the theoretical questions, but an unequivocal answer to the question about the relationship between automatic thinking and deliberative thinking: System 1 is the dominant character due to its confidence, boldness, activity level and powerful connection to emotions. System 2 believes it is in charge, based on its conscious awareness of deliberate mental operations, but it is mistaken. System 2 is lazy and, for the most part, does System 1’s bidding, according to Kahneman.

 

At one point, Kahneman maintains that System 1 is the hero of his narrative, but this seems misleading, as much of Thinking, Fast and Slow involves stories about the fallibility of intuition. Kahneman describes System 1 as follows:

 

  • Operates automatically and quickly with little or no effort and no sense of voluntary control, and rarely takes time off

  • Associated with impressions, feelings and inclinations

  • Intuitive and confident in its intuitions

  • Likes coherent stories and stories with clear causal links, and is quick to produce stories with a minimum of information

  • Infers and invents causes and intentions

  • Neglects ambiguity and suppresses doubts

  • Exaggerates emotional consistency (halo effect)

  • Ignores absent evidence

  • Substitutes easier questions for harder ones 

  • Poor at statistics

 

System 1 in Kahneman’s description is powerful, confident and effortless, and (good news) it can be trained to produce the skilled intuitions of experts (see below). However, System 1 is highly fallible, according to Kahneman, because it “is radically insensitive to both the quality and quantity of the information that gives rise to impressions and intuitions.” Kahneman uses the acronym WYSIATI, i.e., What You See Is All There Is, for a feature of System 1 that uses available evidence, no matter how thin, to produce coherent, plausible stories. WYSIATI is a mechanism for jumping to conclusions. In fact, a lack of evidence may be better than lots of conflicting evidence in facilitating System 1’s development of a coherent, plausible story.

 

Kahneman maintains that “One of the main functions of System 2 is to monitor and control thoughts and actions ‘suggested’ by System 1, allowing some to be expressed directly in behavior and suppressing or modifying others.” However, according to Kahneman, System 2 draws on a limited pool of mental energy. System 2 is cautious and deliberate, but control of impulses and careful thought require mental energy. System 2, therefore, is careful in its energy investments, whereas System 1 draws on more than ample energy stores. “The laziness of System 2 is an important fact of life,” Kahneman states. As a result, System 2 will frequently not make the effort to correct the mistaken intuitions of System 1. In Kahneman’s view, “When an incorrect intuitive judgment is made, System 1 and 2 should both be indicted. System 1 suggested the incorrect intuition, and System 2 endorsed it and expressed it in a judgment.” System 2 may go along with the inclinations of System 1 due to ignorance as well as laziness, but information and understanding have little effect if System 2 will not make the effort to apply what it has learned. Deliberate thought requires effort, which is experienced as mildly or highly aversive depending on the difficulty of the task.

 

Given these dynamics, System 2 usually follows the lead of System 1, including endorsement of various shortcuts that lead to biases. One of the biases found by Eileen Munro in her analysis of child maltreatment deaths in England is confirmation bias, i.e., the tendency to pay attention to evidence that supports one’s views while ignoring or rejecting evidence that contradicts current beliefs. When System 1 generates powerful intuitions quickly based on limited evidence (as it often does), System 2 may then generate plausible-sounding rationalizations for refusing to consider new information. Furthermore, System 2 may actively scan the environment, seeking additional information that confirms strongly held beliefs. One of the reasons biases are difficult to combat is that, due to the cooperation of Systems 1 and 2, biases are strengthened by the new evidence both Systems seek out to confirm existing beliefs.

 

If one is biased against a person, family or group, there is a strong tendency to notice their questionable behavior and ignore their virtues. Kahneman refers to this bias as the halo effect, i.e., “the tendency to like (or dislike) everything about a person – including things you have not observed …” According to Kahneman, “the halo effect increases the weight of first impressions, sometimes to the point that subsequent information is mostly wasted.” Missing information is “filled by a guess that fits one’s emotional response” to a person or group. Kahneman comments that “The halo effect helps keep explanatory narratives simple and coherent by exaggerating the consistency of evaluations: good people do only good things and bad people are all bad.” Mental shortcuts lead to biases that simplify the world and reduce the need for mental effort to attend to and understand contradictions and inconsistencies.

 

At this point, readers may wonder if Kahneman is exaggerating the distinction between an athletic, robust, energetic and optimistic System 1 and a weak, nerdy, skeptical, pessimistic and lazy System 2 to increase the dramatic interest of his description of mental operations. For example, Kahneman is a strong advocate of statistical reasoning as a means of understanding the past and predicting the future. One of the most challenging chapters in Thinking, Fast and Slow is about the difficulties of teaching the statistical concept of regression to the mean to college students (including graduate students) when its implications conflict with intuitions about “hot streaks” in athletics, or extraordinary seer-like abilities to forecast the stock market.
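
Regression to the mean can be stated in one line. As a worked sketch (standard regression arithmetic; the specific numbers are mine, for illustration, and are not Kahneman’s): for scores expressed in standard units, if the correlation between a first and a second measurement is r, the best linear prediction of the second score is

\[
\hat{z}_{2} = r \cdot z_{1}
\]

With r = 0.5, a performance two standard deviations above the mean predicts a follow-up only one standard deviation above it. The apparent “cooling off” of a hot streak requires no causal story; it is built into the arithmetic, which is precisely why intuition resists it.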

Kahneman offers the following scenario:

 

            Linda is thirty-one years old, single, outspoken and very bright. She majored in philosophy. As a student, she was deeply concerned with issues of discrimination and social justice, and also participated in antinuclear demonstrations.

            Which alternative is more probable, (1) Linda is a bank teller or (2) Linda is a bank teller and is active in the feminist movement?

 

Kahneman states that “About 85-90% of undergraduates at several major universities chose the second option, contrary to logic.” Kahneman comments that many undergraduates were “shameless” in defending their judgment, and that even the famous naturalist Stephen Jay Gould (who knew the correct answer) wrote, “a little homunculus in my head continues to jump up and down, shouting at me – ‘but she can’t just be a bank teller; read the description’.” Kahneman comments, “The little homunculus is of course Gould’s System 1 speaking to him in insistent tones.” The ‘representativeness’ heuristic of System 1 generates coherent stories that may not be the most probable.
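
The logic at stake is the conjunction rule of elementary probability (a standard result, not a quotation from the book): the probability that two things are both true can never exceed the probability of either one alone. In symbols,

\[
P(\text{bank teller and feminist}) \leq P(\text{bank teller})
\]

Every feminist bank teller is also a bank teller, so option (2) cannot be more probable than option (1), however well the added detail fits Linda’s description.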

 

In comments that have important implications for child welfare practice, Kahneman asserts that “pallid” statistical information is routinely discarded when it is incompatible with one’s personal impressions of a case, and that “our mind is strongly biased toward causal explanations and does not deal well with ‘mere statistics’.” Risk assessment developers and experts should take note, as should managers who believe that actuarial risk assessment can easily be combined with practice models based on family engagement.

 

Given Kahneman’s acute awareness of the fallibility of intuition, and his strong belief in the predictive value of statistical reasoning, it might be expected that his main practical goal would be to strengthen System 2 while deflating the pretensions of intuitive thinkers. I have difficulty believing Kahneman would not support an increased emphasis on training in statistical concepts in psychology, social work and many other professions. However, one of the most surprising and impressive chapters in the book is about Kahneman’s ongoing “adversarial collaboration” with Gary Klein, whom Kahneman describes as “the intellectual leader of an association of scholars and practitioners who do not like the kind of work I do.” Klein is the author of Sources of Power, which contains studies of expert fireground commanders, chess masters and nurses who, under extreme time pressure, depend on highly skilled intuitions. In Klein’s view, expert intuitions cannot be replaced by algorithms (i.e., formulas) of the sort advocated by Kahneman and other proponents of statistical methods.

 

It is a credit to both Kahneman and Klein that they have been able to maintain a civil dialogue over a period of several years and eventually, in large part, reach agreement regarding the conditions in which intuitions can become expert and an effective guide to practice. Kahneman and Klein “eventually concluded that our disagreement was due in part to the fact that we had different experts in mind. Klein had spent much time with fireground commanders, clinical nurses, and other professionals who have real expertise. I had spent time thinking about clinicians, stock pickers and political scientists trying to make unsupportable long-term forecasts.”

 

Kahneman and Klein agreed that skilled intuition is possible only when (a) an environment is sufficiently regular to be predictable and (b) practitioners have an opportunity to learn these regularities through prolonged practice. Kahneman asserts that “The accurate intuitions Gary Klein has described are due to highly valid cues that the expert’s System 1 has learned to use, even if System 2 has not learned to use them.” Kahneman does not consider the possibility that experts can use conceptual frameworks developed by System 2 to guide their intuition. For example, expert CPS investigators can be trained to notice when physical abuse is not an instance of excessive physical discipline. Battered child syndrome, torture of a child and serial battering are not discipline, and they require different responses than physical abuse that arises out of disciplinary incidents. Practitioners can be trained to notice this distinction immediately and act accordingly.

 

Nevertheless, when System 2 trains System 1, it must utilize features of situations congruent with intuition. System 1, according to Kahneman, utilizes norms and prototypes and distinguishes the surprising from the normal. For this reason, operationally useful typologies of child maltreatment can be used to train practitioners to recognize standard patterns and anomalous features that don’t fit the usual pattern. For practitioners to acquire these expert skills, they must be exposed to standard patterns over and over again. Kahneman maintains that “After thousands of hours of practice … chess masters are able to read a chess situation at a glance. The few moves that come to mind are almost always strong and sometimes creative. They can deal with a ‘word’ (i.e., chess move) they have never encountered, and they can find a new way to interpret a familiar one.” Similar skills can be acquired in child protection, but usually only after years of experience encountering standard patterns that include poverty, substance abuse, mental health problems and domestic violence, along with various combinations of types of child maltreatment. However, it is difficult to train practitioners to this type of expertise when trainers, and the experts they listen to, are unable to conceptualize its possibility.

 

Kahneman comments that he and Klein “disagreed less than we had expected and accepted joint solutions of almost all the substantive issues that were raised.” However, they continued to have different attitudes, emotions and tastes “and those changed remarkably little over the years.” Kahneman closes this chapter with the comment, “… finding as much intellectual agreement as we did is surely more important than the persistent emotional differences that remained.”

 

Kahneman tends to be pessimistic about controlling intuitively appealing biases through a more robust System 2. Nevertheless, a careful reading of Thinking, Fast and Slow suggests some obvious courses of action for decision makers:

 

  • Whenever possible, slow down and utilize well-informed others to reflect on alternatives.

  • Suspend judgment in the initial phases of assessment or investigation; resist quick intuitive judgments and stay open to new information and new perspectives as long as possible.

  • Seek to acquire balanced perspectives of persons and situations; recognize the halo effect (positive and negative) when it appears.

  • Develop an agency environment in which prevailing views of cases, policies and programs can be questioned.

  • Do not allow rank and status to trump rational argument.

  • Do not confuse feelings of subjective certainty and cognitive ease with valid judgments.

  • Seek different viewpoints before (not after) bringing professionals together to discuss a case.

  • Strengthen the capacity to make sense of data and statistical concepts.

  • Make use of algorithms when professional judgment has repeatedly been found wanting.

  • Practice noticing standard patterns and anomalous features in these patterns; work at developing expert intuitions.

  • Give System 2 the opportunity to program System 1, and learn other ways of effectively combining these Systems.

     

Thinking, Fast and Slow is a book that requires and repays effort; it contains a vast amount of information about a wide array of subjects. Kahneman persistently attempts to convince readers (including experts) that human beings are influenced by biases of which we are largely unaware, and that all of us understand ourselves far less than we believe we do. Kahneman’s insistence on this point recalls Freud. The idea “that our minds are susceptible to systematic errors” may be widely accepted by psychologists, but it takes some fortitude to encounter a brilliant theorist and researcher determined to convince readers that we have “excessive confidence in what we think we know” and an apparent inability “to acknowledge the full extent of our ignorance and the uncertainty of the world we live in.”

  

deewilson13@aol.com

    
