What I learned from “Thinking Fast and Slow”

I recently finished reading Thinking Fast and Slow, a book on behavioral psychology and decision-making by Daniel Kahneman. This book contains some profoundly important concepts about how people make decisions. It will help you understand why humans sometimes make errors in judgment, and how to spot the signs that you yourself may be about to make a System 1 error. Here are some of the most important takeaways from the book.

We have a two-system way of thinking: System 1 (Thinking Fast) and System 2 (Thinking Slow).

System 1 is the intuitive, “gut reaction” way of thinking and making decisions. System 2 is the analytical, “critical thinking” way of making decisions. System 1 forms “first impressions” and is often the reason we jump to conclusions. System 2 does reflection, problem-solving, and analysis.

We spend most of our time in System 1.

Most of us identify with System 2 thinking. We consider ourselves rational, analytical human beings. Thus, we think we spend most of our time engaged in System 2 thinking.

Actually, we spend almost all of our daily lives engaged in System 1 (Thinking Fast). Only when we encounter something unexpected, or when we make a conscious effort, do we engage System 2 (Thinking Slow). Kahneman wrote:

“Systems 1 and 2 are both active whenever we are awake. System 1 runs automatically and System 2 is normally in a comfortable low-effort mode, in which only a fraction of its capacity is engaged. System 1 continuously generates suggestions for System 2: impressions, intuitions, intentions, and feelings. If endorsed by System 2, impressions and intuitions turn into beliefs, and impulses turn into voluntary actions. When all goes smoothly, which is most of the time, System 2 adopts the suggestions of System 1 with little or no modification. You generally believe your impressions and act on your desires, and that is fine – usually.

“When System 1 runs into difficulty, it calls on System 2 to support more detailed and specific processing that may solve the problem of the moment. System 2 is mobilized when a question arises for which System 1 does not offer an answer… System 2 is activated when an event is detected that violates the model of the world that System 1 maintains.”

So System 1 is continuously creating impressions, intuitions, and judgments based on everything we are sensing. In most cases, we just go with the impression or intuition that System 1 generates. System 2 only gets involved when we encounter something unexpected that System 1 can’t automatically process.

System 1 thinking seeks a coherent story above all else, and often leads us to jump to conclusions.

While System 1 is generally very accurate, there are situations where it makes systematic errors of bias. System 1 sometimes answers an easier question than the one it was asked, and it has little knowledge of logic and statistics.

One of the biggest problems with System 1 is that it seeks to quickly create a coherent, plausible story (an explanation for what is happening) by relying on associations and memories, pattern-matching, and assumptions. And System 1 will default to that plausible, convenient story, even if that story is based on incorrect information.

“The measure of success for System 1 is the coherence of the story it manages to create. The amount and quality of the data on which the story is based are largely irrelevant. When information is scarce, which is a common occurrence, System 1 operates as a machine for jumping to conclusions.”

WYSIATI: What you see is all there is.

Kahneman writes extensively about how people jump to conclusions on the basis of limited information. He has an acronym for this phenomenon: WYSIATI, “what you see is all there is.” WYSIATI causes us to “focus on existing evidence and ignore absent evidence.” As a result of WYSIATI, System 1 often quickly creates a coherent and believable story based on limited evidence. These impressions and intuitions can then be endorsed by System 2 and turn into deep-rooted values and beliefs. WYSIATI can cause System 1 to “infer and invent causes and intentions,” whether or not those causes or intentions are real.

“System 1 is highly adept in one form of thinking – it automatically and effortlessly identifies causal connections between events, sometimes even when the connection is spurious.”

This is why people jump to conclusions, assume bad intentions, give in to prejudices or biases, and buy into conspiracy theories. They focus on the limited available evidence and do not consider absent evidence. They invent a coherent story, causal relationships, or underlying intentions. Their System 1 then quickly forms a judgment or impression, which in turn gets quickly endorsed by System 2.

Because of WYSIATI and System 1 thinking, people may make flawed judgments and decisions, driven by biases and heuristics.

There are several potential errors in judgment that people may make when they over-rely on System 1 thinking:

  • Law of small numbers: People don’t understand statistics very well. As a result, they may look at the results of a small sample (e.g., 100 people responding to a survey) and conclude that it’s representative of the whole population. This also explains why people jump to conclusions from just a few data points or limited evidence. If three people said something, then maybe it’s true? If you personally observe one incident, you are more likely to generalize it to the whole population. (A quick simulation of this effect appears after this list.)
  • Assigning cause to random chance: As Kahneman wrote, “statistics produce many observations that appear to beg for causal explanations but do not lend themselves to such explanations. Many facts of the world are due to chance, including accidents of sampling. Causal explanations of chance events are inevitably wrong.”
  • Illusion of understanding: People often create flawed explanations for past events, a phenomenon known as the narrative fallacy. These “explanatory stories that people find compelling are simple; are concrete rather than abstract; assign a larger role to talent, stupidity, and intentions than to luck; and focus on a few striking events that happened rather than on the countless events that failed to happen… Good stories provide a simple and coherent account of people’s actions and intentions. You are always ready to interpret behavior as a manifestation of general propensities and personality traits – causes that you can readily match to effects.”
  • Hindsight bias: People will reconstruct a story around past events to underestimate the extent to which they were surprised by those events. This is an “I-knew-it-all-along” bias. If an event comes to pass, people exaggerate the probability that they knew it was going to occur. If an event does not occur, people erroneously recall that they thought it was unlikely.

“Hindsight bias has pernicious effects on the evaluations of decision makers. It leads observers to assess the quality of a decision not by whether the process was sound, but by whether its outcome was good or bad… We are prone to blame decision makers for good decisions that worked out badly and to give them too little credit for successful moves that appear obvious only after the fact… When the outcomes are bad, [people] often blame [decision makers] for not seeing the handwriting on the wall… Actions that seemed prudent in foresight can look irresponsibly negligent in hindsight.”

  • Confirmation bias: Because of WYSIATI, people are quick to seize on limited evidence that confirms their existing perspective. And they will ignore, or fail to seek out, evidence that runs contrary to the coherent story they have already created in their minds.
  • Overconfidence: Due to the illusion of understanding and WYSIATI, people may become overconfident in their predictions, judgments, and intuitions. “We are confident when the story we tell ourselves comes easily to mind, with no contradiction and no competing scenario… A mind that follows WYSIATI will achieve high confidence much too easily by ignoring what it does not know. It is therefore not surprising that many of us are prone to have high confidence in unfounded intuitions.”
  • Over-optimism: People have a tendency to create plans and forecasts that are “unrealistically close to best-case scenarios.” When forecasting the outcomes of risky projects, people tend to make decisions “based on delusional optimism rather than on a rational weighting of gains, losses, and probabilities. They overestimate benefits and underestimate costs. They spin scenarios of success while overlooking the potential for mistakes and miscalculations… In this view, people often (but not always) take on risky projects because they are overly optimistic about the odds.”
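
The law of small numbers is easy to see in a quick simulation. The sketch below is my own illustration, not anything from the book: it flips a fair coin in samples of different sizes and counts how often a sample looks noticeably “biased” (60% or more heads) purely by chance. The 60% threshold, the sample sizes, and the trial count are arbitrary choices for the demo.

```python
import random

def lopsided_rate(sample_size: int, trials: int = 10_000, threshold: float = 0.6) -> float:
    """Fraction of samples of a fair coin showing >= threshold heads.

    Illustrative only: the threshold and trial count are arbitrary.
    """
    lopsided = 0
    for _ in range(trials):
        # Flip a fair coin sample_size times and count heads.
        heads = sum(random.random() < 0.5 for _ in range(sample_size))
        if heads / sample_size >= threshold:
            lopsided += 1
    return lopsided / trials

for n in (10, 100, 1_000):
    print(f"n={n:5d}: {lopsided_rate(n):6.1%} of samples show >=60% heads")
```

With samples of 10 flips, roughly a third of samples show 60% or more heads; with 1,000 flips, it essentially never happens. A System 1 reading of a small sample would happily “see” a biased coin, and invent a cause for it, where there is only chance.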

There are many other heuristics and biases that Kahneman describes, including those around evaluating risk and losses.

What did we learn?

  • System 1 (Thinking Fast) often leads individuals to make snap judgments, jump to conclusions, and make erroneous decisions based on biases and heuristics.
  • System 1 is always on, constantly producing fast impressions, intuitions, and judgments. System 2 is used for analysis, problem-solving, and deeper evaluation.
  • Most of the time, we go with System 1’s recommendations because of cognitive ease. Sometimes we engage System 2, when we encounter something unexpected or when we make a conscious effort to slow down our thinking and take a critical view.
  • System 1 seeks to produce a coherent and believable story based on the available information. This often leads us to WYSIATI: focusing on the limited available evidence and ignoring important but absent evidence. WYSIATI can lead us to jump to conclusions about people’s intentions, to assign causal relationships where there were none, and to form snap (but incorrect) judgments and impressions.
  • WYSIATI and System 1 thinking can lead to a number of judgment biases, including the law of small numbers, assigning cause to chance, hindsight bias, and overconfidence.

Reading this book has had a profound impact on my own worldview. In the past, I have been taken aback when I observed that someone was “assuming the worst intentions of others.” I have also struggled to understand how someone could create in their mind such a different narrative of past events, despite seeing the same evidence that I had seen. And finally, I have sometimes been shocked by the biases, prejudices, and “snap judgments” I have seen from others. Thinking Fast and Slow has given me a new perspective on these behaviors and judgments.

I can now apply some of this knowledge to situations where I see others (or catch myself) relying too much on System 1 thinking. We will never be able to avoid relying on System 1 for most of our daily lives. The important thing is to recognize when I or others are leaning on it too heavily, and to force more System 2 thinking into the situation.
