
Summary – Thinking Fast and Slow




Nikhil M. Varshney


Description and Subject Matter

Thinking, Fast and Slow is based on two systems of decision making: System 1, which is fast and responsible for intuitive decisions based on emotions, imagery, and associative memory; and System 2, which is slow and intervenes when the output of System 1 is insufficient, conflicting, or less than rational.

System 1 = The instant, unconscious, automatic, emotional, intuitive thinking.
System 2 = The slower, conscious, rational, reasoning, deliberate thinking.

Thinking, Fast and Slow by Daniel Kahneman provides insight into and an understanding of topics like psychology, perception, irrationality, decision making, errors of judgment, cognitive science, intuition, statistics, uncertainty, illogical thinking, stock market gambles, and behavioral economics.

The author further uses the example of prospect theory to demonstrate decision making under risk and uncertainty. The book shows that our intuition is biased and that we assume certain things without having thought them through carefully. The author calls these assumptions heuristics.

He shows that certain heuristics lead to muddled thinking, and gives each a name such as “halo effect,” “availability bias,” “associative memory,” and so forth. Kahneman reveals where we can and cannot trust our intuitions and how we can tap into the benefits of slow thinking. He offers practical and enlightening insights into how choices are made in both our business and our personal lives – and how we can use different techniques to guard against the mental glitches that often get us into trouble.

The plot of his book is how to “recognize situations in which mistakes are likely and try harder to avoid significant mistakes when stakes are high.”

Organization of the book:

The book is divided into five sections, namely two systems, heuristics and biases, overconfidence, choices, and two selves. (A detailed explanation of each section is provided below.)

Two systems: The book initially describes the two systems, System 1 and System 2, giving examples of how and when both are used by us and how they arrive at different results even when given the same input.

Heuristics and Biases: This section explains why we struggle to think statistically. Kahneman explains this phenomenon using heuristics and a landmark 1974 article on judgment under uncertainty.

Overconfidence: This section explains how flawed explanatory stories of the past shape our view of the world and our expectations of the future. In this section he talks about luck, and how outcomes that owe much to luck are instead attributed to talent, intentions, or stupidity. Kahneman talks about subjective confidence and the statistical algorithms that should form the basis for a decision.

Choices: Kahneman uses economics as the basis for explaining his work on prospect theory. He further expresses his thoughts on why problems should not be addressed in isolation and how other reference points can be used to gather more information. This section also advises on how to overcome the shortcomings of System 1.

Two Selves: Kahneman puts forward the argument that just as the two systems clash in the mind, the two selves (experiencing and remembering) clash over the quality of experiences. The experiencing self is the part that lives the life, and the remembering self is the part that evaluates these experiences, draws lessons from them, and makes decisions for the future.

Major Findings (quoted passages are references from the book):

Thinking, Fast and Slow is a comprehensive survey book. It provides a foundational understanding and an outstanding summary of the current state of research into the psychology of decision making. It is really interesting – and I found it quite encouraging and affirming – to learn how much we can influence this, and how much of a difference the small things we do, think, and say can make. Some of the major takeaways and a summary of the book are:

Part I: Two Systems

Any time a decision is to be made, our mind applies two systems, System 1 and System 2, to it. System 1 operates on heuristics, may not be accurate, and is error prone, while System 2 requires effort to evaluate those heuristics. Because System 2 is effortful, we let System 1 take prime control of making decisions.

Conscious and subconscious exposure to an idea primes us to think of associated ideas, an effect called priming. For example, when we are thinking about food, the blank in SO_P will be filled with ‘U’, but when we are thinking about cleanliness the blank will be filled with ‘A’. The author calls this the ideomotor effect (pg 53).

Kahneman states that we tend to make associations between events, circumstances, and occurrences, which he calls cognitive and associative ease. The more these things fit into our stories, the more real they seem. We explain them by 1) assuming intention, “It was meant to happen,” 2) causality, “They’re homeless because they’re lazy,” or 3) interpreting providence, “There’s a divine purpose in everything” (pg 76).

We also tend to believe things that are easier to compute, familiar, and easy to read; they seem truer than things that require hard thought, are novel, or are hard to see. Because of this we make more out of coincidences than the statistics warrant, and we never get around to answering the harder question.

Based on the conditions and heuristics mentioned above, we have the tendency to jump to conclusions. Here the author introduces another term, confirmation bias, which is the tendency to search for and find confirming evidence for a belief while overlooking counterexamples. The author states another reason for jumping to conclusions: the halo effect. The halo effect is the tendency to like (or dislike) everything about a person, including things we have not observed (pg 82).

A good first impression tends to color later negative impressions, and vice versa. The problem with our intuitive judgments is that they are impulsive. To remind System 1 not to jump to conclusions, and to enlist the evaluative skills of System 2, Kahneman coined the term WYSIATI, What You See Is All There Is, meaning to stay focused on the hard data before us.

System 1 has the characteristic of substitution. When we are confronted with difficult questions, we have the tendency to substitute alternative questions that are easier to understand and answer. For example, when asked what happiness is, we generally relate it to our current mood. Kahneman further states that our likes and dislikes affect our decisions and theories: we tend to believe what we like, and vice versa. In short, emotions affect our judgments.

Part II: Heuristics and Biases

System 1 tends to automatically assign causal relations and neglects statistics, so the larger variability of small populations leads to sample-size artifacts, which Kahneman calls the law of small numbers.

The law of small numbers states that we tend to believe in statistics drawn from small sample sizes. Kahneman states that this should not happen, as small numbers do not reflect the complete picture. We err when we intuit rather than compute, and we make decisions on insufficient data. Further, we have a tendency to make stories out of scraps. System 2 doubts these stories, but doubting requires a lot of work, so our lazy System 2 sometimes fails to do its job. The anchoring effect is another heuristic, which produces incorrect estimates because of previously heard quantities.

For example, 35 mph may seem fast if you previously drove at 10 mph but may seem slow if you previously drove at 65 mph. Potential sources of error are making connections where none exist and being more suggestible than we realize.
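To make the law of small numbers concrete, here is a minimal simulation of my own (not an example from the book): the same fair coin produces "extreme" proportions far more often in small samples than in large ones, which is exactly the variability our intuition underestimates.

```python
import random

# Illustrative simulation (not from the book): small samples from the same
# fair coin look "extreme" far more often than large samples do.
random.seed(42)

def share_of_extreme_samples(sample_size, trials=10_000, threshold=0.7):
    """Fraction of samples whose proportion of heads is >= threshold
    or <= 1 - threshold, even though the true rate is 0.5."""
    extreme = 0
    for _ in range(trials):
        heads = sum(random.random() < 0.5 for _ in range(sample_size))
        proportion = heads / sample_size
        if proportion >= threshold or proportion <= 1 - threshold:
            extreme += 1
    return extreme / trials

for n in (10, 100, 1000):
    print(f"sample size {n:4d}: {share_of_extreme_samples(n):.1%} of samples look extreme")
```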

Also, we are prone to give bigger answers to questions whose answers are easier to retrieve, for example when an emotional personal experience is attached to the question. Kahneman calls this the availability heuristic.

Hence we depend more on retrieved data than on statistical data. Similar to profiling or stereotyping, “representativeness” is the intuitive leap to make judgments based on how similar something is to something we already know, without taking into consideration other factors: probability, statistics, or sample sizes.

We fail to analyze things on their salient features because most of the time we compare them with similar things, places, or persons. To highlight this, the author uses the Linda example, whose objective is to show the logic of probability: people believed a plausible story over a probable one. Linda, a hypothetical character, was created; after reading a priming description of Linda, respondents were more likely to assign her two characteristics together, even though the conjunction of two characteristics is always statistically less probable than either one alone.
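A short sketch of the probability logic behind the Linda example, using made-up numbers rather than anything from the book: the conjunction of two characteristics can never be more probable than either characteristic on its own, yet the richer story feels more plausible.

```python
# Hypothetical numbers, chosen only for illustration.
p_bank_teller = 0.05                 # assumed probability that Linda is a bank teller
p_feminist_given_teller = 0.20       # assumed probability she is a feminist, given that

p_feminist_bank_teller = p_bank_teller * p_feminist_given_teller

print(f"P(bank teller)          = {p_bank_teller:.3f}")
print(f"P(feminist bank teller) = {p_feminist_bank_teller:.3f}")
# The conjunction is always <= the single event, no matter what numbers we assume.
assert p_feminist_bank_teller <= p_bank_teller
```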

When given statistical data we generally make correct decisions, but when a story is added that explains things, we tend to go with the story rather than the statistics and hence overlook them. We generally stereotype and draw general conclusions from particular instances rather than making particular inferences from general cases. Our mind is strongly biased toward causal explanations and does not deal well with mere statistics.

The conclusions we draw with strong intuition breed overconfidence. We fail to understand that just because a thing feels right does not make it right. Here we need System 2 to slow down and examine our estimates, check for regression to the mean, evaluate the quality of evidence, and so on. “Extreme predictions and a willingness to predict rare events from weak evidence are both manifestations of System 1. It is natural for the associative machinery to match the extremeness of predictions to the perceived extremeness on which it is based—this is how substitution works” (pg 194).

Part III: Overconfidence

In an attempt to make sense of the world we often create flawed explanatory stories of the past that shape our views of the world and expectations of the future. We think we understand the past, which implies the future should be knowable, but in fact we understand the past less than we believe we do. “We are prone to blame decision makers for good decisions that worked out badly and to give them too little credit for successful moves that appear obvious only after the fact” (pg 203).

We overlook statistical information and favor our gut feelings. We make decisions based on delusional optimism rather than on rational weighting. We believe the outcomes of events truly lie in our hands, neglecting luck, and we don’t appreciate the uncertainty of the environment.

“Experts who acknowledge the full extent of their ignorance may expect to be replaced by more confident competitors, who are better able to gain the trust of clients” (pg 263). We indulge in unwarranted optimism that does not calculate the odds and is hence risky, which is a mistake.

Part IV: Choices

We often think every object has only an intrinsic, objective value. We make decisions purely on logic without considering psychological states. When we believe in a theory, it is very difficult to find flaws in it. “If you come upon an observation that does not seem to fit the model, you assume that there must be a perfectly good explanation that you are somehow missing” (pg 277).

Kahneman developed prospect theory to counter the view of economists who thought that money is the sole determinant in explaining why people spend, buy, and gamble the way they do. Prospect theory explains that the absolute value of money is less important than the subjective experience of changes in one’s wealth. The author provides an example: we experience diminishing sensitivity to changes in wealth.

Losing $100 will hurt more if we start with $200 but less if we start with $1,000. We hate losing money. “You just like winning and dislike losing—and you almost certainly dislike losing more than you like winning” (pg 281). People generally work harder to minimize losses than to gain more, and we underestimate our own and others’ attitudes toward loss and gain.
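For readers who want to see the shape of the idea, here is a rough sketch of a prospect-theory-style value function. The functional form and the parameters below (alpha around 0.88, lambda around 2.25) come from Tversky and Kahneman’s later work on cumulative prospect theory rather than from this summary, so treat them as illustrative assumptions only.

```python
# Sketch of a prospect-theory-style value function (parameters are assumptions
# taken from later cumulative prospect theory estimates, used for illustration).
ALPHA = 0.88    # curvature: diminishing sensitivity to larger changes
LAMBDA = 2.25   # loss aversion: losses loom larger than equal-sized gains

def value(change_in_wealth):
    """Subjective value of a gain or loss relative to the reference point."""
    if change_in_wealth >= 0:
        return change_in_wealth ** ALPHA
    return -LAMBDA * ((-change_in_wealth) ** ALPHA)

for x in (100, -100, 1000, -1000):
    print(f"change {x:+5d} -> subjective value {value(x):8.1f}")

# Losing $100 "hurts" roughly 2.25 times as much as winning $100 feels good,
# and winning $1,000 does not feel anywhere near ten times as good as winning $100.
```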

An object we own and use is more valuable to us than an object we don’t use or own, and we cling to objects for sentimental reasons at a considerable loss of income. People generally attach more weight to gains and losses than to total wealth. Kahneman explains this through 1) the possibility effect, where highly unlikely outcomes are given more weight than they deserve; 2) the certainty effect, where outcomes that are almost certain are given less weight than their probability justifies; and 3) the expectation principle: “decision weights that people assign to outcomes are not identical to the probabilities of these outcomes, contrary to the expectation principle” (pg 312).

Because of these factors we become highly sensitive to risks. Kahneman says one way to reduce risk is to think broadly, for example by calculating the expected winnings over many small gambles. This is a non-intuitive job and hence requires System 2 to operate, but we default to System 1, think irrationally, and lose extra money. Hence we should learn to frame risks in our favor.
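A small simulation of my own, illustrating the "think broadly" advice: a single favorable gamble still loses half the time, but the aggregate of many such gambles almost never does, which is why evaluating risks one at a time makes us too cautious.

```python
import random

# Illustrative simulation (my own example, not from the book).
random.seed(1)

def gamble():
    """One hypothetical coin flip: win $200 on heads, lose $100 on tails."""
    return 200 if random.random() < 0.5 else -100

trials = 10_000
single_loss_rate = sum(gamble() < 0 for _ in range(trials)) / trials
portfolio_loss_rate = sum(
    sum(gamble() for _ in range(100)) < 0 for _ in range(trials)
) / trials

print(f"chance of losing on a single gamble:    {single_loss_rate:.0%}")
print(f"chance of losing over 100 such gambles: {portfolio_loss_rate:.2%}")
```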

Kahneman further describes another of our heuristics, the disposition effect, in which we are often willing to sell a money-making stock because that makes us feel like a capable and learned investor, while we keep the losing stock because selling it is an admission of defeat.

We make decisions differently when asked to decide in isolation than when asked to decide in comparison with other scenarios. “Joint evaluation highlights a feature that was not noticeable in single evaluation but is recognized as decisive when detected” (pg 359). We should always make decisions in comparison, whether buying products online or comparing salaries for different jobs. Failure to do so limits our exposure to helpful norms.

Part V: The Two Selves

Everyone has two selves, the experiencing self and the remembering self, and of the two the latter usually takes precedence. “Confusing experience with the memory of it is a compelling cognitive illusion—and it is the substitution that makes us believe a past experience can be ruined. The experiencing self does not have a voice” (pg 381).

Kahneman then explains the “peak-end rule,” by which we remember an experience roughly as the average of how it felt at its most intense moment (the peak) and how it felt at its end, while largely neglecting its duration or quantity. It is absurd, but people willingly choose more pain over a longer period that ends pleasantly over less pain over a shorter period that ends terribly.
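A tiny illustration of the peak-end rule with hypothetical pain ratings (not data from the book): the longer episode contains strictly more total pain, yet its peak-end memory is milder, which is why people can prefer it in retrospect.

```python
# Hypothetical per-minute pain ratings, used only to illustrate the peak-end rule.
def peak_end_memory(pain_per_minute):
    """Remembered intensity: average of the worst moment and the final moment."""
    return (max(pain_per_minute) + pain_per_minute[-1]) / 2

def total_pain(pain_per_minute):
    return sum(pain_per_minute)

short_episode = [6, 7, 8]          # shorter episode that ends at its worst
long_episode = [6, 7, 8, 4, 2]     # same episode plus a milder ending

for name, episode in (("short", short_episode), ("long", long_episode)):
    print(f"{name}: total pain = {total_pain(episode)}, remembered = {peak_end_memory(episode)}")
```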

The major learning from this topic is the danger of not paying attention to what we are doing, letting experience happen without reflection, and going with the flow with no attempt to alter our schedule and experiences.

“The way to block errors that originate in System 1 is simple in principle: recognize the signs that you are in a cognitive minefield, slow down, and ask for reinforcement from System 2,” (p 417).

Understanding of Decision Models through Thinking Fast and Slow

Thinking, Fast and Slow is a trip through a lifetime of research in judgment and decision making by one of the world’s preeminent scholars. Kahneman suggests that intuition plays a very important role in decision making, whether for an expert or an ordinary individual.

In this book Kahneman suggests various ways the decision-making process can be improved by improving the quality of intuition.

Kahneman further states that slowing down and engaging System 2 helps us make better decisions. Various measures suggested by the author to improve intuition are described in the sections above.

Decision modeling is all about SMART objectives, correct probabilities, risks and uncertainties, heuristics and biases, and finally combining all these factors to make a decision. Some of the techniques from the book that enrich decision modeling are:

Jumping to conclusions: Certain decisions fail because we have the tendency to jump to conclusions. The way to block errors that originate in System 1 is to slow down and ask for reinforcement from System 2. This is why organizations are better than individuals: they think slowly and have the power to impose orderly procedures.

Bringing System 2 into action every time we face a decision is difficult and practically infeasible. The best we can do is compromise and learn to recognize the situations in which the decision to be made requires System 2 to take action.

Statistics: Overestimating the probability of an unlikely event, or underestimating the probability of a likely event, increases the chance of making a poor decision. Hence, understanding the statistics and choosing an appropriate sample size are extremely important for making a good decision. To maximize predictive accuracy, final decisions should be left to formulas, especially in low-validity environments. Decisions should always be made under rational weighting rather than delusional optimism.
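As a hedged sketch of what "leave the final decision to a formula" can look like in practice, here is an equal-weight score over a few standardized factors, in the spirit of the simple checklists and interview scores Kahneman describes; the factors and ratings are invented for illustration.

```python
# Invented factors and ratings, used only to illustrate an equal-weight scoring formula.
FACTORS = ["technical skill", "reliability", "communication", "initiative"]

def formula_score(ratings):
    """Equal-weight average of 1-5 ratings across the chosen factors."""
    return sum(ratings[f] for f in FACTORS) / len(FACTORS)

candidate_a = {"technical skill": 4, "reliability": 5, "communication": 3, "initiative": 4}
candidate_b = {"technical skill": 5, "reliability": 2, "communication": 4, "initiative": 3}

print("candidate A:", formula_score(candidate_a))   # 4.0
print("candidate B:", formula_score(candidate_b))   # 3.5
# Rank by the score first, then let intuition weigh in, instead of letting a
# vivid interview impression override the data.
```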

Experts and Risks: Experts rely on individual skill, feedback, and practice. We should not put much trust in the judgment of an expert in a field where challenges vary greatly, where luck determines success, and where there is a big gap between action and feedback. This is because the expert’s System 1 provides them with quick answers to difficult questions, and at the same time our System 2 is unable to detect the mistake.

We tend to misjudge rare events or give them too much weight when making decisions. Hence making a good decision depends on paying attention to where the information comes from, understanding how it is formed, assessing our confidence in making a prediction, and validating the source.

The deliberate avoidance of exposure to short-term outcomes improves the quality of both decisions and outcomes. The outside view and the risk policy are remedies against two distinct biases that affect many decisions: the exaggerated optimism of the planning fallacy and the exaggerated caution induced by loss aversion.

Fear governs decisions. If we apply these lessons to workplace learning and performance, we can easily see not only the need to identify and correct misperceptions early in the learning process, but also the need to instill in ourselves and our learners an awareness of how easily we can be affected and influenced by stressful, emotional situations and by inaccurate sources of information, with little regard for their veracity.

Connections with personal life

There are various topics to which I could personally relate my experiences, especially the halo effect and the failure to follow statistics and algorithms. I generally formed decisions based on certain characteristics of an object without going through all of them.

These decisions related not only to physical objects but also to judging a person’s character. Reading TFS has surely changed my outlook on how to form a decision.

Professionally, while working at Bechtel Corporation, I had to make certain decisions with uncertain information. The general trend was to follow the assumptions made on past projects and proceed, since the actual information arrives at a very late stage and waiting for it may delay the project schedule.

After reading TFS I can relate to an incident where such a decision proved costly: a pump selected based on past parameters did not meet the static head requirements, and cavitation occurred. Reading TFS made me realize that had the parameters and references been verified, and had the opinion of the experts not been accepted blindly, this situation could have been avoided.

Another incident involved sizing the relief vent on a nitrogen storage tank. A similar project had used a 1" safety vent, and 1" was the standard design. However, I checked the calculation independently, keeping the other salient facts open, and the vent size for the new project was confirmed at 3", saving the company a huge cost.

Discussion beyond the book

Examples from the book can be taken directly and applied to personal and work-life experiences.

The concepts discussed in the book can surely help EPC (engineering, procurement and construction) companies evaluate the uncertainties in the price of a project, where experts make very crucial decisions and the manager has to trust the decisions of those experts.

This book will help in validating the decisions taken by experts by engaging the manager’s System 2. Along with expert opinions, statistical data and probability analysis will surely help managers reach better decisions.

There is a lot of insight that HR professionals can take from the book, which will help them in allocating and deciding performance bonuses; a true justification can be achieved for how the bonuses are distributed. The concepts of the illusion of understanding and the illusion of validity can be applied here. Fields such as neuroscience can also benefit from better decision making about patients.

Entertainment value

The book is difficult to start and get accustomed to, but once you get a grip on the topic it becomes really intriguing. I had planned to read this book in one go, but realized it requires a slow read. Complex topics are explained with easy examples that can be related to the real world. I would give it 8/10.

Recommendation

The topics that Nobel Prize winner Daniel Kahneman addresses are both complex and integral to the human mind: he asks us to think about thinking by considering how our minds habitually contradict themselves, distort data, and mislead us. His prose is lucid, his reasoning rigorous, and his honesty refreshing – more than once Kahneman illustrates conflicted thinking with examples from his own life.

It is a fairly slow read, but an ultimately rewarding experience. I would recommend this book to anyone interested in improving their thinking about thinking. Industrial engineers can grasp a lot of concepts about decision making that will surely help them become better managers and leaders in the future. This book has rearranged the way I think about making decisions.
