Thursday, February 25, 2021

Book Notes: Thinking, Fast and Slow

Continuing my journey to make better decisions, I decided to read the book Thinking, Fast and Slow by Daniel Kahneman. The author takes us on a groundbreaking tour of the mind and explains the two systems that drive the way we think. System 1 is fast, intuitive, and emotional; System 2 is slower, more deliberative, and more logical.  

Who is this book for?

This book is a recommended read for everyone. It helps us understand how we think and how to avoid falling into traps that lead us into making bad decisions.

Disclaimer: This post is by no means a summary of the book; I would encourage everyone to go ahead and grab the book and give it a go.

Book Notes


Introduction

When faced with a difficult question, we often answer an easier one instead, usually without noticing the substitution. The spontaneous search for an intuitive solution sometimes fails - neither an expert solution nor a heuristic answer comes to mind. In such cases we often find ourselves switching to a slower, more deliberate and effortful form of thinking. This is slow thinking.

Fast thinking includes both variants of intuitive thought - the expert and the heuristic - as well as the entirely automatic mental activity of perception and memory, the operations that enable you to know there is a lamp on your desk or retrieve the name of the capital of Russia. The intuitive System 1 is more influential than your experience tells you, and it is the secret author of many of the choices and judgements you make.

The Characters of the Story

There are two systems in the mind: System 1 and System 2.

  • System 1 operates automatically and quickly, with little or no effort and no sense of voluntary control.
  • System 2 allocates attention to the effortful mental activities that demand it, including complex computations.
System 1 continuously generates suggestions for System 2: impressions, intuitions, intentions and feelings. If endorsed by System 2, impressions and intuitions turn into beliefs, and impulses turn into voluntary actions. When all goes smoothly, which is most of the time, System 2 adopts the suggestions of System 1 with little or no modification.

When System 1 runs into difficulty, it calls on System 2 to support more detailed and specific processing that may solve the problem of the moment. System 2 is mobilised when a question arises for which System 1 does not offer an answer. System 2 takes over when things get difficult, and it normally has the last word.

The division of labor between System 1 and System 2 is highly efficient: it minimises effort and optimises performance. The arrangement works well most of the time but System 1 has biases, systematic errors that it is prone to make in specified circumstances. Another limitation of System 1 is that it cannot be turned off. 

Conflict between an automatic reaction and an intention to control it is common in our lives. Because System 1 operates automatically and cannot be turned off at will, errors of intuitive thought are often difficult to prevent. Biases cannot always be avoided, because System 2 may have no clue to the error. Even when clues to likely errors are available, errors can be prevented only by the enhanced monitoring and effortful activity of System 2.

Attention and Effort


The response to mental overload is selective and precise: System 2 protects the most important activity, so it receives the attention it needs; "spare capacity" is allocated second by second to other tasks. The sophisticated allocation of attention has been honed by a long evolutionary history. Orienting and responding quickly to the gravest threats or most promising opportunities improved the chances of survival. 

As you become skilled in a task, its demand for energy diminishes. Studies of the brain have shown that the pattern of activity associated with an action changes as skill increases, with fewer brain regions involved. A general "law of least effort" applies to cognitive as well as physical exertion. People will eventually gravitate to the least demanding course of action. In the economy of action, effort is a cost, and the acquisition of skill is driven by the balance of benefits and costs. Laziness is built deep into our nature.

System 2 is the only one that can follow rules, compare objects on several attributes, and make deliberate choices between options. The automatic System 1 does not have these capabilities. A crucial capability of System 2 is the adoption of "task sets": it can program memory to obey an instruction that overrides habitual responses. 

The Lazy Controller


The maintenance of a coherent train of thought and the occasional engagement in effortful thinking also require self-control. Frequent switching of tasks and speeded-up mental work are not intrinsically pleasurable, and people avoid them when possible. This is how the law of least effort comes to be a law. A state of effortless concentration so deep that people lose their sense of time is called flow. Flow neatly separates the two forms of effort: concentration on the task and the deliberate control of attention.

System 1 has more influence on behaviour when System 2 is busy. People who are cognitively busy are also more likely to make selfish choices, use sexist language, and make superficial judgements in social situations. Self-control requires attention and effort. 

An effort of will or self-control is tiring; if you have had to force yourself to do something, you are less willing or less able to exert self-control when the next challenge comes around. This is called ego depletion. After exerting self-control in one task, you do not feel like making an effort in another, although you could do it if you really had to. Ego depletion is not the same mental state as cognitive busyness.

Those who avoid the sin of intellectual sloth could be called engaged. They are more alert, more intellectually active, less willing to be satisfied with superficially attractive answers, and more skeptical about their intuitions. The psychologist would call them more rational. System 1 is impulsive and intuitive; System 2 is capable of reasoning, and it is cautious, but at least for some people it is also lazy.

The Associative Machine


System 1 provides the impressions that often turn into your beliefs, and is the source of the impulses that often become your choices and your actions. It offers a tacit interpretation of what happens to you and around you, linking the present with the recent past and with expectations about the near future. It contains the model of the world that instantly evaluates events as normal or surprising. It is the source of your rapid and often precise intuitive judgements. It does most of this without your conscious awareness of its activities. System 1 is also the origin of many of the systematic errors in your intuitions.

Cognitive Ease


Cognitive ease has a range between Easy and Strained. Easy is a sign that things are going well. Strained indicates that a problem exists which will require increased mobilisation of System 2; this is called cognitive strain. Cognitive strain is affected by both the current level of effort and the presence of unmet demands. When you are in a state of cognitive ease, you are probably in a good mood, like what you see, believe what you hear, trust your intuitions, and feel that the current situation is comfortably familiar. You are also likely to be relatively casual and superficial in your thinking. When you feel strained, you are more likely to be vigilant and suspicious, invest more effort in what you are doing, feel less comfortable, and make fewer errors, but you are also less intuitive and less creative than usual.

The experience of familiarity has a simple but powerful quality of "pastness" that seems to indicate that it is a direct reflection of prior experience. The familiarity of one phrase in the statement sufficed to make the whole statement feel familiar, and therefore true. People can overcome some of the superficial factors that produce illusions of truth when strongly motivated to do so. But on most occasions, the lazy System 2 will adopt the suggestions of System 1 and march on.

A good mood is a signal that things are generally going well, the environment is safe, and it is all right to let one's guard down. A bad mood indicates that things are not going very well, there may be a threat, and vigilance is required. Cognitive ease is both a cause and a consequence of a pleasant feeling.

Norms, Surprises, and Causes


A large event is supposed to have consequences, and the consequences need causes to explain them. We have limited information about what happened on a day, and System 1 is adept at finding a coherent causal story that links the fragments of knowledge at its disposal.

People are prone to apply causal thinking inappropriately, to situations that require statistical reasoning. Statistical thinking derives conclusions about individual cases from properties of categories and ensembles. Unfortunately, System 1 does not have the capability for this mode of reasoning; System 2 can learn to think statistically, but few people receive the necessary training.

A Machine for Jumping to Conclusions


When uncertain, System 1 bets on an answer, and the bets are guided by experience. The rules of the betting are intelligent: recent events and the current context have the most weight in determining an interpretation. When no recent event comes to mind, more distant memories govern. Jumping to conclusions is efficient if the conclusions are likely to be correct, the cost of an occasional mistake is acceptable, and the jump saves much time and effort. Jumping to conclusions is risky when the situation is unfamiliar, the stakes are high, and there is no time to collect more information. These are the circumstances in which intuitive errors are probable, and they may be prevented by a deliberate intervention of System 2.

How Judgments Happen


System 1 carries out many computations at any one time. Some of these are routine assessments that go on continuously; no intention is needed to trigger them. But there are other computations which are undertaken only when needed. These occasional judgements are voluntary: they occur only when you intend them to. However, the control over intended computations is far from precise: we often compute much more than we want or need. This excess computation is called the mental shotgun.

Answer an Easier Question


If a satisfactory answer to a hard question is not found quickly, System 1 will find a related question that is easier and will answer it. This operation of answering one question in place of another is called substitution. The target question is the assessment you intend to produce. The heuristic question is the simpler question that you answer instead.

The Law of Small Numbers


The law of small numbers is a manifestation of a general bias that favours certainty over doubt. System 1 runs ahead of the facts in constructing a rich image on the basis of scraps of evidence. A machine for jumping to conclusions will act as if it believed in the law of small numbers. It will produce a representation of reality that makes too much sense. 
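
To see the statistics behind this bias, here is a minimal simulation (my own sketch, not from the book): draw samples of different sizes from the same 50/50 population and count how often a sample looks "extreme". Small samples produce extreme results far more often, purely by chance.

```python
import random

# Count how often a sample shows 80%+ of one kind, drawn from a
# perfectly balanced 50/50 population. Small samples are "extreme"
# far more often - pure chance, no story needed.
random.seed(42)

def extreme_rate(sample_size, trials=100_000, threshold=0.8):
    extreme = 0
    for _ in range(trials):
        heads = sum(random.random() < 0.5 for _ in range(sample_size))
        share = heads / sample_size
        if share >= threshold or share <= 1 - threshold:
            extreme += 1
    return extreme / trials

print(f"n=5:   {extreme_rate(5):.1%} of samples look extreme")   # ~37%
print(f"n=100: {extreme_rate(100):.1%} of samples look extreme") # ~0%
```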

Anchors


The anchoring effect occurs when people consider a particular value for an unknown quantity before estimating that quantity. The estimates stay close to the number that people considered. The effects of random anchors have much to tell us about the relationship between System 1 and System 2. System 2 works on data that is retrieved from memory, in an automatic and involuntary operation of System 1. System 2 is therefore susceptible to the biasing influence of anchors that make some information easier to retrieve. Furthermore, System 2 has no control over the effect and no knowledge of it.

The Science of Availability


The availability heuristic, like other heuristics of judgement, substitutes one question for another: you wish to estimate the size of a category or the frequency of an event, but you report an impression of the ease with which instances come to mind. Substitution of questions inevitably produces systematic errors. The ease with which instances come to mind is a System 1 heuristic, which is replaced by a focus on content when System 2 is more engaged. 

Availability, Emotion, and Risk


All heuristics are equal, but availability is more equal than the others. An availability cascade is a self-sustaining chain of events, which may start from media reports of a relatively minor event and lead up to public panic and large scale government action. We have a basic limitation in the ability of our mind to deal with small risks: we either ignore them altogether or give them far too much weight - nothing in between. 

Tom W's Speciality


System 1 generates an impression of similarity without intending to do so. We rely on representativeness when we judge the potential leadership of a candidate for office by the shape of his chin or the forcefulness of his speeches. Prediction by representativeness is not statistically optimal.

There are two ideas to keep in mind about Bayesian reasoning and how we tend to mess it up. The first is that base rates matter. The second is that intuitive impressions of the diagnosticity of evidence are often exaggerated. The combination of WYSIATI (What You See Is All There Is) and associative coherence tends to make us believe in the stories we spin for ourselves.
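
As a quick worked example (the numbers are invented, in the spirit of the Tom W problem): combine a base rate with a likelihood ratio using Bayes' rule and see how little a vivid description should actually move the estimate.

```python
# Suppose only 3% of graduate students are in the field (base rate), and
# the personality sketch is judged 4x more likely for students in that
# field (the likelihood ratio, which intuition tends to exaggerate).
def posterior(base_rate, likelihood_ratio):
    prior_odds = base_rate / (1 - base_rate)
    posterior_odds = prior_odds * likelihood_ratio
    return posterior_odds / (1 + posterior_odds)

# Even strong-seeming evidence leaves the probability low when the
# base rate is low.
print(f"{posterior(0.03, 4):.1%}")  # ~11.0%
```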

Linda: Less is More


The word fallacy is used, in general, when people fail to apply a logical rule that is obviously relevant. The conjunction fallacy is what people commit when they judge a conjunction of two events to be more probable than one of the events in a direct comparison. Representativeness belongs to a cluster of closely related basic assessments that are likely to be generated together. The most representative outcomes combine with the personality description to produce the most coherent stories. The most coherent stories are not necessarily the most probable, but they are plausible, and the notions of coherence, plausibility and probability are easily confused by the unwary.
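
Here is a minimal sketch of the logical rule the Linda problem violates: for any two events, P(A and B) can never exceed P(A). The numbers below are invented purely for illustration.

```python
# Whatever probabilities you plug in, the conjunction can never beat
# the single event.
p_teller = 0.05                 # P(Linda is a bank teller)
p_feminist_given_teller = 0.30  # P(feminist | bank teller)

p_both = p_teller * p_feminist_given_teller
print(p_both)               # 0.015
print(p_both <= p_teller)   # True - always
```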

In most situations, a direct comparison makes people more careful and more logical. But not always; sometimes intuition beats logic even when the correct answer stares you in the face.

Causes Trump Statistics


Statistical base rates are facts about a population to which a case belongs, but they are not relevant to the individual case. Causal base rates change your view of how the individual case came to be. The two types of base rate information are treated differently: 
  • Statistical base rates are generally underweighted, and sometimes neglected altogether. 
  • Causal base rates are treated as information about the individual case and are easily combined with other case-specific information.

Regression to the Mean


Regression to the mean is due to random fluctuations in the quality of performance. Regression effects are ubiquitous, and so are misguided causal stories to explain them. Our mind is strongly biased towards causal explanations and does not deal well with "mere statistics". When our attention is called to an event, associative memory will look for its cause - more precisely, activation will automatically spread to any cause that is already stored in memory. Causal explanations will be evoked when regression is detected, but they will be wrong, because the truth is that regression to the mean has an explanation but does not have a cause.
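
A minimal simulation makes the point concrete (my own sketch, not from the book): performance is stable talent plus luck, and the top performers on one test drift back toward the mean on the next - no causal story required.

```python
import random

# Performance = stable "talent" + random "luck". Select the top 10%
# on test 1 and watch their average fall toward the mean on test 2.
random.seed(1)

talent = [random.gauss(0, 1) for _ in range(10_000)]
test1 = [t + random.gauss(0, 1) for t in talent]  # talent + luck
test2 = [t + random.gauss(0, 1) for t in talent]  # same talent, fresh luck

top = sorted(zip(test1, test2), reverse=True)[:1000]  # top 10% on test 1
avg1 = sum(s1 for s1, _ in top) / len(top)
avg2 = sum(s2 for _, s2 in top) / len(top)
print(f"test 1 average of the top group: {avg1:.2f}")  # ~2.5
print(f"test 2 average, same people:     {avg2:.2f}")  # ~1.25 - regressed
```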

Taming Intuitive Predictions


We are capable of rejecting information as irrelevant or false, but adjusting for smaller weaknesses in the evidence is not something that System 1 can do. As a result, intuitive predictions are almost completely insensitive to the actual predictive quality of the evidence. When a link is found, WYSIATI applies: your associative memory quickly and automatically constructs the best possible story from information available. 

Intensity matching yields predictions that are as extreme as the evidence on which they are based, leading people to give the same answer to two quite different questions.

Correcting your intuitive predictions is a task for System 2. Significant effort is required to find the relevant reference category, estimate the baseline prediction, and evaluate the quality of the evidence. The effort is justified only when the stakes are high and when you are particularly keen not to make mistakes. Furthermore, you should know that correcting your intuitions may complicate your life. A characteristic of unbiased predictions is that they permit the prediction of rare or extreme events only when the information is very good. If you expect your predictions to be of modest validity, you will never guess an outcome that is either rare or far from the mean.
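
The correction recipe can be written in one line. A sketch with invented numbers: start from the baseline, then move toward your intuitive prediction only in proportion to how well the evidence actually predicts the outcome.

```python
# corrected = baseline + correlation * (intuitive - baseline)
def corrected_prediction(baseline, intuitive, correlation):
    return baseline + correlation * (intuitive - baseline)

# e.g. predicting a GPA: the class average is 3.0, your intuition says
# 3.9 after a glowing interview, but interviews correlate only ~0.3
# with GPA.
print(corrected_prediction(baseline=3.0, intuitive=3.9, correlation=0.3))  # ~3.27
```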

The Illusion of Understanding


A compelling narrative fosters an illusion of inevitability. The core of the illusion is that we believe we understand the past, which implies that the future should also be knowable; but the fact is that we understand the past less than we believe we do.

Hindsight bias has pernicious effects on the evaluations of decision makers. It leads observers to assess the quality of a decision not by whether the process was sound but by whether its outcome was good or bad.

The Illusion of Validity


Subjective confidence in a judgment is not a reasoned evaluation of the probability that the judgment is correct. Confidence is a feeling, which reflects the coherence of the information and the cognitive ease of processing it. It is wise to take admissions of uncertainty seriously, but declarations of high confidence mainly tell you that an individual has constructed a coherent story in his mind, not necessarily that the story is true. 

Intuitions vs. Formulas


To maximise predictive accuracy, final decisions should be left to formulas, especially in low-validity environments. Whenever we can replace human judgment by a formula, we should at least consider it. 
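
Here is a hedged sketch of the kind of simple formula the chapter argues for, in the spirit of the "improper linear models" Kahneman discusses: standardise a few relevant predictors and add them with equal weights, no expert tuning. The predictors and data below are invented for illustration.

```python
from statistics import mean, stdev

# Score each candidate by summing z-scores of a few predictors.
def equal_weight_score(candidate, pool, features):
    score = 0.0
    for f in features:
        values = [p[f] for p in pool]
        mu, sigma = mean(values), stdev(values)
        score += (candidate[f] - mu) / sigma
    return score

pool = [
    {"name": "A", "experience": 5, "test_score": 80},
    {"name": "B", "experience": 2, "test_score": 95},
    {"name": "C", "experience": 8, "test_score": 60},
]
for c in pool:
    print(c["name"], round(equal_weight_score(c, pool, ["experience", "test_score"]), 2))
```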

Expert Intuition: When Can We Trust It?


Intuition is nothing more and nothing less than recognition. The mystery of knowing without knowing is not a distinctive feature of intuition; it is the norm of mental life. The confidence that people have in their intuitions is not a reliable guide to their validity. In other words, do not trust anyone - including yourself - to tell you how much you should trust their judgment. Intuition cannot be trusted in the absence of stable regularities in the environment.

The Outside View


When forecasting the outcomes of risky projects, executives too easily fall victim to the planning fallacy. In its grip, they make decisions based on delusional optimism rather than on a rational weighting of gains, losses and probabilities. They overestimate benefits and underestimate costs. 

The Engine of Capitalism


The people who have greatest influence on the lives of others are likely to be optimistic and overconfident, and to take more risks than they realise. When action is needed, optimism, even of the mildly delusional variety, may be a good thing. The optimistic risk taking of entrepreneurs surely contributes to economic dynamism of a capitalistic society, even if most risk takers end up disappointed. 

The consequence of competition neglect is excess entry: more competitors enter the market than the market can profitably sustain, so their average outcome is a loss. The outcome is disappointing for the typical entrant in the market, but the effect on the economy as a whole could well be positive. 

The premortem has two main advantages: it overcomes the groupthink that affects many teams once a decision appears to have been made, and it unleashes the imagination of knowledgeable individuals in a much-needed direction. The main virtue of the premortem is that it legitimises doubts. It encourages even supporters of the decision to search for possible threats that they had not considered earlier.

Prospect Theory


In utility theory, the utility of a gain is assessed by comparing the utilities of two states of wealth. Many of the options we face in life are "mixed": there is a risk of loss and an opportunity for gain, and we must decide whether to accept the gamble or reject it. In the mixed case, the possible loss looms twice as large as the possible gain. When all our options are bad, we become a lot more risk seeking. Prospect theory and utility theory also fail to allow for regret. The two theories share the assumption that available options in a choice are evaluated separately and independently, and that the option with the highest value is selected. This assumption is certainly wrong.
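
A sketch of a prospect-theory-style value function makes the asymmetry visible. The curvature (alpha) and the loss-aversion coefficient (lam ~ 2) here are illustrative choices; the book reports loss-aversion ratios of roughly 1.5 to 2.5.

```python
# Value is attached to gains and losses, not to states of wealth, and
# losses are scaled up by the loss-aversion coefficient.
def value(x, alpha=0.88, lam=2.0):
    if x >= 0:
        return x ** alpha
    return -lam * ((-x) ** alpha)

print(round(value(100), 1))   # ~57.5   - the pull of gaining 100
print(round(value(-100), 1))  # ~-115.0 - the pain of losing 100, twice as strong
```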

The Endowment Effect


Tastes are not fixed; they vary with the reference point. Furthermore, the disadvantages of a change loom larger than its advantages, inducing a bias that favours the status quo.

Bad Events


Loss aversion refers to the relative strength of two motives: we are driven more strongly to avoid losses than to achieve gains. A reference point is sometimes the status quo, but it can also be a goal in the future: not achieving a goal is a loss, exceeding the goal is a gain. The aversion to failing to reach the goal is much stronger than the desire to exceed it.

The Fourfold Pattern


The decision weights that people assign to outcomes are not identical to the probabilities of these outcomes, contrary to the expectation principle. Improbable outcomes are overweighted-this is the possibility effect. Outcomes that are almost certain are underweighted relative to actual certainty. The expectation principle, by which values are weighted by their probability, is poor psychology. 

People attach values to gains and losses rather than to wealth, and the decision weights that they assign to outcomes are different from probabilities. Combined, these two insights produce what is called the fourfold pattern.
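
Laid out explicitly, the fourfold pattern described in the book looks like this:

  • High probability of a gain (certainty effect): risk averse - we accept an unfavourable settlement to lock in the gain.
  • Low probability of a gain (possibility effect): risk seeking - we buy lottery tickets.
  • High probability of a loss: risk seeking - we gamble in the hope of avoiding a sure loss.
  • Low probability of a loss: risk averse - we buy insurance.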



Rare Events


The emotional arousal is associative, automatic, and uncontrolled, and it produces an impulse for protective action. System 2 may know that the probability is low, but this knowledge does not eliminate the self-generated discomfort and the wish to avoid it. System 1 cannot be turned off. The emotion is not only disproportionate to the probability, it is also insensitive to the exact level of probability. The hypothesis suggests that focal attention and salience contribute to both the overestimation of unlikely events and the overweighting of unlikely outcomes.

Risk Policies


It is costly to be risk averse for gains and risk seeking for losses. These attitudes make you willing to pay a premium to obtain a sure gain rather than face a gamble, and also willing to pay a premium to avoid a sure loss. Both payments come out of the same pocket, and when you face both kinds of problems at once, the discrepant attitudes are unlikely to be optimal.

Decision makers who are prone to narrow framing construct a preference every time they face a risky choice. They would do better by having a risk policy that they routinely apply whenever a relevant problem arises. A risk policy is a broad frame that embeds a particular risky choice in a set of similar choices. The outside view and the risk policy are remedies against two distinct biases that affect many decisions: the exaggerated optimism of the planning fallacy and the exaggerated caution induced by loss aversion. The two biases oppose each other: exaggerated optimism protects individuals and organisations from the paralysing effects of loss aversion; loss aversion protects them from the follies of overconfident optimism.
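
A minimal simulation shows why the broad frame helps: a single coin flip to win 200 or lose 100 feels risky, but a policy of accepting every such favourable gamble is almost sure to come out ahead. The gamble follows the classic example discussed in the book; the simulation itself is my own sketch.

```python
import random

# Each gamble: 50% chance to win 200, 50% chance to lose 100.
random.seed(7)

def bundle(n_gambles):
    return sum(200 if random.random() < 0.5 else -100 for _ in range(n_gambles))

results = [bundle(100) for _ in range(10_000)]
avg = sum(results) / len(results)
loss_rate = sum(r < 0 for r in results) / len(results)
print(f"average result of 100 gambles: {avg:.0f}")      # ~5000
print(f"chance of an overall loss:     {loss_rate:.2%}") # well under 1%
```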

Keeping Score


The disposition effect is an instance of narrow framing. The decision to invest additional resources in a losing account, when better investments are available, is known as the sunk-cost fallacy, a costly mistake that is observed in decisions large and small. 

Decision makers know that they are prone to regret, and the anticipation of that painful emotion plays a part in many decisions. Intuitions about regret are remarkably uniform and compelling. The key is not the difference between commission and omission but the distinction between default options and actions that deviate from the default. When you deviate from the default, you can easily imagine the norm-and if the default is associated with bad consequences, the discrepancy between the two can be the source of painful emotions. 

Reversals


The emotional reactions of System 1 are much more likely to determine single evaluation; the comparison that occurs in joint evaluation always involves a more careful and effortful assessment, which calls for System 2. Rationality is generally served by broader and more comprehensive frames, and joint evaluation is obviously broader than single evaluation. Of course, you should be wary of joint evaluation when someone who controls what you see has a vested interest in what you choose.

Frames and Reality


Tendencies to approach or avoid are evoked by the words, and we expect System 1 to be biased in favour of the sure option when it is designated as KEEP and against that same option when it is designated as LOSE. Reframing is effortful and System 2 is normally lazy. Unless there is an obvious reason to do otherwise, most of us passively accept decision problems as they are framed and therefore rarely have any opportunity to discover the extent to which our preferences are frame-bound rather than reality-bound.

Two Selves


System 1 represents sets by averages, norms, and prototypes, not by sums. We want pain to be brief and pleasure to last. But our memory, a function of System 1, has evolved to represent the most intense moment of an episode of pain or pleasure (the peak) and the feelings when the episode was at its end. A memory that neglects duration will not serve our preference for long pleasure and short pains.
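
A sketch of the peak-end rule implied here: remembered discomfort tracks the average of the worst moment and the final moment, not the total. The pain profiles below are invented, but they mirror the book's cold-hand experiment, where adding a milder ending made the longer episode the one people preferred to repeat, despite more total pain.

```python
# Remembered pain = average of the peak and the end, ignoring duration.
def remembered_pain(episode):
    return (max(episode) + episode[-1]) / 2

short_trial = [4, 6, 8]        # ends at its worst moment
long_trial = [4, 6, 8, 5, 3]   # same start, plus a milder ending

print(sum(short_trial), remembered_pain(short_trial))  # total 18, remembered 8.0
print(sum(long_trial), remembered_pain(long_trial))    # total 26, remembered 5.5
```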

Life as a Story


Caring for people often takes the form of concern for the quality of their stories, not their feelings. Indeed, we can be deeply moved even by events that change the stories of people who are already dead.

Experienced Well-Being


The percentage of time that an individual spends in an unpleasant state is called the U-index. For example, if four hours of a sixteen-hour waking day are spent in an unpleasant state, the U-index for that day is 25%. An individual's mood at any moment depends on her temperament and overall happiness, but emotional well-being also fluctuates considerably over the day and the week. The mood of the moment depends primarily on the current situation.

Thinking About Life


The graph shows the level of satisfaction reported by people around the time they got married. 




This graph reliably evokes nervous laughter from people, and the nervousness is easy to understand: after all, people who decide to get married do so either because they expect it will make them happier or because they hope that making a tie permanent will maintain the present state of bliss. On their wedding day, the bride and the groom know that the rate of divorce is high and that the incidence of marital disappointment is even higher, but they do not believe that these statistics apply to them. This is an error of affective forecasting.

Any aspect of life to which attention is directed will loom large in a global evaluation. This is the essence of the focusing illusion. The word mis-wanting describes bad choices that arise from errors of affective forecasting. The focusing illusion is a rich source of mis-wanting. In particular, it makes us prone to exaggerate the effect of significant purchases or changed circumstances on our future well-being. The focusing illusion creates a bias in favour of goods and experiences that are initially exciting, even if they will eventually lose their appeal.


Conclusion 


I feel this book needs to be read multiple times to fully grasp all the lessons in it. It might feel like a big book, but stick with it and finish it; it's totally worth your time!

Have some Fun!