Daniel Kahneman - Thinking, Fast and Slow - 2011

Created: February 8, 2018 / Updated: July 24, 2025 / Status: finished / 35 min read (~6909 words)
psychology

  • Many of us spontaneously anticipate how friends and colleagues will evaluate our choices
  • Systematic errors are known as biases
  • We are often confident even when we are wrong, and an objective observer is more likely to detect our errors than we are

  • The situation has provided a cue; this cue has given the expert access to information stored in memory, and the information provides the answer. Intuition is nothing more and nothing less than recognition - Herbert Simon
  • Intuitive heuristics: when faced with a difficult question, we often answer an easier one instead, usually without noticing the substitution

  • System 1 operates automatically and quickly, with little or no effort and no sense of voluntary control
  • System 2 allocates attention to the effortful mental activities that demand it, including complex computations. The operations of System 2 are often associated with the subjective experience of agency, choice, and concentration
  • The gorilla study illustrates two important facts about our minds: we can be blind to the obvious, and we are also blind to our blindness

  • System 1 and 2 are both active whenever we are awake
  • System 2 is normally in a comfortable low-effort mode, in which only a fraction of its capacity is engaged
  • System 1 continuously generates suggestions for System 2: impressions, intuitions, intentions, and feelings
  • System 1 has biases, systematic errors that it is prone to make in specified circumstances
    • It sometimes answers easier questions than the one it was asked
    • It has little understanding of logic and statistics
  • One limitation of System 1 is that it cannot be turned off
  • To resist the illusion, there is only one thing you can do: you must learn to mistrust your impressions of the length of the lines when fins are attached to them
  • Even when cues to likely errors are available, errors can be prevented only by the enhanced monitoring and effortful activity of System 2
  • Constantly questioning our own thinking would be impossibly tedious, and System 2 is much too slow and inefficient to serve as a substitute for System 1 in making routine decisions. The best we can do is a compromise: learn to recognize situations in which mistakes are likely and try harder to avoid significant mistakes when the stakes are high

  • There are vital tasks that only System 2 can perform because they require effort and acts of self-control in which the intuitions and impulses of System 1 are overcome
  • The response to mental effort is distinct from emotional arousal
  • Much like the electricity meter outside your house or apartment, the pupils offer an index of the current rate at which mental energy is used
  • The response to mental overload is selective and precise: System 2 protects the most important activity, so it receives the attention it needs; "spare capacity" is allocated second by second to other tasks
  • As you become skilled in a task, its demand for energy diminishes
  • Highly intelligent individuals need less effort to solve the same problems, as indicated by both pupil size and brain activity
  • A general "law of least effort" applies to cognitive as well as physical exertion. The law asserts that if there are several ways of achieving the same goal, people will eventually gravitate to the least demanding course of action
  • System 2 is the only one that can follow rules, compare objects on several attributes, and make deliberate choices between options
  • System 1 detects simple relations and excels at integrating information about one thing, but it does not deal with multiple distinct topics at once, nor is it adept at using purely statistical information
  • A crucial capability of System 2 is the adoption of "task sets": it can program memory to obey an instruction that overrides habitual responses
  • Psychologists speak of "executive control" to describe the adoption and termination of task sets, and neuroscientists have identified the main regions of the brain that serve the executive function. One of these regions is involved whenever a conflict must be resolved. Another is the prefrontal area of the brain, a region that is substantially more developed in humans than in other primates, and is involved in operations that we associate with intelligence
  • Switching from one task to another is effortful, especially under time pressure
  • The ability to control attention is not simply a measure of intelligence
  • We normally avoid mental overload by dividing our tasks into multiple easy steps, committing intermediate results to long-term memory or to paper rather than to an easily overloaded working memory
  • We cover long distances by taking our time and conduct our mental lives by the law of least effort

  • I suspect that frequent switching of tasks and speeded-up mental work are not intrinsically pleasurable, and that people avoid them when possible

  • Several psychological studies have shown that people who are simultaneously challenged by a demanding cognitive task and by a temptation are more likely to yield to the temptation
  • People who are cognitively busy are also more likely to make selfish choices, use sexist language, and make superficial judgments in social situations
  • Too much concern about how well one is doing in a task sometimes disrupts performance by loading short-term memory with pointless anxious thoughts
  • When you are actively involved in difficult cognitive reasoning or engaged in a task that requires self-control, your blood glucose level drops
  • The bold implication of this idea is that the effects of ego depletion could be undone by ingesting glucose
  • Restoring the level of available sugar in the brain had prevented the deterioration of performance

  • When people believe a conclusion is true, they are also very likely to believe arguments that appear to support it, even when these arguments are unsound
  • Intelligence is not only the ability to reason; it is also the ability to find relevant material in memory and to deploy attention when needed

  • If people were ranked by their self-control and by their cognitive aptitude, would individuals have similar positions in the two rankings?
  • The resisters had higher measures of executive control in cognitive tasks, and especially the ability to reallocate their attention effectively. As young adults, they were less likely to take drugs. A significant difference in intellectual aptitude emerged: the children who had shown more self-control as four-year-olds had substantially higher scores on tests of intelligence
  • There is a close connection between the children's ability to control their attention and their ability to control their emotions
  • System 1 is impulsive and intuitive; System 2 is capable of reasoning and it is cautious, but at least for some people it is also lazy
  • What makes some people more susceptible than others to biases of judgment?
  • Two parts of System 2
    • One of these minds (he calls it algorithmic) deals with slow thinking and demanding computation
    • The other is rationality (superficial or "lazy" thinking being a failure of rationality)

  • System 1 provides the impressions that often turn into your beliefs, and is the source of the impulses that often become your choices and your actions

  • Cognitive strain is affected by both the current level of effort and the presence of unmet demands
  • You experience cognitive strain when you read instructions in a poor font, or in faint colors, or worded in complicated language, or when you are in a bad mood, and even when you frown

  • Words that you have seen before become easier to see again - you can identify them better than other words when they are shown very briefly or masked by noise, and you will be quicker to read them than to read other words. In short, you experience greater cognitive ease in perceiving a word you have seen earlier, and it is this sense of ease that gives you the impression of familiarity

  • A reliable way to make people believe in falsehoods is frequent repetition, because familiarity is not easily distinguished from truth
  • You do not have to repeat the entire statement of a fact or idea to make it appear true

  • Couching familiar ideas in pretentious language is taken as a sign of poor intelligence and low credibility
  • Try to make it memorable
  • If you quote a source, choose one with a name that is easy to pronounce

  • The experience of cognitive strain, whatever its source, tends to mobilize System 2, shifting people's approach to problems from a casual intuitive mode to a more engaged and analytic mode

  • It appears to be a feature of System 1 that cognitive ease is associated with good feelings
  • Zajonc argued that the effect of repetition on liking is a profoundly important biological fact, and that it extends to all animals
    • To survive in a frequently dangerous world, an organism should react cautiously to a novel stimulus, with withdrawal and fear. Survival prospects are poor for an animal that is not suspicious of novelty. However, it is also adaptive for the initial caution to fade if the stimulus is actually safe. The mere exposure effect occurs, Zajonc claimed, because the repeated exposure of a stimulus is followed by nothing bad. Such a stimulus will eventually become a safety signal, and safety is good

  • Mood evidently affects the operation of System 1: when we are uncomfortable and unhappy, we lose touch with our intuition
  • Good mood, intuition, creativity, gullibility, and increased reliance on System 1 form a cluster
  • At the other pole, sadness, vigilance, suspicion, an analytic approach, and increased effort also go together
  • A happy mood loosens the control of System 2 over performance: when in a good mood, people become more intuitive and more creative but also less vigilant and more prone to logical errors

  • The main function of System 1 is to maintain and update a model of your personal world, which represents what is normal in it

  • The commonly accepted wisdom was that we infer physical causality from repeated observations of correlations among events
  • Michotte argued that we see causality, just as directly as we see color
  • We are born prepared to make intentional attributions: infants under one year old identify bullies and victims, and expect a pursuer to follow the most direct path in attempting to catch whatever it is chasing

  • Jumping to conclusions is efficient if the conclusions are likely to be correct and the costs of an occasional mistake acceptable, and if the jump saves much time and effort
  • Jumping to conclusions is risky when the situation is unfamiliar, the stakes are high, and there is no time to collect more information

  • When uncertain, System 1 bets on an answer, and the bets are guided by experience. The rules of the betting are intelligent: recent events and the current context have the most weight in determining an interpretation. When no recent event comes to mind, more distant memories govern
  • System 1 does not keep track of alternatives that it rejects, or even of the fact that there were alternatives
  • Conscious doubt is not in the repertoire of System 1; it requires maintaining incompatible interpretations in mind at the same time, which demands mental effort. Uncertainty and doubt are the domain of System 2

  • When System 2 is otherwise engaged, we will believe almost anything
  • System 1 is gullible and biased to believe, System 2 is in charge of doubting and unbelieving, but System 2 is sometimes busy, and often lazy

  • The tendency to like (or dislike) everything about a person - including things you have not observed - is known as the halo effect
  • What do you think of Alan and Ben?
    • Alan: intelligent—industrious—impulsive—critical—stubborn—envious
    • Ben: envious—stubborn—critical—impulsive—industrious—intelligent
    • The initial traits in the list change the very meaning of the traits that appear later
  • The sequence in which we observe characteristics of a person is often determined by chance. Sequence matters, however, because the halo effect increases the weight of first impressions, sometimes to the point that subsequent information is mostly wasted
  • To derive the most useful information from multiple sources of evidence, you should always try to make these sources independent of each other

  • System 1 is radically insensitive to both the quality and the quantity of the information that gives rise to impressions and intuitions
  • WYSIATI facilitates the achievement of coherence and of the cognitive ease that causes us to accept a statement as true
  • Overconfidence: We often fail to allow for the possibility that evidence that should be critical to our judgment is missing
  • Framing effects: Different ways of presenting the same information often evoke different emotions

  • Because System 1 represents categories by a prototype or a set of typical exemplars, it deals well with averages but poorly with sums

  • An underlying scale of intensity allows matching across diverse dimensions

  • I propose a simple account of how we generate intuitive opinions on complex matters. If a satisfactory answer to a hard question is not found quickly, System 1 will find a related question that is easier and will answer it. I call the operation of answering one question in place of another substitution
  • Substituting one question for another can be a good strategy for solving difficult problems, and George Polya included substitution in his classic How to Solve It: "If you can't solve a problem, then there is an easier problem you can solve: find it."
  • Target question -> Heuristic question
    • How happy are you with your life these days? -> What is my mood right now?

  • If a question about a specific domain of happiness (such as dating) is asked before a question about overall happiness, the answer to the first question biases the answer to the second
  • Satisfaction in the particular domain dominates happiness reports

  • Self-criticism is one of the functions of System 2

  • generates impressions, feelings, and inclinations; when endorsed by System 2 these become beliefs, attitudes, and intentions
  • operates automatically and quickly, with little or no effort, and no sense of voluntary control
  • can be programmed by System 2 to mobilize attention when a particular pattern is detected (search)
  • executes skilled responses and generates skilled intuitions, after adequate training
  • creates a coherent pattern of activated ideas in associative memory
  • links a sense of cognitive ease to illusions of truth, pleasant feelings, and reduced vigilance
  • distinguishes the surprising from the normal
  • infers and invents causes and intentions
  • neglects ambiguity and suppresses doubt
  • is biased to believe and confirm
  • exaggerates emotional consistency (halo effect)
  • focuses on existing evidence and ignores absent evidence (WYSIATI)
  • generates a limited set of basic assessments
  • represents sets by norms and prototypes, does not integrate
  • matches intensities across scales (e.g., size to loudness)
  • computes more than intended (mental shotgun)
  • sometimes substitutes an easier question for a difficult one (heuristics)
  • is more sensitive to changes than to states (prospect theory)
  • overweights low probabilities
  • shows diminishing sensitivity to quantity (psychophysics)
  • responds more strongly to losses than to gains (loss aversion)
  • frames decision problems narrowly, in isolation from one another

  • Intuitions about random sampling appear to satisfy the law of small numbers, which asserts that the law of large numbers applies to small numbers as well
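The law of small numbers is easy to see in a quick simulation of Kahneman's hospital example (15 vs. 45 births per day, with a true rate of 50% boys); the sample sizes are the book's, the rest is a sketch:

```python
import random

random.seed(0)

def extreme_rate(sample_size, trials=10_000, threshold=0.6):
    """Fraction of days on which the observed proportion of boys reaches
    60%+, even though the true rate is exactly 50%."""
    extreme = 0
    for _ in range(trials):
        boys = sum(random.random() < 0.5 for _ in range(sample_size))
        if boys / sample_size >= threshold:
            extreme += 1
    return extreme / trials

# The small hospital records "surprising" 60%-boy days far more often
print(extreme_rate(15))  # small hospital: roughly 0.30
print(extreme_rate(45))  # large hospital: roughly 0.12
```

Small samples yield extreme results much more often, which is exactly what the intuitive "law of small numbers" denies.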

  • Can your System 1 distinguish degrees of belief? The principle of WYSIATI suggests that it cannot

  • Our predilection for causal thinking exposes us to serious mistakes in evaluating the randomness of truly random events
  • The tendency to see patterns in randomness is overwhelming

  • Anchoring effect: when people consider a particular value for an unknown quantity before estimating that quantity
  • Any number that you are asked to consider as a possible solution to an estimation problem will induce an anchoring effect
  • Two different mechanisms produce anchoring effects - one for each system
    • System 1: priming effect
    • System 2: deliberate process of adjustment
  • Anchors that are obviously random can be just as effective as potentially informative anchors
  • As in many other games, moving first is an advantage in single-issue negotiations
  • If you think the other side has made an outrageous proposal, you should not come back with an equally outrageous counteroffer, creating a gap that will be difficult to bridge in further negotiations. Instead you should make a scene, storm out or threaten to do so, and make it clear - to yourself as well as to the other side - that you will not continue the negotiation with that number on the table
  • The main moral of priming research is that our thoughts and our behavior are influenced, much more than we know or want, by the environment of the moment
  • You should assume that any number that is on the table has had an anchoring effect on you, and if the stakes are high you should mobilize yourself (your System 2) to combat the effect

  • What people actually do when they wish to estimate the frequency of a category
    • Instances of the class will be retrieved from memory, and if retrieval is easy and fluent, the category will be judged to be large
  • The availability heuristic is the process of judging frequency by "the ease with which instances come to mind"
  • The availability heuristic, like other heuristics of judgment, substitutes one question for another: you wish to estimate the size of a category or the frequency of an event, but you report an impression of the ease with which instances come to mind

  • How will people's impressions of the frequency of a category be affected by a requirement to list a specified number of instances?
  • Self-ratings were dominated by the ease with which examples had come to mind. The experience of fluent retrieval of instances trumped the number retrieved
  • When people are instructed to frown while doing a task, they actually try harder and experience greater cognitive strain
  • Judgments are no longer influenced by ease of retrieval when the experience of fluency is given a spurious explanation by the presence of curved or straight text boxes, by the background color of the screen, or by other irrelevant factors that the experimenters dreamed up
  • Among the basic features of System 1 is its ability to set expectations and to be surprised when these expectations are violated
  • System 2 can reset the expectations of System 1 on the fly, so that an event that would normally be surprising is now almost normal

  • Our expectations about the frequency of events are distorted by the prevalence and emotional intensity of the messages to which we are exposed
  • People often form opinions and make choices that directly express their feelings and their basic tendency to approach or avoid, often without knowing that they are doing so
  • Damasio and his colleagues have observed that people who do not display the appropriate emotions before they decide, sometimes because of brain damage, also have an impaired ability to make good decisions

  • A picture of Mr. and Ms. Citizen that is far from flattering: guided by emotion rather than by reason, easily swayed by trivial details, and inadequately sensitive to differences between low and negligibly low probabilities
  • Experts show many of the same biases as the rest of us in attenuated form, but often their judgments and preferences about risks diverge from those of other people
  • Experts often measure risks by the number of lives (or life-years) lost, while the public draws finer distinctions, for example between "good deaths" and "bad deaths", or between random accidental fatalities and deaths that occur in the course of voluntary activities such as skiing

  • Judging probability by representativeness has important virtues: the intuitive impressions that it produces are often more accurate than chance guesses would be
  • One sin of representativeness is an excessive willingness to predict the occurrence of unlikely (low base-rate) events
  • When an incorrect intuitive judgment is made, System 1 and System 2 should both be indicted. System 1 suggested the incorrect intuition, and System 2 endorsed it and expressed it in a judgment
  • There are two possible reasons for the failure of System 2: ignorance or laziness
  • The second sin of representativeness is insensitivity to the quality of evidence
  • There is one thing you can do when you have doubts about the quality of the evidence: let your judgments of probability stay close to the base rate

  • There are two ideas to keep in mind about Bayesian reasoning and how we tend to mess it up
    • Base rates matter, even in the presence of evidence about the case at hand
    • Intuitive impressions of the diagnosticity of evidence are often exaggerated
  • The essential keys to disciplined Bayesian reasoning can be simply summarized
    • Anchor your judgment of the probability of an outcome on a plausible base rate
    • Question the diagnosticity of your evidence
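A minimal sketch of these two keys in odds form, with hypothetical numbers in the spirit of the book's librarian/farmer example (the likelihood ratio stands in for the diagnosticity of the evidence):

```python
def posterior(base_rate, likelihood_ratio):
    """Bayes' rule in odds form: posterior_odds = prior_odds * likelihood_ratio.
    The base rate anchors the judgment; the likelihood ratio encodes how
    diagnostic the evidence really is."""
    prior_odds = base_rate / (1 - base_rate)
    post_odds = prior_odds * likelihood_ratio
    return post_odds / (1 + post_odds)

# Hypothetical: a description "fits" a librarian 4x better than a farmer,
# but farmers outnumber librarians roughly 20 to 1 (base rate about 1/21)
print(posterior(1 / 21, 4.0))  # about 0.167: still well under 50%
```

Even fairly diagnostic evidence cannot overturn a lopsided base rate, which is what intuitive judgments of representativeness routinely ignore.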

  • When you specify a possible event in greater detail you can only lower its probability
  • Conjunction fallacy: when people judge a conjunction of two events to be more probable than one of the events in a direct comparison
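The fallacy violates a simple identity: a conjunction can never be more probable than either of its components. With hypothetical numbers:

```python
# Hypothetical probabilities for illustration; whatever values you pick,
# the conjunction cannot exceed its least likely component.
p_bank_teller = 0.05
p_feminist_given_teller = 0.6

p_conjunction = p_bank_teller * p_feminist_given_teller
print(p_conjunction)  # smaller than p_bank_teller, always
```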

  • A question phrased as "how many?" makes you think of individuals, but the same question phrased as "what percentage?" does not
  • If you visit a courtroom you will observe that lawyers apply two styles of criticism:
    • to demolish a case they raise doubts about the strongest arguments that favor it
    • to discredit a witness, they focus on the weakest part of the testimony

  • Statistical base rates are generally underweighted, and sometimes neglected altogether, when specific information about the case at hand is available
  • Causal base rates are treated as information about the individual case and are easily combined with other case-specific information

  • Nisbett and Borgida found that when they presented their students with a surprising statistical fact, the students managed to learn nothing at all. But when the students were surprised by individual cases - two nice people who had not helped - they immediately made the generalization and inferred that helping is more difficult than they had thought
    • Subjects' unwillingness to deduce the particular from the general was matched only by their willingness to infer the general from the particular

  • Rewards for improved performance work better than punishment of mistakes
  • Because we tend to be nice to other people when they please us and nasty when they do not, we are statistically punished for being nice and rewarded for being nasty

  • If you expect your predictions to be of modest validity, you will never guess an outcome that is either rare or far from the mean
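The book's recipe for taming intuitive predictions makes this concrete: start from the baseline (the mean) and move toward the intuitive estimate only in proportion to the correlation between the evidence and the outcome. A sketch with hypothetical numbers:

```python
def regressive_prediction(baseline, intuitive_estimate, correlation):
    """Move from the baseline toward the intuitive estimate in proportion
    to the correlation between evidence and outcome: 0 means pure base
    rate, 1 means trust the evidence fully."""
    return baseline + correlation * (intuitive_estimate - baseline)

# Hypothetical: class average GPA is 3.0, intuition says 3.8, but early
# evidence predicts later GPA with a correlation of only about 0.3
print(regressive_prediction(3.0, 3.8, 0.3))  # about 3.24, pulled toward the mean
```

With modest correlations the corrected prediction stays close to the mean, which is why a forecaster with modest validity should never predict a rare or extreme outcome.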

  • The explanatory stories that people find compelling are simple; are concrete rather than abstract; assign a larger role to talent, stupidity, and intentions than to luck; and focus on a few striking events that happened rather than on the countless events that failed to happen

  • For some of our most important beliefs we have no evidence at all, except that people we love and trust hold these beliefs

  • The person who acquires more knowledge develops an enhanced illusion of her skill and becomes unrealistically overconfident

  • There are few circumstances under which it is a good idea to substitute judgment for a formula
  • To maximize predictive accuracy, final decisions should be left to formulas, especially in low-validity environments

  • Recognition-primed decision
    • A tentative plan comes to mind by an automatic function of associative memory (System 1)
    • A tentative plan is mentally simulated to check if it will work (System 2)

  • The confidence that people have in their intuitions is not a reliable guide to their validity. In other words, do not trust anyone - including yourself - to tell you how much you should trust their judgment
  • Intuitions are likely to be skilled when both of the following conditions are satisfied
    • an environment that is sufficiently regular to be predictable
    • an opportunity to learn these regularities through prolonged practice
  • It is wrong to blame anyone for failing to forecast accurately in an unpredictable world

  • Short-term anticipation and long-term forecasting are different tasks

  • It is possible to distinguish intuitions that are likely to be bogus
    • If the environment is sufficiently regular and if the judge has had a chance to learn its regularities, the associative machinery will recognize situations and generate quick and accurate predictions and decisions
  • When evaluating expert intuition you should always consider whether there was an adequate opportunity to learn the cues, even in a regular environment

  • The proper way to elicit information from a group is not by starting with a public discussion but by confidentially collecting each person's judgment

  • There are many ways for any plan to fail, and although most of them are too improbable to be anticipated, the likelihood that something will go wrong in a big project is high
  • If the reference class is properly chosen, the outside view will give an indication of where the ballpark is, and it may suggest that the inside-view forecasts are not even close to it
  • People who have information about an individual case rarely feel the need to know the statistics of the class to which the case belongs

  • The authors of unrealistic plans are often driven by the desire to get the plan approved - whether by their superiors or by a client - supported by the knowledge that projects are rarely abandoned unfinished merely because of overruns in costs or completion time. In such cases, the responsibility for avoiding the planning fallacy lies with the decision makers who approve the plan. If they do not recognize the need for an outside view, they commit a planning fallacy

  • The prevalent tendency to underweight or ignore distributional information is perhaps the major source of error in forecasting. Planners should therefore make every effort to frame the forecasting problem so as to facilitate utilizing all the distributional information that is available
  • The forecasting method
    • Identify an appropriate reference class
    • Obtain the statistics of the reference class. Use the statistics to generate a baseline prediction
    • Use specific information about the case to adjust the baseline prediction, if there are particular reasons to expect the optimistic bias to be more or less pronounced in this project than in others of the same type
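The three steps above can be sketched as follows; the reference-class numbers are hypothetical:

```python
from statistics import median

def outside_view_forecast(past_overruns, case_adjustment=0.0):
    """Sketch of reference-class forecasting: (1) the reference class is
    a list of past cost overruns, (2) its median is the baseline
    prediction, (3) a case-specific adjustment is applied only when there
    is an explicit reason to expect more or less optimism bias than usual."""
    baseline = median(past_overruns)
    return baseline + case_adjustment

# Hypothetical cost overruns (as fractions) from comparable past projects
past = [0.10, 0.25, 0.40, 0.45, 0.90]
print(outside_view_forecast(past))         # baseline prediction: 0.40
print(outside_view_forecast(past, -0.05))  # adjusted for a known mitigating factor
```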

  • When forecasting the outcomes of risky projects, executives too easily fall victim to the planning fallacy. In its grip, they make decisions based on delusional optimism rather than on a rational weighting of gains, losses, and probabilities. They overestimate benefits and underestimate costs

  • The people who have the greatest influence on the lives of others are likely to be optimistic and overconfident, and to take more risks than they realize

  • One of the benefits of an optimistic temperament is that it encourages persistence in the face of obstacles
  • The financial benefits of self-employment are mediocre: given the same qualifications, people achieve higher average returns by selling their skills to employers than by setting out on their own
  • Psychologists have confirmed that most people genuinely believe that they are superior to most others on most desirable traits - they are willing to bet small amounts of money on these beliefs in the laboratory

  • Inadequate appreciation of the uncertainty of the environment inevitably leads economic agents to take risks they should avoid
  • An unbiased appreciation of uncertainty is a cornerstone of rationality - but it is not what people and organizations want
  • The main benefit of optimism is resilience in the face of setbacks

  • When the organization has almost come to an important decision but has not formally committed itself, Klein proposes gathering for a brief session a group of individuals who are knowledgeable about the decision. The premise of the session is a short speech: "Imagine that we are a year into the future. We implemented the plan as it now exists. The outcome was a disaster. Please take 5 to 10 minutes to write a brief history of that disaster."
  • The premortem has two main advantages:
    • it overcomes the groupthink that affects many teams once a decision appears to have been made
    • it unleashes the imagination of knowledgeable individuals in a much-needed direction

  • We normally speak of changes of income in terms of percentages
  • Prior to Bernoulli, mathematicians had assumed that gambles are assessed by their expected value: a weighted average of the possible outcomes, where each outcome is weighted by its probability
  • Bernoulli observed that most people dislike risk, and if they are offered a choice between a gamble and an amount equal to its expected value they will pick the sure thing
    • In fact a risk-averse decision maker will choose a sure thing that is less than expected value, in effect paying a premium to avoid the uncertainty
  • Because Bernoulli's model lacks the idea of a reference point, expected utility theory does not represent the obvious fact that the same state of wealth can be a good outcome for one person and a bad one for another
  • Theory-induced blindness: once you have accepted a theory and used it as a tool in your thinking, it is extraordinarily difficult to notice its flaws
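Bernoulli's own proposal was logarithmic utility, which makes the risk premium computable. Using the book's example of equal chances to own 1 million or 7 million:

```python
import math

def certainty_equivalent_log(outcomes, probs):
    """Bernoulli-style evaluation with logarithmic utility: the sure
    amount whose utility equals the gamble's expected utility. Amounts
    are states of total wealth, not changes."""
    expected_utility = sum(p * math.log(x) for x, p in zip(outcomes, probs))
    return math.exp(expected_utility)

outcomes, probs = [1_000_000, 7_000_000], [0.5, 0.5]
ev = sum(p * x for x, p in zip(outcomes, probs))
ce = certainty_equivalent_log(outcomes, probs)
print(ev)  # 4,000,000: the gamble's expected value
print(ce)  # about 2,645,751: a risk-averse agent takes less than EV for certainty
```

The gap between the two numbers is the premium a risk-averse decision maker pays to avoid uncertainty.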

  • Your attitude to risk would not be different if your net worth were higher or lower by a few thousand dollars
  • You also know that your attitude to gains and losses are not derived from your evaluation of your wealth
  • You just like winning and dislike losing - and you almost certainly dislike losing more than you like winning
  • Organisms that treat threats as more urgent than opportunities have a better chance to survive and reproduce

  • In mixed gambles, where both a gain and a loss are possible, loss aversion causes extremely risk-averse choices
  • In bad choices, where a sure loss is compared to a larger loss that is merely probable, diminishing sensitivity causes risk seeking
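Both effects fall out of the prospect-theory value function; the sketch below uses Tversky and Kahneman's (1992) median parameter estimates (alpha = 0.88 for diminishing sensitivity, lambda = 2.25 for loss aversion):

```python
def pt_value(x, alpha=0.88, lam=2.25):
    """Prospect-theory value function over gains and losses relative to
    a reference point: concave for gains, convex and steeper for losses."""
    return x ** alpha if x >= 0 else -lam * (-x) ** alpha

# Mixed gamble: 50% win $100, 50% lose $100 (a statistically fair bet)
gamble_value = 0.5 * pt_value(100) + 0.5 * pt_value(-100)
print(gamble_value)  # negative: the loss looms larger, so the fair bet is refused

# Bad choice: a sure loss of $750 vs. a 75% chance to lose $1000
sure_loss = pt_value(-750)
risky_loss = 0.75 * pt_value(-1000)
print(sure_loss < risky_loss)  # True: the merely probable larger loss is preferred
```

The same function thus produces risk aversion in mixed gambles and risk seeking when all options are bad.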

  • The problem is that regret theories make few striking predictions that would distinguish them from prospect theory, which has the advantage of being simpler

  • Tastes are not fixed; they vary with the reference point
  • The disadvantages of a change loom larger than its advantages, inducing a bias that favors the status quo

  • Owning a good appears to increase its value
  • What distinguishes market transactions from reluctance to sell certain items?
    • Certain things are held "for exchange" while others are held "for use"
  • Selling goods that one would normally use activates regions of the brain that are associated with disgust and pain
  • Buying also activates these areas, but only when the prices are perceived as too high - when you feel that a seller is taking money that exceeds the exchange value
  • Brain recordings indicate that buying at especially low prices is a pleasurable event
  • Participants displayed an endowment effect only if they had physical possession of the good for a while before the possibility of trading it was mentioned

  • The negative trumps the positive in many ways, and loss aversion is one of many manifestations
  • Bad emotions, bad parents, and bad feedback have more impact than good ones, and bad information is processed more thoroughly than good. The self is more motivated to avoid bad self-definitions than to pursue good ones. Bad impressions and bad stereotypes are quicker to form and more resistant to disconfirmation than good ones
  • The long-term success of a relationship depends far more on avoiding the negative than on seeking the positive
  • Gottman estimated that a stable relationship requires that good interactions outnumber bad interactions by at least 5 to 1
  • The boundary between good and bad is a reference point that changes over time and depends on the immediate circumstances

  • We are driven more strongly to avoid losses than to achieve gains

  • The possibility effect causes highly unlikely outcomes to be weighted disproportionately more than they "deserve"
  • The certainty effect causes outcomes that are almost certain to be given less weight than their probability justifies
  • The decision weights that people assign to outcomes are not identical to the probabilities of these outcomes, contrary to the expectation principle
    • Improbable outcomes are overweighted (possibility effect)
    • Outcomes that are almost certain are underweighted relative to actual certainty (certainty effect)
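The possibility and certainty effects can be sketched with the probability-weighting function from Tversky and Kahneman's (1992) cumulative prospect theory, w(p) = p^γ / (p^γ + (1−p)^γ)^(1/γ). The parameter value γ ≈ 0.61 below is their median estimate for gains, used here as an illustrative assumption, not a universal constant:

```python
def decision_weight(p, g=0.61):
    """Map a stated probability p to the weight it receives in choice.

    Tversky & Kahneman (1992) weighting function; g = 0.61 is an
    assumed parameter (their median estimate for gains).
    """
    return p**g / (p**g + (1 - p)**g) ** (1 / g)

for p in (0.01, 0.50, 0.99):
    print(f"p = {p:.2f}  ->  w(p) = {decision_weight(p):.3f}")

# w(0.01) ≈ 0.055 > 0.01: a rare outcome is overweighted (possibility effect)
# w(0.99) ≈ 0.911 < 0.99: near-certainty is underweighted (certainty effect)
```

The function crosses the diagonal somewhere in the middle range: small probabilities receive more weight than they "deserve," large ones less, which is exactly the asymmetry the two bullet points above describe.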

  • People attach values to gains and losses rather than to wealth
  • The decision weights that they assign to outcomes are different from probabilities
  • Because defeat is so difficult to accept, the losing side in wars often fights long past the point at which the victory of the other side is certain, and only a matter of time

  • The successful execution of a plan is specific and easy to imagine when one tries to forecast the outcome of a project
  • In contrast, the alternative of failure is diffuse, because there are innumerable ways for things to go wrong

  • Adding irrelevant but vivid details to a monetary outcome disrupts calculation
  • Cognitive ease contributes to the certainty effect: when you hold a vivid image of an event, the possibility of its not occurring is also represented vividly, and overweighted

  • The results of many experiments suggest that rare events are not overweighted when we make decisions such as choosing a restaurant or tying down the boiler to reduce earthquake damage
  • Obsessive concerns, vivid images, concrete representations, and explicit reminders all contribute to overweighting

  • Narrow framing: a sequence of two simple decisions, considered separately
  • Broad framing: a single comprehensive decision, with four options
  • Humans are by nature narrow framers
  • Because we are susceptible to WYSIATI and averse to mental effort, we tend to make decisions as problems arise, even when we are specifically instructed to consider them jointly

  • Under some very specific conditions, a utility maximizer who rejects a single gamble should also reject the offer of many

  • Decision makers who are prone to narrow framing construct a preference every time they face a risky choice. They would do better by having a risk policy that they routinely apply whenever a relevant problem arises
  • A risk policy is a broad frame
  • The outside view is a broad frame for thinking about plans
  • A risk policy is a broad frame that embeds a particular risky choice in a set of similar choices
  • The outside view and the risk policy are remedies against two distinct biases that affect many decisions: the exaggerated optimism of the planning fallacy and the exaggerated caution induced by loss aversion

  • We refuse to cut losses when doing so would admit failure, we are biased against actions that could lead to regret, and we draw an illusory but sharp distinction between omission and commission

  • A rational decision maker is interested only in the future consequences of current investments

  • Regret is not the same as blame
  • People expect to have stronger emotional reactions (including regret) to an outcome that is produced by an action than to the same outcome when it is produced by inaction
  • Consumers who are reminded that they may feel regret as a result of their choices show an increased preference for conventional options, favoring brand names over generics

  • Losses are weighted about twice as much as gains in several contexts
  • The loss-aversion coefficient is much higher in some situations. In particular, you may be more loss averse for aspects of your life that are more important than money, such as health. Furthermore, your reluctance to "sell" important endowments increases dramatically when doing so might make you responsible for an awful outcome
  • Is it reasonable to let your choices be influenced by the anticipation of regret?
  • Daniel Gilbert and his colleagues provocatively claim that people generally anticipate more regret than they will actually experience, because they underestimate the efficacy of the psychological defenses they will deploy - which they label the "psychological immune system". Their recommendation is that you should not put too much weight on regret; even if you have some, it will hurt less than you now think
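The loss-aversion coefficient of about 2 mentioned above can be made concrete with a toy value function. The round value LAMBDA = 2.0 is an assumption for illustration; as the notes say, empirical estimates vary by context:

```python
LAMBDA = 2.0  # assumed coefficient: losses loom about twice as large as gains

def subjective_value(x):
    """Psychological value of a gain (x > 0) or loss (x < 0)."""
    return x if x >= 0 else LAMBDA * x

# A 50/50 gamble to win or lose $100 is objectively fair (expected value 0),
# but its subjective expected value is negative, so the gamble is rejected:
gamble = 0.5 * subjective_value(100) + 0.5 * subjective_value(-100)
print(gamble)  # -50.0
```

This is why mixed gambles produce the extreme risk aversion noted earlier: the loss side of a fair bet simply weighs more.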

  • Broader frames and inclusive accounts generally lead to more rational decisions

  • Peak-end rule: The global retrospective rating was well predicted by the average of the level of pain reported at the worst moment of the experience and at its end
  • Duration neglect: The duration of the procedure had no effect whatsoever on the ratings of total pain
  • What we learn from the past is to maximize the qualities of our future memories, not necessarily of our future experience
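The peak-end rule and duration neglect above can be sketched directly: the remembered rating is the average of the worst moment and the final moment, so a longer episode that tails off gently is remembered as less bad than a shorter one that ends at its worst. The pain values are hypothetical (0 = none, 10 = worst):

```python
def remembered_pain(episode):
    """Predicted retrospective rating under the peak-end rule:
    the average of the worst moment and the final moment."""
    return (max(episode) + episode[-1]) / 2

short_trial = [2, 8, 7]            # ends at a high level of pain
long_trial = [2, 8, 7, 5, 3, 1]    # same peak, more total pain, mild ending

print(remembered_pain(short_trial))  # 7.5
print(remembered_pain(long_trial))   # 4.5 -> remembered as less bad
```

Note that the longer trial contains strictly more total pain, yet the rule predicts a better memory of it: duration never enters the formula.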

  • Tastes and decisions are shaped by memories, and the memories can be wrong
  • We have strong preferences about the duration of our experiences of pain and pleasure. We want pain to be brief and pleasure to last. But our memory, a function of System 1, has evolved to represent the most intense moment of an episode of pain or pleasure (the peak) and the feelings when the episode was at its end

  • A story is about significant events and memorable moments, not about time passing
  • In intuitive evaluation of entire lives as well as brief episodes, peaks and ends matter but duration does not

  • The percentage of time that an individual spends in an unpleasant state is called the U-index. For example, an individual who spent 4 hours of a 16-hour waking day in an unpleasant state would have a U-index of 25%
  • An individual's mood at any moment depends on her temperament and overall happiness, but emotional well-being also fluctuates considerably over the day and the week
  • Our emotional state is largely determined by what we attend to, and we are normally focused on our current activity and immediate environment
  • The feelings associated with different activities suggest that another way to improve experience is to switch time from passive leisure, such as TV watching, to more active forms of leisure, including socializing and exercise
  • It is only a slight exaggeration to say that happiness is the experience of spending time with people you love and who love you
  • The Cantril Self-Anchoring Striving Scale
    • Please imagine a ladder with steps numbered from zero at the bottom to ten at the top. The top of the ladder represents the best possible life for you and the bottom of the ladder represents the worst possible life for you. On which step of the ladder would you say you personally feel you stand at this time?
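The U-index defined above is a simple proportion of waking time; a minimal sketch, with the episode durations as hypothetical inputs:

```python
def u_index(unpleasant_hours, waking_hours):
    """Percentage of waking time spent in an unpleasant emotional state."""
    return 100 * unpleasant_hours / waking_hours

print(u_index(4, 16))  # 25.0 -> the 25% example from the text
```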

  • The response to well-being questions should be taken with a grain of salt (because they are biased by your current mood)
  • The score that you quickly assign to your life is determined by a small sample of highly available ideas, not by careful weighting of the domains of your life
  • One reason for the low correlation between individuals' circumstances and their satisfaction with life is that both experienced happiness and life satisfaction are largely determined by the genetics of temperament
  • One recipe for a dissatisfied adulthood is setting goals that are especially difficult to attain

  • Nothing in life is as important as you think it is when you are thinking about it
  • Over time, with few exceptions, attention is withdrawn from a new situation as it becomes more familiar
  • Pain and noise are biologically set to be signals that attract attention, and depression involves a self-reinforcing cycle of miserable thoughts
  • The mistake that people make in the focusing illusion involves attention to selected moments and neglect of what happens at other times
  • The mind is good with stories, but it does not appear to be well designed for the processing of time

  • The investment of attention improves performance in numerous activities
  • The acquisition of skills requires a regular environment, an adequate opportunity to practice, and rapid and unequivocal feedback about the correctness of thoughts and actions
  • In everyday speech, we call people reasonable if it is possible to reason with them, if their beliefs are generally in tune with reality, and if their preferences are in line with their interests and their values
  • The word rational conveys an image of greater deliberation, more calculation, and less warmth, but in common language a rational person is certainly reasonable. For economists and decision theorists, rational has a different meaning. The only test of rationality is not whether a person's beliefs and preferences are reasonable, but whether they are internally consistent
    • Rationality is logical coherence - reasonable or not
  • System 1 is not readily educable
  • The way to block errors that originate in System 1 is simple in principle: recognize the signs that you are in a cognitive minefield, slow down, and ask for reinforcement from System 2