
Book Review: Thinking, Fast and Slow

Daniel Kahneman won a Nobel Prize in Economics for his work in behavioral economics, much of it done with his colleague Amos Tversky, who died in 1996. Kahneman’s 2011 classic, Thinking, Fast and Slow, is a superbly written non-technical summary of their fascinating research and its often counter-intuitive findings.

The best feature of the book is the appearance, every few pages, of a puzzle-like choice for the reader. For example, in a discussion of utility theory, the following two choices are presented:

Situation 1: Which would you choose?

(A) a 61% chance of winning $520,000, or

(B) a 63% chance of winning $500,000?

Situation 2: Which would you choose?

(A) a 98% chance of winning $520,000, or

(B) a 100% chance of winning $500,000?

In each situation, (B) offers a 2% higher probability of winning, but (A) has the higher expected value. Nevertheless, in situation 2, nearly everyone chooses (B).
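To make the comparison concrete, here is a quick back-of-the-envelope check of the expected values (probability of winning times the prize), using the figures quoted above. This is just an illustrative sketch, not something taken from the book itself:

```python
# Expected value of each option: probability of winning times the prize.
options = {
    "Situation 1 (A)": (0.61, 520_000),
    "Situation 1 (B)": (0.63, 500_000),
    "Situation 2 (A)": (0.98, 520_000),
    "Situation 2 (B)": (1.00, 500_000),
}

for name, (p, prize) in options.items():
    print(f"{name}: expected value = ${p * prize:,.0f}")

# Output:
# Situation 1 (A): expected value = $317,200
# Situation 1 (B): expected value = $315,000
# Situation 2 (A): expected value = $509,600
# Situation 2 (B): expected value = $500,000
```

In both situations the expected value favors (A), yet the pull toward the certain $500,000 in situation 2 is what the puzzle is designed to expose.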

For many years, economists founded their theories on the precept that humans are rational beings who act to maximize their utility. Utility is a somewhat nebulous concept reflecting the fact that money means different things to different people, and that there is more to life than money. Because utility is difficult to reduce to numbers, in most practical applications behavioral economists speak instead of money and its expected value. The above puzzle is one of a number of challenges to utility theory, and it arose out of the field of experimental economics. The results illustrate the certainty effect: people value a 2% boost in the probability of winning more when it moves them from (slight) uncertainty to certainty than when it merely improves their chances somewhere in the middle of the range.

The chatty introduction of puzzles like this makes the book a relatively easy and fun read; certainly much more fun than an abstract discussion of utility theory versus decision theory versus prospect theory (prospect theory being Kahneman and Tversky’s modification of the simpler utility theory, incorporating psychological elements revealed by experimental results such as the one above).

Another such psychological element is the framing effect, studied carefully by marketers, who have long been aware that different wordings of the same proposition can produce different effects. A classic example is opt-in versus opt-out organ donation, indicated by checking a box on driver’s license applications. In countries where individuals must check the box to NOT donate, over 80% choose to donate; in countries where individuals must check the box TO donate, fewer than 20% do.

This can be coupled with the loss aversion effect, discussed elsewhere in this briefing: a company might frame a discount offer as “Your $100 credit expires Friday” rather than “$100 discount available through Friday”. Psychologically, the urge to avoid losing something you already have is more powerful than the urge to gain something new of equivalent value. Naked Wines, the internet wine marketer, exploits loss aversion by placing a (free) bottle of wine in your shopping cart every month and prompting you to complete your purchase with more wines so you don’t lose the free bottle.

Kahneman also discusses the anchoring effect, in which the presentation of an irrelevant number influences a decision. He describes an experiment (which I mentioned in an earlier article) in which judges’ sentences in experimental cases depended significantly on the throw of a pair of dice beforehand. Marketers make widespread use of the anchoring effect, for example in price negotiations, where an artificially inflated initial price orients the buyer toward paying more, notwithstanding their cognitive understanding of the true value.

The fast and slow in the book’s title refer to two mental systems. The fast, intuitive system produces rapid, largely emotional responses, typically bypassing complex thinking; the slow, cognitive system is engaged for more complicated decisions when time allows. The lesson of the book, however, is that the fast system often intrudes in non-cognitive ways, even when the slow system is fully engaged on a problem.