Steven Poole

The great intellectual bromance of the last century — between Daniel Kahneman and Amos Tversky


It’s the intellectual bromance of the last century. Two psychologists — Danny, a Holocaust kid and adviser to the Israel Defence Forces, and Amos, a former child prodigy and paratrooper — meet at the end of the 1960s, and sparks immediately begin to fly. They spend countless hours locked in rooms together at Hebrew University and elsewhere, and eventually co-write a series of papers that will revolutionise the field, and lead to the surviving partner being awarded the Nobel prize in economics. Not, however, before this extraordinary partnership has itself fallen apart, like a love affair, in regret and mutual recrimination.

Our heroes are Daniel Kahneman and Amos Tversky, celebrated for their identification of the ways in which our reasoning can go wrong because of cognitive biases. For instance, we tend to imagine things are more likely if they are more easily available to our memory, as with, say, recent terrorist attacks: this is called the ‘availability heuristic’, and there are many others.

This work led to the christening of a new subdiscipline called ‘behavioural economics’, and from there to the idea of nudge politics, in which our benign, enlightened masters cleverly rig the choices available to citizens, so more of them will choose the right ones. (For example, if you make pension saving an opt-out rather than opt-in choice, then most people’s ‘status quo bias’, or laziness, will ensure that more people save.) Few psychological theories have had such widespread concrete influence in the world.

Michael Lewis, the author of the classic Wall Street memoir Liar’s Poker and the financial-crisis nonfiction thriller The Big Short, is a brilliant writer, and this is in most respects a wonderful book, particularly in the vivid and readable way it dramatises the two men’s thought processes, and the singular dynamic of the collaboration it portrays between two men who are undoubtedly geniuses, but of very different sorts. Tversky comes across as a super-confident firework; Kahneman as melancholy and beset by self-doubt. Non-American readers may struggle through the beginning, as it is all about unhelpful biases that infect coaches’ decisions in choosing baseball players. (Perhaps this stuff was left over from Lewis’s book on the sport, Moneyball.)

But soon we are into the fascinating story of Kahneman’s work for the Israeli army and air force, and the pace and interest never let up. It is impressive how Lewis teases suspense and surprise out of so potentially dry a subject: the book is a masterclass in narrative nonfiction. In one quite important respect, however, it is unreliable — and that is in its simplistically partisan representation of its heroes’ work.

Over the years, Kahneman and Tversky showed elegantly that humans do not always act ‘rationally’ in the sense that economic theory predicts they will. They fail to make what seem to be the correct calculations in order to maximise their self-interest; their preferences are not always mathematically consistent. According to economic theory, for example, you should view a bet in which you could win £100 or lose £100 with perfect equanimity. But for actual human beings, losses loom larger. We are ‘loss averse’, which is economically ‘irrational’ but of course makes perfect sense, especially for people who have little money to begin with and so have more to lose if some is taken away.
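The asymmetry described here can be sketched numerically. The snippet below is a minimal illustration, not taken from the book: it assumes a simplified prospect-theory-style value function in which losses are weighted roughly twice as heavily as gains (the coefficient 2.0 is an illustrative round figure), and shows why a 50/50 bet with zero expected monetary value can still feel like a bad deal.

```python
# Illustrative sketch of loss aversion. The loss-aversion coefficient of
# 2.0 is a hypothetical round number chosen for illustration.

LOSS_AVERSION = 2.0  # losses "loom larger" by roughly this factor

def subjective_value(outcome):
    """Value an outcome asymmetrically: losses hurt more than gains please."""
    return outcome if outcome >= 0 else LOSS_AVERSION * outcome

# A 50/50 bet to win or lose 100 pounds: expected monetary value is zero...
expected_money = 0.5 * 100 + 0.5 * (-100)

# ...but its subjective value to a loss-averse agent is negative, so
# refusing the bet, though economically 'irrational', makes sense.
expected_feeling = 0.5 * subjective_value(100) + 0.5 * subjective_value(-100)

print(expected_money)    # 0.0
print(expected_feeling)  # -50.0
```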

What is at issue, however, is the inference about human nature we should draw from these apparent failures. Partisans of Kahneman and Tversky often speak as though such departures from economic rationality mean that we are somehow fundamentally irrational, but their critics argued all along that it just means that the economic definition of rationality does not encompass everything that ought to deserve that description. This enduring disagreement in psychology is most vividly illustrated by the notorious ‘Linda problem’, which goes as follows. Imagine you are told the following about Linda:

Linda is 31 years old, single, outspoken and very bright. She majored in philosophy. As a student, she was deeply concerned with issues of discrimination and social justice and also participated in anti-nuclear demonstrations.

Now, you are asked, which of these statements is more probable?

1. Linda is a bank teller.

2. Linda is a bank teller and is active in the feminist movement.

A majority of subjects, it turns out, answer that 2 is more probable: that Linda is a bank teller and an active feminist. The problem is that this must be false arithmetically, because there are many more bank tellers (the feminist ones plus all the rest) than there are feminist bank tellers. If you say Linda is a feminist bank teller, you are saying that it’s more probable she belongs to a smaller population than to a larger one. Mathematically, this is just a mistake. Kahneman and Tversky concluded that we are primed by the irrelevant information about Linda’s personality to commit what they called the ‘conjunction fallacy’. The moral, Michael Lewis now writes, is that ‘People were blind to logic when it was embedded in a story.’
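The arithmetic behind the conjunction rule can be made concrete with a toy count. The numbers below are invented purely for illustration; the point is that feminist bank tellers are a subset of bank tellers, so the conjunction can never be the more frequent category, whatever figures you choose.

```python
# Toy demonstration of the conjunction rule behind the Linda problem.
# All counts are hypothetical, chosen only to illustrate the subset logic.

population = 1000
bank_tellers = 50           # hypothetical number of bank tellers
feminist_bank_tellers = 10  # the subset who are also active feminists

p_teller = bank_tellers / population
p_teller_and_feminist = feminist_bank_tellers / population

# P(A and B) <= P(A) holds for any counts, since the conjunction
# picks out a subset of the larger category.
assert p_teller_and_feminist <= p_teller

print(p_teller)               # 0.05
print(p_teller_and_feminist)  # 0.01
```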

Well, not so fast. Other psychologists at the time quickly pointed out a host of reasons to avoid concluding that the most common answer was irrational. It turns out, for example, that if you pose the Linda question but ask explicitly about relative frequencies rather than ‘probability’, then people suddenly give the right answer much more often. In which case it looks as though Kahneman’s and Tversky’s subjects are simply interpreting the word ‘probable’ as meaning something like ‘narratively plausible’. And indeed the mistake (answer 2) is more narratively plausible: it makes sense that Linda would be a bank teller and an active feminist. All the available information is now pertinent and consistent. (Otherwise, a subject might wonder, why are they telling me the initial story about her at all?)

Unfortunately, the reader will not glean any of this from Lewis’s book, which chooses to portray one of these critics, the psychologist Gerd Gigerenzer, as essentially a rude and obsessive lone eccentric. In fact, he and many other psychologists conducted their debates about many of the supposed fallacies with Kahneman and Tversky over a long period, and with no less scholarly rigour than in any other academic controversy. (At one point, indeed, Kahneman and Tversky made a subtle terminological concession to Gigerenzer by renaming the ‘conjunction fallacy’ the ‘conjunction effect’.) Among psychologists and particularly mental logicians (psychologists who seek to understand the rules of reasoning we actually use), this is still a live issue.

But not, unfortunately, in the mainstream literature of magazine articles and trade books. It’s not Kahneman and Tversky’s fault, exactly, but it is a baleful consequence of their work and its subsequent simplified popularisations (which, like Lewis’s, give little weight to the prominent academic critiques) that the imprimatur of science now supposedly underwrites an intellectual climate in which human beings gleefully announce that human beings are fundamentally irrational. (Kahneman himself is careful not quite to say exactly this in his own bestselling account of their work, Thinking, Fast and Slow.)

It should be obvious, on the contrary, that if people were irredeemably irrational in this way, then a scholarly project that investigates the ways in which our reasoning is sometimes led astray would itself be impossible. But Lewis’s book, while it is a brilliant and moving biography of two remarkable men, also continues this unfortunate tradition. Isn’t it time the rational animal abandoned his peculiar modern habit of rational self-denigration?

Steven Poole’s latest book is Rethink: The Surprising History of New Ideas.