Michael Lewis Article - The King of Human Error

Nov 11, 2011
We’re obviously all at the mercy of forces we only dimly perceive and events over which we have no control, but it’s still unsettling to discover that there are people out there—human beings of whose existence you are totally oblivious—who have effectively toyed with your life. I had that feeling soon after I published Moneyball. The book was ostensibly about a cash-strapped major-league baseball team, the Oakland A’s, whose general manager, Billy Beane, had realized that baseball players were sometimes misunderstood by baseball professionals, and found new and better ways to value them. The book attracted the attention of a pair of Chicago scholars, an economist named Richard Thaler and a law professor named Cass Sunstein (now a senior official in the Obama White House). “Why do professional baseball executives, many of whom have spent their lives in the game, make so many colossal mistakes?” they asked in their review in The New Republic. “They are paid well, and they are specialists. They have every incentive to evaluate talent correctly. So why do they blunder?” My book clearly lacked a satisfying answer to that question. It pointed out that when baseball experts evaluated baseball players their judgment could be clouded by their prejudices and preconceptions—but why? I’d stumbled upon a mystery, the book reviewers noted, and I’d failed not merely to solve it but also to see that others already had done so. As they put it:


Lewis is actually speaking here of a central finding in cognitive psychology. In making judgments, people tend to use the “availability heuristic.” As Daniel Kahneman and Amos Tversky have shown, people often assess the probability of an event by asking whether relevant examples are cognitively “available” [i.e., can be easily remembered]. Thus [because they more readily recall words ending in “ing” than other words with penultimate “n”s, such as “bond” or “mane”], people are likely to think that more words, on a random page, end with the letters “ing” than have “n” as their next to last letter—even though a moment’s reflection will show that this could not possibly be the case. Now, it is not exactly dumb to use the availability heuristic. Sometimes it is the best guide that we possess. Yet reliable statistical evidence will outperform the availability heuristic every time. In using data rather than professional intuitions, Beane confirmed this point.
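
The review's "ing" example rests on a simple subset argument: every word that ends in "ing" necessarily has "n" as its next-to-last letter, so the first group can never outnumber the second. The short sketch below makes that concrete; the word list is an illustrative sample of my own, not data from the review or any real corpus:

    # Illustrative only: every "-ing" word also has "n" in the penultimate
    # position, so the "-ing" set is a subset of the penultimate-"n" set
    # and can never be the larger of the two.
    words = ["running", "singing", "bond", "mane", "find", "wine", "apple", "dog"]

    ends_in_ing = {w for w in words if w.endswith("ing")}
    penultimate_n = {w for w in words if len(w) >= 2 and w[-2] == "n"}

    assert ends_in_ing <= penultimate_n          # the subset relation always holds
    print(len(ends_in_ing), len(penultimate_n))  # prints "2 6" for this sample

People guess wrong only because "-ing" words are easier to call to mind, not because they are more numerous.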


Kahneman and Tversky were psychologists, without a single minor-league plate appearance between them, but they had found that people, including experts, unwittingly use all sorts of irrelevant criteria in decision-making. I’d never heard of them, though I soon realized that Tversky’s son had been a student in a seminar I’d taught in the late 1990s at the University of California, Berkeley, and while I was busy writing my book about baseball, Kahneman had apparently been busy receiving the Nobel Prize in Economics. And he wasn’t even an economist. (Tversky had died in 1996, making him ineligible to share the prize, which is not awarded posthumously.) I also soon understood how embarrassed I should be by what I had not known.


Between 1971 and 1984, Kahneman and Tversky had published a series of quirky papers exploring the ways human judgment may be distorted when we are making decisions in conditions of uncertainty. When we are trying to guess which 18-year-old baseball prospect will become a big-league all-star, for example. To a reader who is neither psychologist nor economist (i.e., me), these papers are not easy going, though I am told that compared with other academic papers in their field they are high literature. Still, they are not so much written as constructed, block by block. The moment the psychologists uncover some new kink in the human mind, they bestow a strange and forbidding name on it (“the availability heuristic”). In their most-cited paper, cryptically titled “Prospect Theory,” they convinced a lot of people that human beings are best understood as being risk-averse when making a decision that offers hope of a gain but risk-seeking when making a decision that offers a chance to escape a certain loss. In a stroke they provided a framework to understand all sorts of human behavior that economists, athletic coaches, and other “experts” have trouble explaining: why people who play the lottery also buy insurance; why people are less likely to sell their houses and their stock portfolios in falling markets; why, most sensationally, professional golfers become better putters when they’re trying to save par (avoid losing a stroke) than when they’re trying to make a birdie (and gain a stroke).
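
To make the gain/loss asymmetry concrete, here is a minimal sketch of the value function at the heart of prospect theory: concave for gains, convex and steeper for losses. The particular parameters (0.88 and 2.25) are the commonly cited estimates from Kahneman and Tversky's later, 1992 cumulative version of the theory, not from the paper described above, and the sketch ignores their probability weighting for simplicity:

    # Prospect-theory value function: concave over gains (risk-averse),
    # convex and steeper over losses (risk-seeking and loss-averse).
    # Parameters are the commonly cited 1992 estimates; probability
    # weighting is omitted to keep the sketch simple.
    def value(x, alpha=0.88, lam=2.25):
        return x ** alpha if x >= 0 else -lam * ((-x) ** alpha)

    # Over gains: a sure $500 is valued above a 50/50 shot at $1,000.
    print(value(500), 0.5 * value(1000))    # roughly 237 vs. 218
    # Over losses: a 50/50 shot at losing $1,000 feels less bad than a
    # sure $500 loss, so people gamble to avoid the certain loss.
    print(0.5 * value(-1000), value(-500))  # roughly -491 vs. -534

On this toy calculation the same person turns down a fair gamble over gains and embraces one over losses, which is the pattern behind the golfers and the reluctant home-sellers.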


When you wander far enough into the work of Kahneman and Tversky, you come to find their fingerprints in places you never imagined even existed. Their influence is alive in the work of the psychologist Philip Tetlock, who famously studied the predictions of putative political experts and found they were less accurate than predictions made by simple algorithms. It’s present in the writing of Atul Gawande (Better, The Checklist Manifesto), who has shown the dangers of doctors who place too much faith in their intuition. It inspired the work of Terry Odean, a finance professor at U.C. Berkeley, who examined 10,000 individual brokerage accounts to see whether the stocks the investors bought outperformed the stocks they sold, and found that the reverse was true. Recently, The New York Times ran an interesting article about a doctor and medical researcher in Toronto named Donald Redelmeier, whose quirky research projects upended all sorts of assumptions you might not even know you had. He’d shown that changing lanes in traffic is pointless, for instance, and that an applicant was less likely to be admitted to a medical school if he was interviewed on a rainy day. More generally, he had demonstrated the power of illusions over the human mind. The person who had sent him down this road in life, he told the Times reporter, was his old professor Amos Tversky.


It didn’t take me long to figure out that, in a not so roundabout way, Kahneman and Tversky had made my baseball story possible. In a collaboration that lasted 15 years and involved an extraordinary number of strange and inventive experiments, they had demonstrated how essentially irrational human beings can be. In 1983—to take just one of dozens of examples—they had created a brief description of an imaginary character they named “Linda.” “Linda is thirty-one years old, single, outspoken, and very bright,” they wrote. “She majored in philosophy. As a student, she was deeply concerned with issues of discrimination and social justice, and also participated in antinuclear demonstrations.” Then they went around asking people the same question:


Which alternative is more probable?


(1) Linda is a bank teller.


(2) Linda is a bank teller and is active in the feminist movement.


The vast majority—roughly 85 percent—of the people they asked opted for No. 2, even though No. 2 cannot possibly be more probable than No. 1. (If No. 2 is true, No. 1 must be true as well.) The human mind is so wedded to stereotypes and so distracted by vivid descriptions that it will seize upon them, even when they defy logic, rather than upon truly relevant facts. Kahneman and Tversky called this logical error the “conjunction fallacy.”
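
The underlying logic fits in one line (my notation, not the authors'): a conjunction can never be more probable than either of its parts, because

\[
P(\text{teller} \wedge \text{feminist}) = P(\text{teller}) \cdot P(\text{feminist} \mid \text{teller}) \le P(\text{teller}),
\]

and the conditional probability on the right can be at most 1. No description of Linda, however vivid, can change that inequality.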


Link to remainder of article: http://www.vanityfair.com/business/features/2011/12/michael-lewis-201112