More Than You Know: Experts and the Issues They Confront

To be an effective expert, you need to know a little bit about a lot of things, not a lot about only one thing

Dec 16, 2019

What should we think about experts, especially those in the financial arena?

In chapter six of his book "More Than You Know: Finding Financial Wisdom in Unconventional Places," Michael Mauboussin observed that we tend to think highly of experts, even when their records are not that strong.

He used the example of a 1996 competition that pitted a leading cardiologist against a computer running an artificial intelligence program. Both the doctor and the computer examined some 10,000 EKGs, trying to determine which showed real heart attacks and which did not (clinical records and follow-ups presumably established whether each reading was correct). The doctor, an expert, was correct in only 55% of cases, little better than flipping a coin. The computer was correct in 66% of cases.
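
To put the anecdote's arithmetic in perspective, here is a minimal Python sketch. The framing is my own, not from the book; the only figures taken from the example above are the 10,000-case sample and the 55%/66% hit rates. It counts the misreads implied by each accuracy rate and uses a simple normal approximation to show that even 55% is statistically distinguishable from coin-flipping at this sample size, even though the practical edge per reading is small.

```python
import math

N = 10_000                      # EKGs reviewed (figure from the anecdote)
doctor_acc, computer_acc = 0.55, 0.66

# Misclassifications implied by each accuracy rate
doctor_errors = round(N * (1 - doctor_acc))      # 4,500
computer_errors = round(N * (1 - computer_acc))  # 3,400
print(f"Doctor errors:   {doctor_errors}")
print(f"Computer errors: {computer_errors}")
print(f"Extra misreads by the expert: {doctor_errors - computer_errors}")

# Normal-approximation z-score against pure coin-flipping (p = 0.5).
# Even a 55% hit rate is far from chance at this sample size; the
# practical edge per individual reading is still modest.
z = (doctor_acc - 0.5) / math.sqrt(0.25 / N)
print(f"z-score vs. chance for the doctor: {z:.1f}")
```

Over 10,000 readings, the computer's 11-point edge translates into roughly 1,100 fewer misclassifications, which is why "little better than a coin flip" matters in practice even when the expert's hit rate is technically above chance.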

So, what are we to think of that wide range of people referred to as experts? Why do they do so well in some areas and so poorly in others? When we go for heart surgery, we obviously want and need an expert, yet as the anecdote above shows, such experts aren’t always right.

Mauboussin recommended that we start by focusing on thinking tasks and the nature of the problems experts address. Problems can be placed on a continuum: at one end are straightforward issues (“static, linear, and discrete systems”); at the other are complex issues (“dynamic, non-linear, and continuous”). Mauboussin called issues at the simple end “rules-based” and those at the complex end “probabilistic.”

He added, “While tens of thousands of hours of deliberate practice allows experts to internalize many of their domain’s features, this practice can also lead to reduced cognitive flexibility. Reduced flexibility leads to deteriorating expert performance as problems go from the simple to the complex.”

A couple of things are happening here. The first is a psychological concept called “functional fixedness,” which occurs when we grow accustomed to thinking of something in one way and thus have difficulty thinking about it in any other way. In other words, we get stuck in one perspective and have trouble seeing alternatives.

The second is “reductive bias,” the human tendency to treat complex problems as if they were simple. According to Mauboussin, that means we evaluate systems based on their attributes rather than their circumstances (an issue also discussed in a previous chapter). For example, investors may focus on statistically cheap stocks (an attribute) but ignore whether the low price actually signifies value (a circumstance).
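
As a toy illustration of the attribute-versus-circumstance distinction, consider the sketch below. All tickers and figures are hypothetical, and the "earnings trend" field is my own stand-in for a circumstance check; the point is simply that a screen built on an attribute alone can sweep up value traps.

```python
# Hypothetical screen contrasting an attribute (low P/E) with a
# circumstance check (is the business deteriorating?). All tickers
# and figures are made up for illustration.
stocks = [
    {"ticker": "AAA", "pe": 6.0,  "earnings_trend": -0.30},  # cheap, shrinking
    {"ticker": "BBB", "pe": 7.5,  "earnings_trend": +0.10},  # cheap, growing
    {"ticker": "CCC", "pe": 22.0, "earnings_trend": +0.15},  # expensive, growing
]

attribute_picks = [s["ticker"] for s in stocks if s["pe"] < 10]
circumstance_picks = [
    s["ticker"] for s in stocks
    if s["pe"] < 10 and s["earnings_trend"] > 0  # cheap AND not deteriorating
]

print("Attribute-only screen:   ", attribute_picks)     # ['AAA', 'BBB']
print("Attribute + circumstance:", circumstance_picks)  # ['BBB']
```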

The solution to functional fixedness and reductive bias is what's called "expert flexibility." According to psychologists, it comes in two forms. In the first, experts internalize many of the domain's circumstances and can thus respond to its contexts and effects. In the second, experts recognize when their existing models are unlikely to work and look beyond them for solutions. The second form is clearly essential for addressing complex issues.

Next, Mauboussin turned to rules-based systems, where computers consistently outperform humans. Think of computers playing chess, which they do very well once programmed with the rules of the game plus the rules of thumb that strong players use. Computers have access to the same rules as humans but are not affected by human biases or emotions. The same may be true of the EKG readings in the earlier example.

When issues are defined by rules, they are relatively simple. Complex issues, on the other hand, are defined by their lack of rules, making computers less useful (although that is changing as artificial intelligence progresses).

As problems become more complex, expert opinions gain value, but only up to a point, because more complex problems are also more probabilistic. At the complex end of the spectrum, Mauboussin reported, collectives tend to outperform experts, and consensus among the experts themselves is low. As examples of complexity, he pointed to the stock market and the economy.
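
A small simulation can show why averaging a collective's forecasts tends to beat a lone forecaster on probabilistic problems. This is a minimal sketch under assumed parameters (100 unbiased forecasters with independent, normally distributed errors); none of the numbers come from Mauboussin, and real forecast errors are rarely fully independent.

```python
import random
import statistics

random.seed(42)

TRUE_VALUE = 100.0   # the unknown quantity being forecast (illustrative)
N_EXPERTS = 100      # size of the collective (assumption)
N_TRIALS = 1_000

expert_err, crowd_err = [], []
for _ in range(N_TRIALS):
    # Each forecaster is unbiased but noisy (assumed std dev of 20).
    forecasts = [random.gauss(TRUE_VALUE, 20) for _ in range(N_EXPERTS)]
    # A lone "expert" is one draw; the collective is the average.
    expert_err.append(abs(forecasts[0] - TRUE_VALUE))
    crowd_err.append(abs(statistics.mean(forecasts) - TRUE_VALUE))

print(f"Mean error, lone forecaster: {statistics.mean(expert_err):5.2f}")
print(f"Mean error, crowd average:   {statistics.mean(crowd_err):5.2f}")
```

With independent errors, the crowd average's error shrinks roughly with the square root of the group size, which is why the simulated collective beats the lone forecaster by nearly an order of magnitude here.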

He also reintroduced the research of psychologist Phil Tetlock, who asked almost 300 experts to make tens of thousands of political and economic predictions over nearly 20 years. As reported in the book's introduction, the results were unimpressive: “Expert forecasters improved little, if at all, on simple statistical models.” Tetlock's conclusion was that expertise does not lead to good predictions when the problems are complex.

While the overall results reflected badly on the collective group of experts in the study, some were better than others. Mauboussin wrote:

“What mattered in predictive ability was not who the people were or what they believed, but rather how they thought. Using a metaphor from Archilochus (via Isaiah Berlin), Tetlock segregated the experts into hedgehogs and foxes. Hedgehogs know one big thing and extend the explanatory reach of that thing to everything they encounter. Foxes, in contrast, tend to know a little about a lot and are not wedded to a single explanation for complex problems.

We can say that hedgehogs have one power tool while foxes have many tools in their toolbox. Of course, hedgehogs solve certain problems brilliantly—they certainly get their fifteen minutes of fame—but don’t predict as well over time as the foxes do, especially as conditions change.”

Conclusion

At this point, it's worth reminding ourselves that Mauboussin's goal in writing this book was to encourage us to adopt a multi-disciplinary approach to the way we think about investing and investment decisions.

That comes through in his discussion of experts, in which he describes their limited predictive capability. The limits involve not only the human elements of “functional fixedness” and “reductive bias” but also the types of issues or problems they try to solve.

Finally, he reminds us that some experts are more accurate than others. The more accurate ones tend to be those who know a little about a lot: the foxes. Knowing about a lot of things is roughly the same concept as a multi-disciplinary approach or, as Charlie Munger might put it, multiple mental models.
