“You don't have to be the Emperor of Japan to get fun out of rationality. If you can avoid a lot of hopeless messes and you can help other people (avoid) a lot of their messes, you can be a very constructive citizen. If you're always rational. Being rational means that you avoid certain things, it's like 'I don't want to go where I'm going to die.' I don't want to go where the standard result is awful.”
Most of us would agree with Charlie Munger on the importance of being rational in life, yet few of us can actually act rationally most of the time, particularly when it comes to making investment decisions. Today I’d like to share my struggles in trying to improve my rationality. Hopefully, by putting it all together and saying it out loud, I’ll pound it in.
As I look back, I realize I’ve gone through a few of what I call “stages of rationality.” The first stage covers roughly the first two to three years of my value investing journey. During stage one, my inability to act rationally was mainly due to ignorance combined with a ridiculous amount of overconfidence. The interesting thing is that when I re-read some of my early journals, I found that I actually thought I was very rational and logical back then. It was just that I got most of the facts and logic wrong, which means what I thought was rational and logical was the exact opposite: irrational and illogical. This is a stage most of us will go through. As the saying goes, “When the student is ready, the teacher will appear.” Very often that teacher is “Mr. Market.”
The second stage came during my days as a professional analyst, when I realized that the environment and incentives are suboptimal for staying rational. A security analyst’s job at a fund is to, well, analyze securities rationally and recommend ideas to the decision makers. But in many funds, the incentive system actually discourages rationality. Why? Because in most funds, an analyst’s bonus is based on how many winners he or she generates, and there is usually no claw-back or significant downside risk on losers.
Imagine a fund with, say, eight to 10 analysts and a limited number of ideas that can make their way into the portfolios (say 25 to 30). What is the incentive for an analyst to be rational and conservative? What is the consequence of overestimating the worst case and underestimating the bull case, that is, of being conservative? It means your ideas most likely won’t be picked, because your numbers are not as compelling as those of the analyst who underestimates the worst case and overestimates the bull case. The incentive system is often set up to encourage over-optimism and discourage conservatism. It’s a pernicious environment.
And worse yet, most analysts have selective memories: they remember only the times when their aggressiveness was rewarded, and they make sure to remind their bosses of the times when they were right. The outcome is that security analysts, as a group, are much less rational than they think they are. Myself included.
The third stage, which I hope to achieve, is when one can develop a system of feedback loops and confirming and disconfirming evidence in order to act rationally. One project I have taken on recently is to re-examine my past investment decisions, especially the worst-case assumptions, so that I can objectively evaluate my own decisions. So far, it has proven to be a humbling experience. It’s embarrassing to list them one by one, but here are some examples:
Current working examples are my investments in Allergan plc (AGN) and Celgene (CELG). For Allergan, my estimated worst case was about $160 to $164 a share. The disconfirming evidence is already here: Allergan hit $144 today. For Celgene, my estimated worst case at the time of the investment was between $80 and $84 a share. We’ll see how wrong I am on Celgene.
Of course, there are times when my worst cases were more conservative than what actually happened, as was the case with Nestle (NSRGY) and Church &amp; Dwight (CHD). But most of the time, I was too optimistic on the worst case.
The lesson is that, with this dataset, I can prove that I’m an idiot with a penchant for underestimating the worst case. I can’t argue with the data. What I can learn from looking at the dataset objectively is that, most of the time, I have underestimated the worst case by 15% to 30%. By looking at the cold facts, I get a sense of my track record.
The other point I want to make is that the exercise is not meant to be precise. I can’t say, “Well, from now on I’ll just cut my worst-case estimate by 25% and whatever I get will be the answer.” There are many factors that affect a worst-case estimate, and there is a good amount of randomness. The point is that the track record is a starting point and can only be used as a reference. But without the track record, I’m more likely to fool myself. You can build a track record not only for the worst-case analysis but also for other parts of the analysis, such as growth assumptions.
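For readers who want to try the same exercise, the feedback loop can be kept as a simple log of estimated worst cases against what actually happened. Below is a minimal sketch in Python; the tickers and prices are made-up illustrations, not the author's actual dataset, and the "miss" metric (how far the estimated worst case sat above the actual low) is just one reasonable way to score these records.

```python
# A minimal worst-case track record, as described above.
# Tickers and prices here are hypothetical illustrations.
records = [
    # (ticker, estimated worst case, actual low reached)
    ("AAA", 160.0, 130.0),
    ("BBB", 82.0, 60.0),
    ("CCC", 50.0, 55.0),  # here the worst case was conservative enough
]

def worst_case_miss(estimated, actual):
    """Percent by which the estimated worst case exceeded the actual low.
    Positive means the worst case was too optimistic (underestimated)."""
    return (estimated - actual) / estimated * 100

misses = [worst_case_miss(est, act) for _, est, act in records]
too_optimistic = [m for m in misses if m > 0]

print(f"Too optimistic on {len(too_optimistic)} of {len(records)} positions")
print(f"Average miss when too optimistic: "
      f"{sum(too_optimistic) / len(too_optimistic):.1f}%")
```

The output is only a reference number, in keeping with the point above: it tells you the direction and rough size of your bias, not a correction factor to apply mechanically.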
As always, I’m curious to hear your thoughts and ideas on your rationality-improvement system.
“What is the central theme that the people in this room represent? I’d argue that it’s rationality rather than to make more money than other people. I’d argue that rationality is a high moral duty. It’s the idea that binds us all together. I think that is a really good idea. It requires that you avoid taking in a lot of the nonsense that’s conventional in your time. There’s always a lot of nonsense in anyone’s time. It requires gradually developing systems of thought that improve your batting average of thinking correctly.”
Disclosure: Long AGN, CELG, CFX, DISCK, TRIP.