I recently read Thinking, Fast and Slow for the first time, one of the go-to books on human biases. If you’re unfamiliar, there are many things the human brain does that are both automatic and unconscious. You can’t look at “2+2=” and not have “4” pop to the forefront of your thoughts—it’s automatic. Most of these unconscious processes are there for our own good thanks to generations of evolution, but sometimes these innate biases can get in our way. Being aware of these biases is the first step to being able to recognize and overcome them. Thinking, Fast and Slow dives into many biases that affect us, but I want to talk about the ones that relate most to investing.
Halo effect

The halo effect is how our brain takes one trait of a person and applies it to many other traits we haven’t observed. An attractive person is more likely to be viewed as smart, creative and a good leader, even though all we know about that person is their physical appearance. The halo effect is also why first impressions are so important: the first few traits we learn about a person can easily overshadow what we subsequently learn about them. We have intuitive opinions about almost everything we come in contact with. It’s common to like or dislike someone we just met without even knowing why.
This is one of the most important biases to remember when meeting with management teams. The halo effect can easily trick our mind into liking or disliking someone more than we should. Meeting with a tall, well-dressed CEO? You’re more likely to give him the benefit of the doubt when later discussing his capital allocation decisions.
Substitution

Humans don’t like difficult questions; our brains prefer being lazy. When a difficult question is asked, our brain will often subconsciously substitute the answer to an easier one. If you are asking yourself about a CEO’s capital allocation skills, you may subconsciously refer back to those positive first impressions. The question you actually end up answering is “Do I like the CEO?” as opposed to “Is he a good capital allocator?”
Planning fallacy

Plans, forecasts, and company valuations tend to be unrealistically close to best-case scenarios. One reason is how bad our brains are at dealing with small risks: often we ignore them altogether. There are many ways for any investment to not go as planned, and although most of them are individually too improbable to anticipate (unknown unknowns), the likelihood that at least one thing goes wrong is high. This is one of the many reasons a large margin of safety is crucial: so many estimates are best-case scenarios that don’t factor in random things going wrong. Errors of prediction are inevitable because the world is unpredictable.
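The compounding of small risks is easy to see with a little arithmetic. A minimal sketch (my own illustration, not from the book), assuming the individual risks are independent:

```python
# Toy illustration: even when each individual risk is small, the chance
# that *something* goes wrong grows quickly with the number of
# independent risks: P(at least one) = 1 - (1 - p)^n.
def prob_something_goes_wrong(p_each, n_risks):
    """Probability that at least one of n independent risks materializes."""
    return 1 - (1 - p_each) ** n_risks

# Ten independent 5% risks: roughly a 40% chance at least one hits.
print(round(prob_something_goes_wrong(0.05, 10), 2))  # 0.4
```

Each risk alone looks ignorable, but a valuation that ignores all ten is a best-case scenario.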
I think one of the best ways to fight this planning fallacy is to write a pre-mortem (this idea was introduced to me by Christian Ryther of Curreen Capital). This pre-mortem is now the final question on my investment checklist: “Imagine you purchased this stock today and you are now two years in the future. The stock’s performance has been a disaster. What happened?” It’s one thing to come up with a bear case for each investment, but writing out a pre-mortem really forces me to think about the many small things that may not be factored into my valuation.
Confirmation bias

People naturally seek evidence that is congruent with their current beliefs. The book gives the example of people answering the questions “Is Sam [or any acquaintance] friendly?” and “Is Sam unfriendly?” very differently, because our brains default to trying to confirm the question as asked. The two questions cause the brain to retrieve different memories. An essential property of memory is that it is associative: it retrieves thoughts related to the original idea, then thoughts related to those thoughts, and so on. The book stresses the phrase “what you see is all there is”—information not retrieved from memory might as well not exist.
“What you see is all there is” also leads us to be overconfident in our predictions (the planning fallacy again). Counterintuitively, we tend to be more confident about things we know less about. This is because our brains favor consistent stories over complete ones, and it’s easy to build a consistent story from a few loosely connected ideas.
I think the main takeaway here is to force yourself to do far more research even after you feel very confident in an investment idea. This is also where a checklist comes in handy. A checklist is the final step in my investment process, completed before any purchase. Mine is 50-60 questions designed to make sure I didn’t overlook anything in my research. There have been multiple times when I’ve gone through the checklist and realized I missed something. It’s easy to do.
Outcome bias

Most people assess the quality of decisions based on outcomes, not on the quality of the process (in poker, this is called results-oriented thinking). This outcome bias makes it almost impossible to properly evaluate a decision after the fact. I’ve noticed many investors (including myself) are prone to finding errors in every unsuccessful investment while never seeking errors in profitable ones. Every investment has a bear case, and sometimes it comes to fruition through no fault of the investor. Likewise, many bad investments make investors money in the short term (tech companies in the late 90s).
One thing that gets drilled into every poker player’s head is how much sample size matters. This is much more difficult in investing. Even active investors make only a couple hundred decisions per year (concentrated investors like myself may make 10-20 decisions all year). It’s important to analyze past investments to improve, but it’s also important to remember how small a sample you’re looking at. If you’ve had 2-3 investments go bad for similar reasons, remember how small a sample that is. Of course you don’t want to attribute all losses to bad luck, but you also don’t want to overemphasize patterns that may be due to short-term variance.
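To see how noisy a small sample is, here is a quick sketch using made-up numbers (a hypothetical 40% per-investment loss rate over 15 decisions; none of these figures are from the book):

```python
from math import comb

# Suppose your process is genuinely sound but still loses 40% of the
# time per investment. Over only 15 decisions, how likely are 6 or
# more losses anyway? Exact binomial tail: P(X >= k).
def prob_at_least_k_losses(n, k, p_loss):
    """P(at least k losses in n independent decisions)."""
    return sum(comb(n, j) * p_loss**j * (1 - p_loss)**(n - j)
               for j in range(k, n + 1))

# Roughly a 60% chance of 6+ losses: a cluster of losers proves little.
print(round(prob_at_least_k_losses(15, 6, 0.40), 2))
```

Even a good process produces ugly-looking streaks at this sample size, which is why single-digit samples of "similar mistakes" deserve skepticism.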
Anchoring

This is one bias that every serious investor is probably aware of. When making estimates that involve numbers, our answers are inevitably anchored close to numbers we’ve recently come in contact with. The most amazing thing about the anchoring effect is that it works even when the two numbers are completely unrelated. In one study from the book, participants who wrote down the last four digits of their social security number just before being asked “How many doctors are in your city?” anchored their answers close to their SSN. The classic case in investing is that looking at stock prices anchors your estimate of intrinsic value to the current price. One of the best ways to counteract these effects is to deliberately think the opposite. “Invert, always invert.”
Loss aversion

In situations where both a gain and a loss are possible, people act extremely risk-averse. In bad situations, where a loss is almost certain, people are overly willing to risk even greater losses. This is why people who own a declining stock are more willing to average down or ride it out for a chance to get back to even than to sell and move on. The book put it like this: “I could close the XYZ account and score a success for my record as an investor or I could close ABC and add a failure to my record. Which would I rather do?”
This is an instance of narrow framing, where each decision is considered in a vacuum. Broad framing considers each decision as part of the larger group of all decisions you will make in your lifetime. Broad framing makes you more likely to think in terms of expected value instead of short-term risk aversion. It doesn’t matter whether a stock is up or down when you make a sell decision; a comprehensive view helps you sell the stock least likely to do well in the future, without regard to whether it is a winner or a loser.
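A toy illustration of narrow vs. broad framing, in the spirit of the repeated-bet examples the book discusses (the dollar amounts and the 2x loss-aversion weight below are my own assumptions):

```python
from math import comb

# Hypothetical bet: 50/50 chance to win $200 or lose $150.
WIN, LOSS, P = 200, 150, 0.5

# Narrow frame: a loss-averse agent weighs losses ~2x, so a single bet
# "feels" bad even though its true expected value is positive.
felt_value_single = P * WIN - P * 2 * LOSS   # -50.0: feels like a loser
true_ev_single = P * WIN - P * LOSS          # +25.0: actually favorable

# Broad frame: over 100 such bets, ending down overall requires 42 or
# fewer wins (200w - 150(100 - w) < 0), which is unlikely.
n = 100
p_end_negative = sum(comb(n, w) * 0.5**n for w in range(0, 43))

print(felt_value_single, true_ev_single)  # -50.0 25.0
print(round(p_end_negative, 3))           # small: roughly 0.07
```

Viewed one at a time the bet gets rejected; viewed as one of a hundred lifetime decisions, it is clearly worth taking. Sell decisions work the same way.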
Loss aversion also means humans experience roughly twice as much pain from losses as pleasure from equivalent gains. Closely following daily fluctuations in the stock market is therefore a losing proposition: the pain of the frequent small losses exceeds the pleasure of the equally frequent small gains, and the natural reaction to this steady stream of bad news is increased loss aversion.
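Here is a rough sketch of why checking daily hurts, using assumed market numbers (7% annual return, 16% annual volatility, normally distributed returns; these figures are my own illustration, not the book's):

```python
from math import erf, sqrt

# P(return < 0) when returns are normal(mu, sigma).
def prob_down(mu, sigma):
    return 0.5 * (1 + erf(-mu / (sigma * sqrt(2))))

mu_y, sigma_y = 0.07, 0.16                  # assumed annual drift and volatility
mu_d, sigma_d = mu_y / 252, sigma_y / sqrt(252)   # ~252 trading days per year

# Weigh the pain of a down period twice as heavily as the pleasure of an up one.
def felt_experience(p_down):
    return (1 - p_down) * 1 - p_down * 2

print(round(prob_down(mu_d, sigma_d), 2))  # ~0.49: nearly half of all days are red
print(round(prob_down(mu_y, sigma_y), 2))  # ~0.33: only a third of years are red
```

With the 2x pain weighting, the daily checker's experience nets out negative (about half his observations hurt twice as much as the other half feel good), while the annual checker's nets out roughly even despite holding the exact same stock.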
Sunk cost fallacy
Related to loss aversion, this is the decision to invest more money in a declining stock when better investments are available elsewhere. The book made an interesting observation here that I’d never considered. Consider a CEO who is thinking about sinking more money into a failing project. Throwing good money after bad may be an error from the view of an outside investor, but think about the CEO’s incentives. He can either cancel the bad project and leave a permanent stain on his record, or invest more money and get a chance to save both the project and his record. At the very least, this delays the day of reckoning. This problem lends support to bringing in new management during bad times. It’s not necessarily that the new team is expected to be smarter or more skilled; it’s that they won’t have the same mental commitments to failing projects.
Thinking, Fast and Slow ends with the point that human biases are impossible to get rid of. The best you can do is consciously stay aware of them. Maybe this way you’ll notice when a question is worded in a way that will affect your answer, or you’ll take mental note of numbers that could anchor your estimates. The author, Daniel Kahneman, put it like this: “Except for some effects that I attribute mostly to age, my intuitive thinking is just as prone to overconfidence, extreme predictions, and the planning fallacy as it was before I made a study of these issues. I’ve improved only in my ability to recognize situations in which errors are likely.”
Final fun facts
Remember how our brains don’t like difficult questions? For similar reasons, we prefer simple language that is easy to process. Companies with easily pronounceable names (something like Apple, not Valeant) do better for the first week after IPOing, though the effect disappears over time. Likewise, stocks with pronounceable tickers (such as HOS) outperform others (like NVGS) and the small advantage appears to remain over time.
Finally, many investors have pointed out the survivorship bias (and hindsight bias) in books like The Outsiders. As enjoyable as that book was, I’m not sure there’s as much to learn from it as I originally thought. Interestingly, Thinking, Fast and Slow shows that after Built to Last was published (same basic idea as The Outsiders), the gap in corporate profitability and stock returns between the successful and less successful companies shrank to almost nothing. Kahneman attributed much of this to mean reversion—companies that outperform for so many years probably had above average luck.