I had read the earlier parts of the book before I started this blog, and I will get to those as well. For now, I’ll be starting from Chapter 11.
Chapter 11: Anchors
- What is it: The tendency to rely too heavily on the first piece of information seen (the “anchor”) when making decisions. This happens regardless of whether that information is random or informative. Individuals subconsciously struggle to adjust away from the anchor, sometimes convincing themselves that they are being objective when they are not.
- How is it used against us: Anchors can be deliberately set to influence our decisions. For example, when a high price is mentioned first, subsequent prices seem more reasonable by comparison. This is why salespeople often start with a high price: it makes the item seem worth more than it is.
- How to counter it: Assume that any number or statistic presented has an anchoring effect on you and then use System 2 (the more deliberate, logical, and slow-thinking part of our brain) to question the anchor (i.e., is it informative or purely arbitrary?). In negotiations, if presented with an anchor that’s unreasonable, make it known that discussions are off the table until a more reasonable anchor is presented.
- How is it used in investing: If you bought a stock at $100 and it’s now at $50, you might instinctively be tempted to hold onto it until it reaches $100 again, even if the fundamentals no longer justify that price. When valuing companies, the many price targets circulating in the market can anchor your estimates, making them overly optimistic or pessimistic along with the crowd.
Chapter 12: The Science of Availability
- What is it: The ease with which examples of an event come to mind influences our perception of how likely that event is, even when that impression isn’t backed by data. If we can’t easily recall examples of an event, we tend to underestimate its likelihood (and vice versa).
- How is it used against us: The power of the media or our social circle is most prominent here. The more a story is covered or an idea discussed, the more available it is to us, and the more likely we are to think it’s common. This can lead to overestimating the likelihood of certain events, which can induce irrational fears or biases.
- How to counter it: This bias is largely an effect of System 1. To counter it, engage System 2 to question where the information came from. Ask yourself whether it is based on a representative sample (of the general population) or just the most recent or vivid examples that come to mind. If you’re making a decision based on easily available information, seek out more data or examples to get a more accurate picture. Even if System 2 can’t find a statistically rigorous answer, the act of questioning can counteract System 1’s initial impression (as long as we have some sort of rationale).
- How is it used in investing: Sensationalist articles and news stories can make us think that a certain event is more likely than it is. We don’t want to be slaves to the narrative that the media chooses to influence us with. Being aware of the availability bias and seeking out more data helps us make more informed decisions, be contrarian when necessary, and exploit opportunities that others might be missing.
Chapter 13: Availability, Emotion, and Risk
- Many of us like to believe that we think rationally, but our emotions play a significant role in our decision-making.
- What we hear or see most often (i.e., media coverage) shapes our views, which in turn shape future media coverage. This creates a feedback loop that can lead to irrational fears or biases. When information is insufficient, we tend to take the easy way out and use heuristics (i.e., how we feel about things) to make decisions or form opinions. This is the affect heuristic, in which we answer a hard question (What do I think about it?) by substituting an easier one (How do I feel about it?).
- The more we like something (i.e., hear praise about it from our peers or the media) without it being backed by data, the more we overstate its benefits and understate its risks.
The affect heuristic simplifies our lives by creating a world that is much tidier than reality.
- Availability cascade: A self-sustaining chain of events in which a group of people learns about a relatively minor issue (small in comparison to more pressing concerns), often through the media. This initial awareness leads to collective worry, which in turn draws greater media attention. The increased coverage amplifies public concern and engagement. To capitalize on the trend and attract more viewership, media outlets escalate the sensationalism with exaggerated headlines and stories, perpetuating the cycle. There can also be twisted individuals who delight in ensuring a continuous flow of worrying news, regardless of its validity. Attempts by rational voices to calm the situation are often met with intense criticism, as the public accuses those voices of harboring hidden agendas (e.g., “If everyone around me believes this, it must be true, and the experts are simply trying to protect their own interests”). Eventually, public opinion becomes a tidal wave, and it becomes safer to swim with the tide than against it.
- The previous point has serious consequences for public policy: resources get allocated to sensationalist issues over truly pressing concerns that would yield more benefit per unit of resources spent. This is because politicians are incentivized to act on the issues that are most visible to the public, rather than those that are most important. Good politicians and leaders will do well to balance the perceived concerns of the public against the concerns that are actually most pressing (i.e., as advised by impartial experts).
- How to counter it: The best way to counter the affect heuristic is to be aware of it and to engage System 2 to question our feelings. We should also seek out more data to make a more informed decision.
- How is it used in investing: We have to constantly remind ourselves that our social circles, and the echo chambers from which we get company news and information, can thwart our objectivity. To combat this, we should actively seek out contrasting views and data to make more informed decisions. This can help us avoid the herd mentality and exploit opportunities that others might be missing.
Chapter 14: Tom W’s Specialty
- This chapter revolves around Bayes’ theorem, a mathematical formula for determining the probability of a hypothesis given prior knowledge and new evidence. It’s a way to update our beliefs based on new information. The evidence we receive has, in itself, a likelihood of being true, which we often forget to consider. We also tend to ignore the base rate of the population when making judgements, which leads to inaccurate estimates.
- Unproven “evidence” such as “the next-door neighbour wears specs and a black hoodie all the time and seems like a geek” leads us to assign a higher probability to the hypothesis that our neighbour is a computer geek. It’s interesting how such a baseless judgement influences our probability estimates (i.e., the base rate of computer scientists should be much lower than the base rate of people studying business).
- In some situations, stereotypes are indeed true. But without data on how likely a stereotype is to hold, we can’t make an accurate estimate, especially when it causes us to neglect a base rate that points to a different conclusion.
- How to counter it: Remind ourselves to think like a statistician and deliberately activate System 2 to question System 1’s conclusions (if it’s not deliberate, System 2 will simply find “facts” to justify System 1’s ideas). Interestingly, “frowning generally increases the vigilance of System 2 and reduces both overconfidence and the reliance on intuition”. If the evidence you have is not trustworthy, the base rates should dominate your estimates. In statistical terms: the probability of a hypothesis given the evidence equals the probability of the evidence given the hypothesis, times the prior probability of the hypothesis, divided by the overall probability of the evidence. This is how we update our beliefs with new information (see the worked sketch at the end of this chapter’s notes).
You should not let yourself believe whatever comes to your mind. To be useful, your beliefs should be constrained by the logic of probability.
- Anchor your judgement of the probability of an outcome on a plausible base rate.
- Question the diagnosticity of your evidence.
- We tend to think “WYSIATI” (What You See Is All There Is) and ignore other possibilities. We are also subject to “associative coherence”, in which we make up stories based on what we think we know so that complex situations seem more coherent. Consequently, we end up believing the stories we make up.
- How is it used in investing: If we analyse the management and conclude that they are competent and have virtues we admire, we might be more likely to invest in the company. But we should also consider the base rate of companies succeeding in that industry despite having competent management. Many managers know what investors want to hear, but whether they are truly good at running the business is another question. How sure are we that these seemingly competent managers are good at optimising the company’s future cash flows and running the day-to-day operations?
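To make the base-rate idea concrete, here is a minimal sketch of Bayes’ theorem in Python, mirroring the “geeky neighbour” example above. All the numbers are hypothetical, chosen only to illustrate the point:

```python
# A hypothetical illustration of base-rate neglect using Bayes' theorem:
# P(H | E) = P(E | H) * P(H) / P(E). All numbers below are made up.

def posterior(prior, p_evidence_given_h, p_evidence_given_not_h):
    """Update the probability of hypothesis H after seeing evidence E."""
    # P(E) = P(E|H) * P(H) + P(E|not H) * P(not H)
    p_evidence = (p_evidence_given_h * prior
                  + p_evidence_given_not_h * (1 - prior))
    return p_evidence_given_h * prior / p_evidence

# Assume (hypothetically) that 3% of people are computer geeks (the base rate),
# that 70% of geeks match the "specs and black hoodie" description,
# and that 20% of everyone else matches it too.
p = posterior(prior=0.03, p_evidence_given_h=0.70, p_evidence_given_not_h=0.20)
print(f"P(geek | description) = {p:.2f}")  # ~0.10
```

Even evidence that feels strongly diagnostic only moves the estimate from 3% to roughly 10%, nowhere near the certainty System 1 jumps to. And the less trustworthy the evidence, the more the base rate should dominate.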