Defensive medicine is the practice of performing additional diagnostic tests, treatments and even surgical procedures on patients who don’t need them. Medical professionals perform these unnecessary interventions to reduce the risk that they’ll be sued or banned from practicing medicine due to malpractice.
Overtreatment by doctors adds billions of dollars of extra costs to the healthcare system. One estimate puts the number at $45.6 billion in the United States alone. Ironically, this figure is around 80 per cent of the $55.6 billion annual cost of the medical liability system.
In other words, medical professionals spend almost as much on unnecessary medical treatments to avoid litigation as they do in compensation for actual malpractice.
There’s also the impact of defensive medicine on patient health. For example, a patient may be subjected to an unnecessary exploratory procedure such as a biopsy. The procedure results in a short stay in hospital, during which time the patient contracts an infection. The infection results in a deterioration in the patient’s health and a longer stay in hospital.
What causes defensive medicine? And what does this have to do with investing?
Defensive medicine happens because doctors are at risk of making poor treatment decisions in response to the incentives they are faced with. From the doctor’s perspective, an extra test is a small price to pay for a reduced risk of getting sued. Never mind the cost it adds to the healthcare system or the risk it creates for the patient.
Unfortunately, risk means different things to the doctor (getting sued or banned from medicine), the healthcare system (overpaying and overcrowding) and the patient (getting sick or dying). Doctors are very well aware that their treatment decisions will be assessed with the benefit of hindsight; jokingly called the retrospectoscope.
The hindsight bias is one of several behavioural biases, all of which are examples of the problem of information projection. What follows is a brief summary of a lesson taught by Dr Kristof Madarasz as part of the London School of Economics Behavioural Economics and the Modern Economy program which I attended in 2016.
We project information when we act as if the information that we have at the moment is available in other situations or to others, even when it’s clearly not. Information projection results in several behavioural biases, including:
- Hindsight bias
- Curse of knowledge
- Illusion of transparency and the problem of social inference it creates
These biases exist because we struggle to consider situations from different perspectives. For example, we repeatedly underestimate the impact of time, information or another person’s viewpoint on how a decision was made.
Any one of these biases can severely impact the quality of our decision-making process. What was really eye-opening for me is the way these biases work together to affect two tasks that investors must perform repeatedly: evaluating performance and deciding whether a result was due to luck or skill.
Let’s briefly define each of these biases before considering an example of how they can affect our investment decision-making.
Hindsight Bias
We underestimate the effects of additional knowledge, the passing of time and a change in circumstances on our beliefs. Hindsight bias is a form of information projection because we’re projecting our current knowledge, beliefs and circumstances onto our past selves and others.
Studies of the medical and legal professions reveal that experts are no better in dealing with hindsight bias than ordinary people.
Curse of Knowledge
We can’t predict how other people will behave if we know something that they don’t. This is because we project our knowledge onto the other person and expect them to act as we would. The curse of knowledge is a particular problem for experts trying to understand the behaviour of non-experts. Who hasn’t read a research paper where the author has presumed a lot of domain-specific knowledge, making it very difficult for a non-specialist to understand? That’s the curse of knowledge in action.
Even more troubling is the observation that we’re likely to interpret a failure to understand by others as a lack of effort, attention, intelligence or skill. In other words, it’s not our fault that other people can’t read our minds, it’s theirs!
Illusion of Transparency
We overestimate the ability of other people to know what we’re thinking and how we feel. For example, studies have shown that we overestimate the ability of other people to tell when we’re lying.
We also tend to expect that most other people share our opinions and preferences. This affects the accuracy of our judgements about others (social inference).
We’re more likely to mistake other people sharing our preferences as a sign that they agree with us instead of a coincidence (which is more likely). We generally like them more as a result. When people don’t share our preferences, we’re more likely to think negatively of them, even though this could also be a coincidence.
Information projection biases have powerful effects in situations where an expert’s performance is evaluated. There is always more information available at the time of evaluation than was available at the time that the decision was made. The newly available information often includes the outcome, making hindsight bias an ever-present danger.
Performance evaluation is usually carried out by someone else; for example, a client, a consultant, a committee or a board. They are also going to revise history now that they know the outcome. In other words, you now have both the expert and the reviewer second-guessing the decision (i.e. hindsight squared).
The evaluator obviously knew less at the time the decision was made. Otherwise why would they hire the expert to make decisions for them? However, the evaluator now knows more than the expert (relative to when the expert made the decision) when evaluating the expert’s decision. Remember, not only do they know more, they often know a key piece of information – the outcome!
Let’s assume that the expert is right 55 per cent of the time and wrong 45 per cent of the time and that the pain of being wrong is greater than the joy of being right (i.e. prospect theory). And that the reviewer will be tempted by hindsight to believe that they knew the result all along.
The result is that repeated independent evaluation of the expert over time by the reviewer will result in the reviewer believing the expert doesn’t know anything!
Where did the 55 per cent number come from? A successful fund manager can make a lot of money with an ‘edge’ of only 5 per cent. It’s an edge that most fund managers would be very pleased with. To put a 5 per cent edge into perspective, consider blackjack. The house has an edge of around 2 per cent against a novice player and around a 0.5 per cent edge against an experienced player following basic strategy. In other words, casinos make a fortune off a much smaller edge.
But even a fund manager with a very healthy 5 per cent edge carries a significant risk of getting sacked due to information projection.
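The argument above can be sketched in a short simulation. The specific parameter values below are illustrative assumptions, not figures from this column: a loss felt roughly twice as strongly as an equivalent gain (in the spirit of prospect theory), and a hindsight discount that shaves credit off wins because they seem obvious after the fact.

```python
import random

# Illustrative assumptions (not from the column):
EDGE = 0.55               # expert is right 55% of the time
LOSS_AVERSION = 2.25      # a loss hurts ~2.25x as much as a gain pleases
HINDSIGHT_DISCOUNT = 0.8  # wins get partial credit: "I knew it all along"

def reviewer_score(n_decisions: int, seed: int = 42) -> float:
    """Cumulative subjective score a biased reviewer assigns the expert."""
    rng = random.Random(seed)
    score = 0.0
    for _ in range(n_decisions):
        if rng.random() < EDGE:
            score += 1.0 * HINDSIGHT_DISCOUNT  # win, discounted in hindsight
        else:
            score -= LOSS_AVERSION             # loss, felt disproportionately
    return score

# Expected score per decision: 0.55 * 0.8 - 0.45 * 2.25 = -0.57,
# so the reviewer's opinion of a genuinely skilled expert drifts negative.
print(reviewer_score(250))
```

Under these assumptions the expected score per decision is negative, so over repeated evaluations the reviewer concludes that an expert with a real 5 per cent edge “doesn’t know anything”.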
It gets worse. Let’s assume that the client, consultant, committee or board knows less about an investment opportunity than the expert fund manager they’ve hired. This may not always be the case, but it’s usually a safe assumption. Otherwise, why hire the expert?
The information asymmetry means that the expert may suffer the effect of the curse of knowledge. They may act in a certain way because they are convinced that it’s in the client’s best interest. But the client, who we’re assuming is less informed, may not see it that way.
An example of this is a fund manager reducing investment risk. The manager, who we assume has expert knowledge, sees that the risk/return equation is now unfavourable. Consequently, they reduce risk, perhaps allocating part of the portfolio to cash or using derivatives to hedge.
The fund manager imagines that the client would be pleased by their prudence. Instead, the client is really only worried about their returns diverging from peers or the benchmark-relative under-performance created by the fund manager’s actions. This difference in knowledge is then exacerbated by the hindsight bias if the fund manager’s caution proves to be unwarranted.
The illusion of transparency means we’re also at risk of drawing incorrect inferences about the motives behind other people’s decisions. For example, committees and boards may assume that management and staff or the fund managers that they hire understand their risk and return preferences. This may not always be the case.
Groups such as committees and boards are at risk of social loafing, where a smaller subset of individuals make the decisions while the rest of the group just goes along for the ride. Social loafing combined with the illusion of transparency could create a situation where the active members of the group believe that the loafers share their preferences and agree with their decisions when in fact they don’t.
Luck vs Skill
The attribution of outcomes to luck or skill is also affected by the hindsight bias. Attribution depends on whether or not the information acquired with the benefit of hindsight makes the earlier decision seem easy or hard. The attribution algorithm works like this:
- Does the new information make the original decision seem obvious, inevitable or easy? If so, attribute the outcome to luck.
- Does the new information make the original decision seem less obvious, uncertain or difficult? If so, attribute the outcome to skill.
In reality, neither judgement is correct. Whether or not the information seems trivial or complex is irrelevant because the information didn’t exist at the time the decision was made!
Doctors practice defensive medicine because they anticipate hindsight bias, the curse of knowledge and the illusion of transparency. They are incentivised to act to decrease their information disadvantage vs their future reviewer. Doctors do this in one of two ways:
- They order more tests or medical procedures to substitute for the information that their reviewer will receive in the future.
- They avoid vague information that can later be reinterpreted with the benefit of hindsight as clear information.
Institutional investors are incentivised to practice defensive investing. For example, using an investment consultant could be interpreted as an attempt to substitute an expert opinion for future information. Copying larger, better resourced and ostensibly more sophisticated peers is another way to substitute for information.
Tracking a benchmark is a way to avoid vague information. The performance of individual investments can vary significantly. This variance can be managed by not straying too far from the benchmark.
Combating Information Projection
What can investors do to combat the effects of information projection? A first step is to use a decision journal: write down, in real time, all of the factors that contributed to a decision. Review these notes regularly. A written record makes it impossible to re-write history with the benefit of hindsight. There’s a reason why so many successful investors talk about journals – they work!
Decision making groups should use committee papers and minutes as a decision journal. For example, if a committee or a board is reviewing a decision, they should review the analysis and recommendations that went into the original decision. This should include an honest assessment of what’s changed.
The very fact that decisions are reviewed and changed should cause decision-makers to pause and reflect. Nobody invests in what they believe, at that time, will be a money-losing investment. Every new investment starts out as a winner in the mind of the investor(s) that chose it. And yet we all end up reviewing investment decisions that didn’t work. This is where the temptation to rewrite history is strongest and where a decision journal is most powerful.
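As a rough sketch, a decision-journal entry might capture the fields below. This layout is one possible structure I’m assuming for illustration, not a standard:

```python
from dataclasses import dataclass, field
from datetime import date

@dataclass
class DecisionEntry:
    """One journal entry, written at decision time so hindsight
    can't rewrite it later."""
    decided_on: date
    decision: str
    thesis: str                  # why we expect this to work
    factors: list = field(default_factory=list)  # evidence considered
    expected_outcome: str = ""   # what "right" looks like, stated up front
    risks: list = field(default_factory=list)    # what could go wrong

# Hypothetical example entry:
entry = DecisionEntry(
    decided_on=date(2019, 3, 1),
    decision="Reduce equity exposure by 10%",
    thesis="Risk/return looks unfavourable at current valuations",
    factors=["stretched valuations", "tightening liquidity"],
    expected_outcome="Smaller drawdown than benchmark in a sell-off",
    risks=["underperform peers if the rally continues"],
)
```

Reviewing a decision then means re-reading the entry as written, not reconstructing the reasoning from memory after the outcome is known.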
Cognitive diversity is also important. This involves more than selecting a group of individuals that are different. To work, cognitive diversity requires (a hat tip to Michael Mauboussin for this framework):
- People who think differently
- Skin in the game
- A mechanism to ensure that divergent views are shared with the group
There has to be decision-making continuity. It’s easier to succumb to information projection when the people making decisions change. The illusion of transparency means that decision makers are at risk of making inaccurate inferences about previous decision makers. They are cursed with knowledge that earlier decision makers didn’t have (i.e. knowing the outcome). This is another reason why keeping a decision journal is so important. It preserves the history of a decision for future decision makers.
Decision reviewers such as clients, consultants, committees and boards share some of the responsibility for defensive investing. They have a responsibility to fight against hindsight bias, the curse of knowledge and the illusion of transparency. Otherwise, they quickly create an environment typified by defensive investing.
Sadly, this will be my last monthly column for i3 Insights. I want to thank you, the reader, for sharing this experience with me. Over the last two years we’ve explored a wide range of topics. Writing and podcasting with i3 has been so much fun. I’ve met so many interesting people and been exposed to all sorts of opportunities, all thanks to i3.
I owe a huge debt of gratitude to Teik and Wouter for giving me this opportunity. You believed in me and my ideas. Thank you!
Late last year, I launched Guiscard Capital together with six partners. We launched our first investment strategy, Best Ideas, on 1st March 2019 and so far we’ve delivered for our clients, who include high-net-worth investors, family offices, charities and endowments.
Guiscard Capital is growing and is now at a stage where I need to give it more time and attention. We’re off to a great start but more work needs to be done.
Thank you to all of the people that have expressed their appreciation for the content that I’ve created together with i3. I really appreciate your feedback and comments. It’s been an honour to share my ideas with you.