Tail risks
Capital Thinking • Issue #695
Italian psychologist Massimo Piattelli-Palmarini was once asked why people keep making the same mistakes.
He said:
Inattention, distraction, lack of interest, poor preparation, genuine stupidity, timidity, braggadocio, emotional imbalance, ideological, racial, social or chauvinistic prejudices, and aggressive or prevaricatory instincts.
Let me add some more:
Common Causes of Very Bad Decisions
Morgan Housel | The Collaborative Fund Blog:
Incentives can tempt good people to push the boundaries farther than they’d ever imagine.
Financial boundaries, moral boundaries, all of them. It’s hard to know what you’ll consider doing until someone dangles a huge reward in your face, and underestimating how adjustable the boundaries can become when rewards rise is a leading cause of terrible decisions.
Tribal instincts reduce the ability to challenge bad ideas because no one wants to get kicked out of the tribe.
Tribes are everywhere – countries, states, parties, companies, industries, departments, investment styles, economic philosophies, religions, families, schools, majors, credentials.
Everyone loves their tribe because there’s comfort in knowing other people who understand your background and share your goals. But tribes have their own rules, beliefs, and ideas.
Some of them are terrible. But they remain supported because no one wants to argue with a tribe that’s become part of their identity.
So people either willingly nod along with bad ideas, or become blinded by tribal loyalty to how bad the ideas are to begin with.
Ignoring or underestimating the full range of potential consequences, especially tail events that seem rare but have catastrophic effects.
The most comfortable way to think about risk is to imagine a range of potential consequences that don’t seem like a big deal. Then you feel responsible, like you’re paying attention to risk, but in a way that lets you remain 100% confident and optimistic.
The problem with low-probability tail risks is that they’re so rare you can get away with ignoring them 99% of the time. The other 1% of the time they change your life.
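A back-of-the-envelope sketch of that point (the payoffs and the 1% probability below are hypothetical numbers chosen for illustration, not figures from the post): a risk you can ignore 99% of the time can still dominate the arithmetic once its cost is large enough.

```python
# Hypothetical payoffs: the point is the shape of the math, not the numbers.
p_tail = 0.01           # the "1% of the time" catastrophic event
normal_gain = 1_000     # payoff in an ordinary period
tail_loss = -500_000    # payoff when the rare event finally hits

expected_value = (1 - p_tail) * normal_gain + p_tail * tail_loss
print(f"Expected value per period: {expected_value:,.0f}")   # -4,010

# And over a long enough run, "rare" stops meaning "never":
periods = 30
p_hit_at_least_once = 1 - (1 - p_tail) ** periods
print(f"Chance the 1% event hits at least once in {periods} periods: {p_hit_at_least_once:.0%}")   # ~26%
```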
Lots of little errors compound into something huge.
And the power of compounding is never intuitive. So it’s hard to see how being a little bit of an occasional jerk grows into a completely poisoned work culture. Or how a handful of small lapses, none of which seem bad on their own, ruins your reputation.
The Great Depression happened because a bunch of things that weren’t surprising (a stock crash, a banking panic, a bad farm year) occurred at the same time and fed on each other until they grew into a catastrophe.
It’s easy to ignore small mistakes, and even easier to miss how they morph into huge ones. So huge ones are what we get.
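A rough sketch of that multiplication (the 2% per-lapse cost and the “goodwill” framing are my own illustrative assumptions, not from the post):

```python
# Hypothetical: each small, "harmless" lapse costs about 2% of remaining goodwill.
per_lapse_decay = 0.02
remaining = 1.0

for lapse in range(1, 51):
    remaining *= 1 - per_lapse_decay
    if lapse in (5, 10, 25, 50):
        print(f"After {lapse:2d} small lapses: {remaining:.0%} of goodwill left")
# No single 2% hit feels like a big deal, but roughly half is gone before lapse 50.
```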
An innocent denial of your own flaws, caused by the ability to justify your mistakes in your own head in a way you can’t do for others.
When other people’s flaws are easier to spot than your own, it’s easy to assume you have few or no flaws, which makes the ones you do have more likely to cause problems.
Probability is hard. Black-and-white outcomes are more intuitive.
It’s not easy to put effort into something you’re only 60% sure about. But people want to try their best, so it’s more comfortable to assume that what you’re working on has a 100% chance of success.
That overconfidence breeds stubbornness, excessive risk-taking, and ignoring signs that you’re wrong until it’s too late.
Underestimating the need for room for error, not just financially but mentally.
Benjamin Graham once said:
“The purpose of the margin of safety is to render the forecast unnecessary.”
If you know how hard forecasting is, you know how important that quote is.
And room for error has two sides: whether you can survive an imperfect outcome financially without getting forced out, and whether you can survive it mentally without getting scared out.
Bad decisions happen when there’s only one acceptable version of the future.
Featured post photo by Becky Sell on Unsplash