The Black Swan

Nov 5, 2022

So far in my life, no book collection has left me with as many insights as Incerto by Nassim Nicholas Taleb. The second book in the collection, The Black Swan, explores the role of Black Swans, i.e., outliers that carry an extreme impact and have retrospective (though not prospective) predictability.

The core message is that a small number of Black Swans explain almost everything in our world, yet we tend to act as if they don’t exist.


How big is your anti-library?

Your library should contain as much of what you do not know as your financial means allow you to put there. The more you know, the larger your collection of unread books (i.e., your anti-library) should be. Why?

The human mind suffers from 3 ailments as it comes into contact with history:

  1. Illusion of understanding: we understand less than we realize and the world is more complicated than we believe.

  2. Retrospective distortion: history seems clearer in books than it is in reality. History does not crawl, it jumps between major, unexpected events.

  3. Platonification: we have a tendency to create categories and oversimplify, which leads us to misunderstand the fabric of the world and miss Black Swans.


Two worlds

In Mediocristan, when the sample is large, no single instance will significantly change the aggregate. Matters that belong to Mediocristan (subject to type 1 randomness): height, weight, calorie consumption, car accidents, mortality rates, IQ, income for non-scalable professions etc. Here we must endure the tyranny of the collective, the routine, the obvious, and the predicted.

In Extremistan, inequalities are such that one single observation can disproportionately impact the aggregate. Matters that belong to Extremistan (subject to type 2 randomness): wealth, book sales, earthquake damages, deaths in wars, size of planets, financial markets, economic data, income for scalable professions etc. Here we are subjected to the tyranny of the singular, the accidental, the unseen, and the unpredictable.

Extremistan is where most of the Black Swan action is.
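
In Mediocristan the largest single observation is a negligible fraction of the total, while in Extremistan it can dominate it. Here is a minimal sketch of that contrast, using a normal distribution as a stand-in for Mediocristan and a Pareto distribution for Extremistan (the distributions and parameters are my own illustrative choices, not the book's):

```python
# Compare how much the single largest observation contributes to the total
# in a thin-tailed (Mediocristan-like) vs heavy-tailed (Extremistan-like) sample.
import random

random.seed(42)
N = 100_000

# Mediocristan stand-in: heights drawn from a normal distribution (cm).
heights = [random.gauss(170, 10) for _ in range(N)]

# Extremistan stand-in: "wealth" drawn from a Pareto distribution (alpha = 1.2).
wealth = [random.paretovariate(1.2) for _ in range(N)]

for name, sample in [("heights (Mediocristan)", heights),
                     ("wealth (Extremistan)", wealth)]:
    share = max(sample) / sum(sample)
    print(f"{name}: largest observation is {share:.4%} of the total")

# Typical outcome: the tallest person is a vanishing fraction of total height,
# while a single lucky draw can account for a sizeable share of total wealth.
```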


The turkey problem

Consider a turkey that is fed every day. Every single feeding strengthens its belief that it is the general rule of life to be fed every day by friendly humans looking out for its best interests. On the day before Thanksgiving, something unexpected will happen to the turkey.

The feeling of safety reaches its maximum just when the risk is at its highest.

From the standpoint of the turkey, the non-feeding of the 1001st day is a Black Swan. For the butcher, it is not, since its occurrence is not unexpected. Hence, the Black Swan is a sucker’s problem: it occurs relative to your expectation.
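
To make the turkey's inductive error concrete, here is a small sketch (my illustration, not the book's) that models its confidence with Laplace's rule of succession: after n uninterrupted feedings, the naive estimate of being fed again tomorrow is (n + 1) / (n + 2).

```python
# Model the turkey's growing confidence with Laplace's rule of succession.
def naive_confidence(days_fed: int) -> float:
    """Naive inductive estimate that tomorrow looks like every past day."""
    return (days_fed + 1) / (days_fed + 2)

for day in (1, 10, 100, 1000):
    print(f"after day {day:>4}: P(fed tomorrow) ~ {naive_confidence(day):.3f}")

# After day 1000 the estimate is ~0.999: confidence peaks precisely
# on the eve of Thanksgiving, the day the rule breaks.
```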

It is extremely convenient for us to assume that we live in Mediocristan. Such an assumption magically drives away the problem of induction, i.e., the difficulty of generalizing from available information, or of learning from the past, the known, and the seen.

With tools and fools, anything can be easy to find. Anyone can find past instances that corroborate a theory and treat them as evidence. A series of corroborative facts isn’t necessarily evidence.

Seeing a million white swans does not confirm the nonexistence of black swans. However, seeing one black swan certifies that not all swans are white!

After all, we are not naive enough to believe that someone will be immortal because we have never seen him die.


The linear and the nonlinear

Humans have a hunger for rules because we need to reduce the dimension of matters so they can get into our heads. The more you summarize, the more order you put in, the less randomness. Hence, the same condition that makes us simplify pushes us to think that the world is less random than it actually is. Black Swans are what we leave out of simplification.

Our intuitions are not cut out for nonlinearities. Consider life in a primitive environment, where process and result are closely connected. Our mental apparatus is designed for causality. Yet the world is more nonlinear than we think. Linear progression, a Platonic idea, is not the norm.

Some matters that belong to Extremistan are extremely dangerous but do not appear to be beforehand, since they hide and delay their risks — so suckers think they are “safe”. It is indeed the property of Extremistan to look less risky, in the short run, than it really is.

Losers in history don’t write histories of their experiences. Hence, the neglect of silent evidence is endemic, particularly in activities that are plagued with winner-take-all attributes.

Now consider the cemetery. The graveyard of failed persons will be full of people who shared the following traits: courage, risk taking, optimism, etc. Just like the population of millionaires. There may be some difference in skills, but what truly separates the two groups is, for the most part, plain luck.

Many successful people will try to convince you that their achievements couldn’t be accidental. In these situations, use the reference point argument: don’t compute odds from the vantage point of the winner but from all those who started in the cohort.
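
A quick way to see the reference point argument is to simulate a cohort whose members succeed or fail by pure chance. The sketch below uses a coin-flip "track record" with hypothetical numbers of my own choosing:

```python
# Survivorship sketch: a cohort of "traders" who win or lose purely by coin flip.
import random

random.seed(7)
cohort = 10_000
rounds = 10

survivors = cohort
for year in range(rounds):
    # Each survivor has a 50% chance of a "good year"; losers drop out.
    survivors = sum(1 for _ in range(survivors) if random.random() < 0.5)

print(f"Started with {cohort}, still 'winning' after {rounds} years: {survivors}")
# Roughly cohort / 2**rounds (about 10 people) remain, by luck alone. Judged from
# the survivors' vantage point they look skilled; judged from the full cohort,
# their track record is exactly what chance predicts.
```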

We are explanation-seeking animals who tend to think that everything has an identifiable cause and grab the most apparent one as the explanation.

The ludic fallacy refers to the fact that the attributes of real life uncertainty have little connection to the sterilized ones encountered in exams and games.

Those who spend too much time with their noses glued to maps will tend to mistake the map for the territory.


We just can’t predict

The world is far, far more complicated than we think, which is not a problem, except when most of us don’t know it.

Epistemic arrogance bears a double effect: we overestimate what we know, and underestimate uncertainty, by reducing the space of the unknown.

Genuine experts exist only in some professions. Generally, things that don't move have experts, while things that move have non-experts (empty suits).

Experts who tend to be experts: astronomers, test pilots, chess masters, accountants, physicists, grain inspectors, soil judges…

Experts who tend not to be experts: stockbrokers, college admission officers, political scientists, intelligence analysts, financial forecasters…

Professions that deal with the future and base their studies on the non-repeatable past have an expert problem.

The problem with empty suits is that they don’t know what they don’t know.

No matter what anyone tells you, it is a good idea to question the error rate of an expert’s procedure. Do not question his procedure, only his confidence.

What matters is not how often you are right, but how large your cumulative errors are.

We humans are the victims of an asymmetry in the perception of random events. We attribute our successes to our skills, and our failures to external events outside our control, namely to randomness.

Randomness is just unknowledge. The world is opaque and appearances fool us.


Bell curves in the wrong places

Gaussian bell curve variations face a headwind that makes probabilities drop at a faster and faster rate as you move away from the mean (unlike Mandelbrotian variations).

Measures of uncertainty that are based on the bell curve disregard the possibility, and the impact, of sharp jumps and are, therefore, inapplicable to Extremistan.

The danger is that one can produce data “corroborating” that the underlying process is Gaussian by finding periods that do not have rare events.

We can only use the Gaussian approach in variables for which there is a rational reason for the largest not to be far away from the average.

By not understanding this, we run the biggest risk of all: we handle matters from Extremistan but treat them as if they belonged to Mediocristan, as an “approximation”.

In the last 50 years, the 10 most extreme days in the financial markets represent half the returns. If the world of finance were Gaussian, a single crash of more than 20 standard deviations would be expected only once in several billion lifetimes of the universe…
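
To put numbers on how violently the Gaussian suppresses large deviations, here is a back-of-the-envelope sketch; the power-law tail is a crude Extremistan stand-in and its exponent is my own illustrative choice:

```python
# How likely is a k-standard-deviation move under a Gaussian vs a power-law tail?
import math

def gaussian_tail(k: float) -> float:
    """P(Z > k) for a standard normal, via the complementary error function."""
    return 0.5 * math.erfc(k / math.sqrt(2))

def power_law_tail(k: float, alpha: float = 3.0) -> float:
    """P(X > k) for a Pareto-style tail with exponent alpha (crude Extremistan stand-in)."""
    return k ** (-alpha)

for k in (5, 10, 20):
    print(f"{k:>2}-sigma move: Gaussian ~ {gaussian_tail(k):.2e}, "
          f"power law (alpha=3) ~ {power_law_tail(k):.2e}")

# The Gaussian probability of a 20-sigma move is on the order of 1e-89,
# effectively "never", while a power-law tail still assigns it a small but
# non-negligible probability. This is why Gaussian tools break in Extremistan.
```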

Many people do not understand the elementary asymmetry involved: you need one single observation to reject the Gaussian, but millions of observations will not fully confirm the validity of its application.

Why? Because the Gaussian bell curve disallows large deviations, but tools of Extremistan, the alternative, do not disallow long quiet stretches.


How to get even with the Black Swan

Let us separate the world in two categories. Some people are like the turkey, exposed to a major blowup without being aware of it, while others play reverse turkey, prepared for big events that might surprise others.

Knowing that you cannot predict does not mean that you cannot benefit from unpredictability. Be prepared and maximize the serendipity around you.

Barbell strategy: be as hyper-conservative and hyper-aggressive as possible instead of being mildly aggressive or conservative. Make sure to have plenty of small bets.

Be aggressive when you can gain exposure to positive Black Swans and very conservative when you are under the threat of negative Black Swans. Moreover, be human. Do not try to avoid predicting, just be a fool in the right places. Know how to rank beliefs not according to their plausibility but by the harm they may cause.
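
As a toy illustration of the barbell's asymmetry (the split and payoffs are my own numbers, not Taleb's), consider capital divided 90/10 between an instrument assumed to preserve capital and a basket of speculative bets:

```python
# Barbell sketch: bounded downside, open-ended upside.
def barbell_outcome(capital: float, safe_fraction: float, speculative_multiplier: float) -> float:
    """Portfolio value if the speculative slice ends up worth `speculative_multiplier` times its cost."""
    safe = capital * safe_fraction              # assumed to simply preserve capital
    speculative = capital * (1 - safe_fraction)
    return safe + speculative * speculative_multiplier

capital = 100_000
scenarios = [(0.0, "total wipeout of the bets"),
             (1.0, "nothing happens"),
             (50.0, "one bet catches a positive Black Swan")]
for multiplier, label in scenarios:
    value = barbell_outcome(capital, safe_fraction=0.9, speculative_multiplier=multiplier)
    print(f"{label}: {value:,.0f}")

# Worst case: you lose at most the 10% speculative slice.
# Best case: the small exposure to positive Black Swans dominates the outcome.
```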

Be skeptical about confirmation — though only when errors are costly — not about disconfirmation.

Worry less about small failures, more about large. Moreover, don’t worry about things you can’t do anything about.


Learning from Mother Nature

Mother Nature likes redundancy. It allows you to survive under adversity thanks to the availability of spare parts. The opposite of redundancy is naive optimization and debt. An economist would find it inefficient to maintain two lungs and two kidneys.

Mother Nature does not like anything too big. It does not limit the interactions between entities; it just limits the size of its units.

Mother Nature does not like too much connectivity. More connected environments are more scalable than less connected ones, allowing the biggest to get even bigger at the expense of the smallest.

The organism with the largest number of secondary uses is the one that will gain the most from environmental randomness and epistemic opacity.

The idea is to let human mistakes and miscalculations remain confined, and to prevent their spreading through the system, as Mother Nature does.


Phronetic rules

  • Have respect for time and non-demonstrative knowledge (Mother Nature).

  • Avoid optimization; learn to love redundancy. Thus, avoid debt and overspecialization.

  • Avoid prediction of small-probability payoffs.

  • Beware the “atypicality” of remote events. Past shortfalls don’t predict subsequent shortfalls, so we don’t know exactly what to stress-test for.

  • Avoid some risk metrics. Conventional metrics, based on Mediocristan, adjusted for large deviations, don’t work.

  • Do not confuse absence of volatility with absence of risk. Conventional metrics using volatility as an indicator of stability fool us, because the evolution into Extremistan is marked by a lowering of volatility — and a greater risk of big jumps.


Principles for a society robust to Black Swans

  • What is fragile should break early, while it’s still small.

  • No socialization of losses and privatization of gains.

  • People who drove a bus blindly (and crashed it) should never be given a new bus.

  • Don’t let someone with an “incentive bonus” manage a nuclear plant or financial risks.

  • Compensate complexity with simplicity.

  • Do not give children dynamite sticks, even if they come with a warning label.

  • Do not give an addict more drugs if he has withdrawal pains. Using leverage to cure problems of over-leverage is not homeopathy, it’s denial.
