Nassim Taleb in the New York Times: Black Swan Outliers

Nassim Taleb offers the type of statistical thinking needed to win as a trader. It also happens to be the right way to analyze any unexpected event in life.

Learning to Expect the Unexpected


By Nassim Taleb
The New York Times, April 8, 2004

The 9/11 commission has drawn more attention for the testimony it has gathered than for the purpose it has set for itself. Today the commission will hear from Condoleezza Rice, national security adviser to President Bush, and her account of the administration’s policies before Sept. 11 is likely to differ from that of Richard Clarke, the president’s former counterterrorism chief, in most particulars except one: it will be disputed. There is more than politics at work here, although politics explains a lot. The commission itself, with its mandate, may have compromised its report before it is even delivered. That mandate is “to provide a ‘full and complete accounting’ of the attacks of Sept. 11, 2001, and recommendations as to how to prevent such attacks in the future.”

It sounds uncontroversial, reasonable, even admirable, yet it contains at least three flaws that are common to most such inquiries into past events. To recognize those flaws, it is necessary to understand the concept of the “black swan.” A black swan is an outlier, an event that lies beyond the realm of normal expectations. Most people expect all swans to be white because that’s what their experience tells them; a black swan is by definition a surprise. Nevertheless, people tend to concoct explanations for them after the fact, which makes them appear more predictable, and less random, than they are. Our minds are designed to retain, for efficient storage, past information that fits into a compressed narrative. This distortion, called the hindsight bias, prevents us from adequately learning from the past.

Black swans can have extreme effects: just a few explain almost everything, from the success of some ideas and religions to events in our personal lives. Moreover, their influence seems to have grown in the 20th century, while ordinary events – the ones we study and discuss and learn about in history or from the news – are becoming increasingly inconsequential.

Consider: How would an understanding of the world on June 27, 1914, have helped anyone guess what was to happen next? The rise of Hitler, the demise of the Soviet bloc, the spread of Islamic fundamentalism, the Internet bubble: not only were these events unpredictable, but anyone who correctly forecast any of them would have been deemed a lunatic (indeed, some were). This accusation of lunacy would have also applied to a correct prediction of the events of 9/11 – a black swan of the vicious variety.

A vicious black swan has an additional elusive property: its very unexpectedness helps create the conditions for it to occur. Had a terrorist attack been a conceivable risk on Sept. 10, 2001, it would likely not have happened. Jet fighters would have been on alert to intercept hijacked planes, airplanes would have had locks on their cockpit doors, airports would have carefully checked all passenger luggage. None of that happened, of course, until after 9/11.

Much of the research into humans’ risk-avoidance machinery shows that it is antiquated and unfit for the modern world; it is made to counter repeatable attacks and learn from specifics. If someone narrowly escapes being eaten by a tiger in a certain cave, then he learns to avoid that cave. Yet vicious black swans by definition do not repeat themselves. We cannot learn from them easily. All of which brings us to the 9/11 commission. America will not have another chance to hold a first inquiry into 9/11. With its flawed mandate, however, the commission is in jeopardy of squandering this opportunity.

The first flaw is the error of excessive and naïve specificity. By focusing on the details of the past event, we may be diverting attention from the question of how to prevent future tragedies, which are still abstract in our minds. To defend ourselves against black swans, general knowledge is a crucial first step. The mandate is also a prime example of the phenomenon known as hindsight distortion. To paraphrase Kierkegaard, history runs forward but is seen backward. An investigation should avoid the mistake of overestimating cases of possible negligence, a chronic flaw of hindsight analyses. Unfortunately, the hearings show that the commission appears to be looking for precise and narrowly defined accountability.

Yet infinite vigilance is not possible. Negligence in any specific case needs to be compared with the normal rate of negligence for all possible events at the time of the tragedy – including those events that did not take place but could have. Before 9/11, the risk of terrorism was not as obvious as it seems today to a reasonable person in government (which is part of the reason 9/11 occurred). Therefore the government might have used its resources to protect against other risks – with invisible but perhaps effective results.

The third flaw is related. Our system of rewards is not adapted to black swans. We can set up rewards for activity that reduces the risk of certain measurable events, like cancer rates. But it is more difficult to reward the prevention (or even reduction) of a chain of bad events (war, for instance). Job-performance assessments in these matters are not just tricky, they may be biased in favor of measurable events. Sometimes, as any good manager knows, avoiding a certain outcome is an achievement.

The greatest flaw in the commission’s mandate, regrettably, mirrors one of the greatest flaws in modern society: it does not understand risk. The focus of the investigation should not be on how to avoid any specific black swan, for we don’t know where the next one is coming from. The focus should be on what general lessons can be learned from them. And the most important lesson may be that we should reward people, not ridicule them, for thinking the impossible. After a black swan like 9/11, we must look ahead, not in the rear-view mirror.

Nassim Nicholas Taleb, the founder of a risk research and trading firm, is the author of “Fooled by Randomness: The Hidden Role of Chance in Life and in the Markets.”

Trend Following Products

Review trend following systems and training:

Michael Covel Trend Following Products
