The Black Swan: The Impact of the Highly Improbable
by Nassim Nicholas Taleb

Random House, New York, 2007.
ISBN 978-1-4000-6351-2. USD 16.17 (amazon.com)

Reviewed by Richard Austin, May 23, 2008

This is not a book about information security; it is a charming romp through the world of risk assessment (guided by a successful trader from the "Chicago Pits"). Taleb's writing style is pithy, whimsical and full of quotable barbs. The core concept is that of the "Black Swan", an event so rare as to be unpredictable, with high consequences when it occurs, but which, in retrospect, will be explainable ("retroactive explainability"). That final point deserves some emphasis: after a Black Swan occurs, it is easy to look backward and see all the signs of its approach NOW THAT YOU KNOW WHAT HAPPENED. Taleb calls this the problem of "silent evidence": there were actually many signs of what could have become other Black Swans, but our search narrows to only those signs that predict the one that actually happened, rendering the other indications silent. The message is that such retroactive explanation may be of limited use in predicting any future Black Swans.

He provides a very useful distinction between the two types of events encountered in real life: those that are fairly predictable and those that tend to come as a big (and sometimes unpleasant) surprise. He likens the predictable events to the mythical land of Mediocristan, a place where probability distributions are largely Gaussian (I refuse to say "normal"), the mean is a good predictor of reality most of the time, and deviations decay nicely off into the tails of the distribution. The much less predictable events belong to Extremistan, where the mean is largely meaningless and the tails of the distribution are fat. He makes the very valid point that much of the real world lives on the frontier between the two and that humans are notoriously bad at recognizing when they have left the fringes of Mediocristan and wandered into the wilds of Extremistan.
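To make the contrast concrete, here is a quick simulation sketch (mine, not Taleb's, with arbitrary parameters) in Python. It asks a simple question of a thin-tailed Gaussian sample and a fat-tailed Pareto sample: what share of the sample's total does the single largest observation account for?

    # Toy contrast between Mediocristan (Gaussian) and Extremistan
    # (fat-tailed Pareto): how much of the sample total does the
    # single largest observation account for?
    import random

    random.seed(42)
    N = 100_000

    # Mediocristan: think heights -- Gaussian, mean 170, sd 10.
    gaussian = [random.gauss(170, 10) for _ in range(N)]

    # Extremistan: think wealth -- Pareto with tail index alpha = 1.1
    # (the closer alpha is to 1, the fatter the tail).
    pareto = [random.paretovariate(1.1) for _ in range(N)]

    for name, sample in [("Mediocristan (Gaussian)", gaussian),
                         ("Extremistan (Pareto)", pareto)]:
        share = max(sample) / sum(sample)
        print(f"{name}: largest observation is {share:.3%} of the total")

In the Gaussian sample the largest draw is a vanishing fraction of the total; in the Pareto sample a single draw can dominate it, which is exactly why the mean stops being a useful summary once you have wandered into Extremistan.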

Being somewhat of an academic, I was amused (and stung) by his definition of the "Ludic Fallacy": "the attributes of uncertainty we face in real life have little connection to the sterilized ones we encounter in exams and games" (p. 127). Most information security professionals can definitely identify with the "Nerd effect", the "mental elimination of off-model risks, or focusing on what you know" (p. 151).

Taleb suggests that rather than attempting to imitate the risk assessment processes of the financial markets, we might want to take a closer look at how military planners assess and manage risks (e.g., invest in preparedness rather than prediction).

This book is a good read that will challenge quite a few of our assumptions about how one should approach the process of assessing and managing risk. While there may not be a lot of "solution advice", there are plenty of broad hints as to where the way forward might lie.

Before retiring, Richard Austin was the storage network security architect at a Fortune 25 company and currently earns his bread and cheese as an itinerant university instructor and cybersecurity consultant. He welcomes your thoughts and comments at rda7838 at Kennesaw dot edu