Adonis Diaries

Posts Tagged ‘man-made complex systems’

Antifragile: Things That Gain from Disorder (Incerto), by Nassim Nicholas Taleb

Nassim Nicholas Taleb, the bestselling author of The Black Swan and one of the foremost thinkers of our time, reveals how to thrive in an uncertain world.

I have reviewed and expanded on the Black Swan theory in several articles, including this one (reposted below): https://adonis49.wordpress.com/2011/06/01/part-1-black-swan-model-can-rare-catastrophic-events-of-man-made-systems-be-controlled/

Just as human bones get stronger when subjected to stress and tension, and rumors or riots intensify when someone tries to repress them, many phenomena in life benefit from stress, disorder, volatility, and turmoil.

What Taleb has identified and calls “antifragile” is that category of things that not only gain from chaos but need it in order to survive and flourish.

In The Black Swan, Taleb showed us that highly improbable and unpredictable events underlie almost everything about our world.

In Antifragile, Taleb stands uncertainty on its head, making it desirable, even necessary, and proposes that things be built in an antifragile manner.

The antifragile is beyond the resilient or robust. The resilient resists shocks and stays the same; the antifragile gets better and better.

The antifragile is immune to prediction errors and protected from adverse events.
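
One way Taleb makes this precise elsewhere is through convexity: an antifragile payoff is convex, so by Jensen's inequality it gains on average from volatility, while a fragile (concave) payoff loses from it. A minimal sketch (my illustration, with made-up payoff functions, not text from the book):

```python
import random

random.seed(0)

def convex(x):    # antifragile-like (convex) response to a shock
    return x ** 2

def concave(x):   # fragile-like (concave) response to a shock
    return -(x ** 2)

# Zero-mean shocks: volatility with no directional trend.
shocks = [random.gauss(0.0, 1.0) for _ in range(100_000)]

avg_convex = sum(convex(x) for x in shocks) / len(shocks)
avg_concave = sum(concave(x) for x in shocks) / len(shocks)

print(f"payoff with no volatility  : {convex(0.0):.2f}")  # 0.00
print(f"convex payoff under noise  : {avg_convex:.2f}")   # ~ +1.00 (gains)
print(f"concave payoff under noise : {avg_concave:.2f}")  # ~ -1.00 (loses)
```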

Why is the city-state better than the nation-state?

Why is debt bad for you, and why is what we call “efficient” not efficient at all?

Why do government responses and social policies protect the strong and hurt the weak?

Why should you write your resignation letter before even starting on the job?

How did the sinking of the Titanic save lives?

The book spans innovation by trial and error, life decisions, politics, urban planning, war, personal finance, economic systems, and medicine.

And throughout, in addition to the street wisdom of Fat Tony of Brooklyn, the voices and recipes of ancient wisdom, from Roman, Greek, Semitic, and medieval sources, are loud and clear.

Antifragile is a blueprint for living in a Black Swan world.

Nassim Nicholas Taleb said:

The most rewarding moment in an author’s career. Finally, I am no longer the author of several books. I am now the author of a single book in 4 volumes: INCERTO, plus technical companions.

Why does it matter? I don’t know, but it is a big, very big deal to see your work as a single, large, coherent, and self-contained unit, to which you keep adding pieces until, ultimately, you leave very few stones unturned.

Black Swan model: Can rare catastrophic events in complex man-made systems be controlled?

Note: The application of the Black Swan model to the “Arab” Spring revolts, the upheavals in southern Europe, and the financial crisis will be explained in a follow-up article.

Warning! Pay closer attention to the “predictable” but unexpected rare calamitous events!

“Black Swan” is a term coined after the discovery of actual black swans in Australia, centuries ago. People had firmly believed that all swans were white: a few observers might have seen a black swan but refused to identify it as a swan; and in the regions where black swans were a common sight, people had no idea that they were considered a rarity everywhere else, and might have been purchased for their weight in gold to be raised in zoos!

You know the adage: “If an event can occur, it will happen,” meaning that no matter how low the predicted probability of a rare event, it will strike “unexpectedly.” If there is a one-in-a-million chance of an asteroid smashing into Earth, an asteroid will eventually fall on our heads: asteroids did fall and transform Earth several times in the last four billion years.
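
To see why, a bit of arithmetic helps (my illustration, not the article's): if an event has probability p in any given year, the chance that it happens at least once in n years is 1 − (1 − p)^n, which tends to certainty as n grows. A minimal Python sketch:

```python
# Probability that a rare event occurs at least once over many trials,
# assuming independent trials with a fixed per-trial probability p.
def prob_at_least_once(p: float, n: int) -> float:
    return 1.0 - (1.0 - p) ** n

p = 1e-6  # "a chance in a million" per year
for years in (1_000, 100_000, 1_000_000, 10_000_000):
    print(f"{years:>10,} years: {prob_at_least_once(p, years):.4f}")
# Over geological time scales, the "rare" strike becomes near-certain:
#      1,000 years: 0.0010
#    100,000 years: 0.0952
#  1,000,000 years: 0.6321
# 10,000,000 years: 1.0000
```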

Just think of the even lower probability of “being who we are, as individuals.” You could just as naturally have been an inanimate object, a plant, another animal species, or been born somewhere else; you might not have survived birth, or not lived to be 5 years old…

The Black Swan theory states: “In complex systems, especially man-made complex systems, it is not feasible to comprehend all the interactions among the hundreds of variables affecting outcomes. In man-made systems, we have to allow for the natural fluctuations that are at work. The rare predicted calamitous events will strike unexpectedly, and we will fail to react adequately if we consciously avoid taking them consistently into consideration in our analyses and reports.”

The unexpected events cannot be analyzed like odds in card or casino games: human behavior, with its thousands of variations in moods, emotions, conventions, convictions, and personal experiences…, cannot be predicted the way games can.

Natural sciences such as engineering, physics, chemistry, architecture, astronomy, and planetary exploration… lie within the linear domain of thinking about life and the universe. The social and human sciences, epidemics, and economics… lie within far more complex domains, and the linear methods that mankind was trained to use for resolving problems and fluctuations cannot adequately be transposed to complex systems.

We are well equipped to predict lunar eclipses, but not stock movements or foreign political upheavals. It is NOT the “last grain of sand that crashed the structure or the bridge”: the last grain was the catalyst of the failure, not its cause. The fault lies in the design of the system, not in its components.

For example, the “subprime” was not the cause of the 2008 financial crisis: it was merely the latest catalyst among hazardous financial instruments. The cause was a faulty financial system that political decision-makers failed to redesign in due time. Doing so would have required courageous, determined positions to iron out serious problems that were growing out of proportion: risky behaviors in an unregulated system, and the instantaneous pouring of massive liquidity to “stabilize” a fragile, outmoded, and faultily designed system.

There is a trend of confusing catalysts with causes. The designers of a system do not necessarily share this confusion, but the political decision-makers and owners of these systems purposely confuse the general public when catastrophes strike. Two psychological biases are at the source of this confusion:

The first bias is the illusion that we can control volatility in man-made complex systems. For example, we focus on the “normal working” of a system and delete from our analyses and reports the minor fluctuations or rare events that occasionally occur. In a sense, if there are no variations, there is no information worth controlling. This tendency to feel comfortable dealing only with a “stable” system leads to forgetting the consequences of calamitous rare events.
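
A minimal simulation of this bias (my illustration, assuming a fat-tailed loss process; the distribution and numbers are not from the article): deleting the rare extreme observations from the report makes the system look much safer than it is.

```python
import random

random.seed(1)

# Pareto-distributed losses with tail index alpha = 1.5 (fat-tailed):
# most observations are small, but the rare extremes carry the risk.
alpha = 1.5
losses = [random.paretovariate(alpha) for _ in range(100_000)]

mean_all = sum(losses) / len(losses)

# "Clean" the report by deleting the top 1% of observations,
# i.e. the rare events nobody wants in the analysis.
trimmed = sorted(losses)[: int(0.99 * len(losses))]
mean_trimmed = sum(trimmed) / len(trimmed)

print(f"mean loss, full data      : {mean_all:.2f}")
print(f"mean loss, top 1% deleted : {mean_trimmed:.2f}")
# The trimmed report understates the average loss: much of the risk
# lives precisely in the rare events that were deleted.
```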

The second psychological bias is the illusion that acting on a factor is better than doing nothing and letting the system work out its fluctuations. For example, authorities think, or are pressured into thinking, that they were elected or appointed to act and react on any variation, instead of doing nothing when fluctuations are within the norm. Consequently, it is these very actions that usually exacerbate a system, driving it bad and out of proportion. For example, Alan Greenspan and later Ben Bernanke lowered the central bank’s interest rates to almost negative levels in order to “stabilize” a fragile, faulty financial system that needed major redesign.
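
This second bias can also be simulated (a sketch in the spirit of Deming's “funnel experiment”, my addition, not the article's): correcting every deviation that is pure noise makes outcomes more volatile, not less.

```python
import random

random.seed(7)

N = 100_000
noise = [random.gauss(0.0, 1.0) for _ in range(N)]

# Policy 1: do nothing; outcomes are just the natural fluctuations.
hands_off = noise

# Policy 2: after each outcome, shift the setpoint to cancel the last
# deviation, i.e. "act" on every within-norm fluctuation.
tampered = []
setpoint = 0.0
for eps in noise:
    outcome = setpoint + eps
    tampered.append(outcome)
    setpoint -= outcome  # overreaction to a pure-noise deviation

def variance(xs):
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / len(xs)

print(f"variance, hands-off: {variance(hands_off):.2f}")  # ~1.0
print(f"variance, tampered : {variance(tampered):.2f}")   # ~2.0
```

Intervening on noise roughly doubles the variance: the action itself exacerbates the fluctuations it was meant to suppress.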

The Fukushima disaster, the meltdown of three nuclear reactors, is a typical example. The earthquake and the tsunami were NOT the causes of the meltdown: they were the catalysts. The cause was a faultily designed, highly dangerous system for generating electricity, built in a region frequently exposed to severe earthquakes and tsunamis. The economic risk trade-off was computed for the normal functioning of a nuclear plant, and the consequences of a serious event striking were swept under the carpet for three decades.

The owner of the nuclear power plant and the government blamed natural phenomena as the causes and played down the lethal radiation exposure for over a month. Why? Better not to scare the people! What? Is it better to let people die peacefully than to give them the proper information to decide on their own plans of action?

It is normal for mankind to be wary of the volatile aspects of life. In the past, mankind managed to block out drastic fluctuations from its consciousness in order to survive: it figured out the major trends in hazards in order to foresee and adopt simple models it could control for administering and managing its life and the survival of the community.

A behavioral model should allow for normal fluctuations in behavior, so that reactions stay within normal realities.

Simple models have been replaced by complex ones, but within the old linear mentality and comprehension. You may understand a few interactions among three main variables, but once man-made designs insert hundreds of volatile factors into a system, we should no longer expect to have total control over it.
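
A back-of-the-envelope count (my addition) shows why the linear intuition breaks down: pairwise interactions alone grow quadratically with the number of variables, before even counting higher-order effects.

```python
from math import comb

# Number of pairwise interactions among k variables: k * (k - 1) / 2.
for k in (3, 10, 100, 500):
    print(f"{k:>3} variables -> {comb(k, 2):>7,} pairwise interactions")
#   3 variables ->       3 pairwise interactions
#  10 variables ->      45 pairwise interactions
# 100 variables ->   4,950 pairwise interactions
# 500 variables -> 124,750 pairwise interactions
```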

If we are not ready to design for reasonable fluctuations in a system, to take the problems of rare occurrences seriously, and to train ourselves to react to calamitous rare events, then it is wise to stick to simple systems that individual operators can understand and control.

A man-made system should not be designed to eliminate all the faults, misbehavior, and limitations of mankind, but to factor them in and to train operators to react adequately to these variations: the operator has to be constantly motivated to learn, to stay vigilant to minor fluctuations, and to comprehend the main interactions.

Note 1: Nassim Taleb, a mathematician, was a trader and worked for 20 years as a consultant to large investment banks in New York and London. He created Empirica LLC for trading. He is a professor of risk engineering at the Polytechnic Institute of New York University. Taleb published “Fooled by Randomness” and “The Black Swan: The Impact of the Highly Improbable.”

Note 2: Mark Blyth is a Scottish professor of international political economy at Brown University (Rhode Island). He published “Great Transformations: Economic Ideas and Institutional Change in the Twentieth Century”. A new book is to be released: “Austerity: The History of a Dangerous Idea”.

