Adonis Diaries


Is “Black Swan Theory” applicable to man-made systems?

Has anyone seen a swan (baja3) physically? In the flesh, or even flying or walking?  If you are asked “What is the color of a swan?”, I bet your answer is “White, obviously.”  Actually, black swans were identified in Australia centuries ago.  Is it possible to eventually identify a multicolored swan?

You might say that finding a black swan, or even a tribe of black swans, or a mixture of black and white swans, stands to reason, but is it feasible to have a green or blue swan?  You might respond that genetic engineering can produce whatever colored swan you desire as a pet…

Why do you think all of us believed that a swan must be white, and nothing but white?  Most of us have never seen a swan, except in pictures, movies, or documentaries; we might not even be able to tell a swan from a duck if the bird is not named…

Even nature, which changes slowly and whose trends can mostly be predicted, has the potential to surprise us with rare events, a few of them catastrophic.

How much more so for man-made systems: we have gotten into the habit of expecting frequent disasters from man-designed and man-made systems within a few years of their application and use by people…

The variability among living creatures and in the behaviors of users is a thousandfold greater than the variability in nature.  Wouldn’t you be appalled, in total disbelief, to hear a designer of systems claim that a product is definitely designed and manufactured to be entirely controlled and managed according to users’ satisfaction, safety, and health?

Design teams across many professions, such as scientists, engineers, psychologists, and legal professionals, are aware of two things:

First, there will be frequent minor malfunctions of the system, in terms of financial losses and safety and health casualties, but these malfunctions can be controlled and fixed.

Second, any system harbors rare catastrophic malfunctions that will eventually occur (doud al khal minho wa fih), and predicting these rare events is very challenging, beyond control and management.  When you hear of an economic-safety trade-off analysis of a system, bear in mind that the study concerns the number of casualties and the financial cost that the owners (more frequently the State, i.e., the taxpayers) will have to set aside for these calamitous eventualities.
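As a rough illustration of what such a trade-off study computes (the figures below are hypothetical, not drawn from any actual plant or report), the reserve for a rare calamity is simply its yearly probability multiplied by its cost if it strikes:

    # Hypothetical economic-safety trade-off arithmetic (invented numbers):
    # expected yearly reserve = P(catastrophic failure per year) * cost if it strikes
    p_per_year = 1e-4          # assumed yearly probability of the rare failure
    cost_if_struck = 50e9      # assumed cost in dollars (casualties, cleanup, losses)
    print(f"expected yearly set-aside: ${p_per_year * cost_if_struck:,.0f}")
    # -> expected yearly set-aside: $5,000,000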

The funny part is that:

First, no money is ever set aside by the private shareholders for these catastrophes, and the State or the taxpayers eventually cover the expenses.

Second, transparent and full disclosure to the general public is never disseminated widely, if it is published at all.

Third, the public and communities in most countries have no say in the design and decision-making processes of vast man-made systems.

Fourth, no man-made system has instituted an independent, specialized, and dedicated team responsible for gathering data and analyzing statistics on the various malfunctions.  Most malfunctions are barely reported, and serious hazardous events are swept under the carpet:  not read, never happened!

Do you know that the UN health agency (the WHO) is forbidden from collecting and reporting statistics on the consequences of nuclear disasters?  That the UN atomic agency (the IAEA) is not to share its statistics with other UN agencies concerned with the health and safety of the world’s population?

Note 1: Nassim Nicholas Taleb, a mathematician by training, wrote “The Black Swan: The Impact of the Highly Improbable” and “Fooled by Randomness” (published in French as “Le Hasard Sauvage”, i.e., “savage hazard”).  Taleb was initially trying to explain the financial crisis, since he is in the financial business.  The theory is sound and explains many fluctuations in man-made designs, for example in the international financial system.

Note 2: This post is a re-edited version of the first part of a lengthy article related to claims that the Black Swan Theory does not apply to the political/social structure in Lebanon.

Black Swan model: Can rare catastrophic events in complex man-made systems be controlled?

Note:  The application of the Black Swan model to the “Arab Spring” revolts, to southern Europe, and to the financial crisis will be explained in the follow-up article.

Warning! Pay closer attention to the “predictable” but unexpected rare calamitous events!

“Black Swan” is a term recalling the discovery of actual black swans, after people had firmly believed that all swans were white.  A few might have observed a black swan but refused to identify it as a swan; or, where black swans were a common sight, people had no idea that black swans were considered a rarity all over the world and might be purchased for their weight in gold to be raised in zoos!

You know the adage, “If an event can occur, it will happen”: no matter how low the predicted probability of a rare event, it will strike “unexpectedly.”  If there is a one-in-a-million chance of an asteroid smashing into Earth, an asteroid will eventually fall on our heads: asteroids did fall on and transform Earth several times over the last four billion years.
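A minimal sketch of the arithmetic behind the adage, using that same one-in-a-million figure: however small the per-trial probability, the chance of at least one occurrence approaches certainty as trials accumulate.

    # P(rare event occurs at least once in n independent trials) = 1 - (1 - p)^n
    p = 1e-6  # a "chance in a million" per trial
    for n in (10**5, 10**6, 10**7):
        print(f"n = {n:>10,}: P(at least once) = {1 - (1 - p)**n:.3f}")
    # n =    100,000: 0.095   n = 1,000,000: 0.632   n = 10,000,000: 1.000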

Just think of the even lower probability of “being who we are, as individuals.”  You could have been an inanimate object, a plant, another animal species, or born somewhere else; you might not have lived past birth, or survived to be 5 years old…

The Black Swan theory states: “In complex systems, especially man-made complex systems, it is not feasible to comprehend all the interactions among the hundreds of variables affecting outcomes. In man-made systems, we have to allow for the natural fluctuations that are at work.  The rare, predicted calamitous events will strike unexpectedly, and we will fail to react adequately if we consciously avoid taking them into consideration, consistently, in our analyses and reports.”

These unexpected events cannot be analyzed like odds in card games or casino games:  human behavior, with its thousands of sources of variability in moods, emotions, conventions, convictions, personal experiences…, cannot be predicted the way games can.
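A small simulation can make the contrast concrete (my own sketch, not the author’s): in a thin-tailed, casino-like world no single draw matters much, while in a fat-tailed world one rare draw can dominate everything observed so far.

    import random

    random.seed(7)
    n = 100_000
    thin = [abs(random.gauss(0.0, 1.0)) for _ in range(n)]  # casino-like, thin tail
    fat = [random.paretovariate(1.1) for _ in range(n)]     # fat tail, index ~1.1

    for name, sample in (("thin-tailed", thin), ("fat-tailed", fat)):
        # share of the grand total contributed by the single largest draw
        print(f"{name}: largest draw = {max(sample) / sum(sample):.2%} of the total")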

Natural sciences such as engineering, physics, chemistry, architecture, astronomy, planetary exploration… lie within the linear domain of thinking about life and the universe.  Social and human sciences, epidemics, economics… lie within far more complex domains, and the linear methods mankind was trained to use for resolving problems and fluctuations cannot adequately be transposed to complex systems.

We are well equipped to predict lunar eclipses, but not stock movements or foreign political upheavals.  It is NOT the “last grain of sand” that crashes the structure or the bridge:  the last grain is the catalyst of the failure, not the cause.  The fault is in the designed system, not in its components.

For example, the “subprime” loans were not the cause of the 2008 financial crisis: they were just the latest among the catalysts of hazardous financial tools.  The cause was a faulty financial system that political decision-makers failed to redesign in due time, a redesign that required courageous and determined positions to iron out serious problems growing out of proportion: risky behaviors, an unregulated system, and the instantaneous pouring of massive liquidity to “stabilize” a fragile, outmoded, and faultily designed system.

There is a tendency to confuse catalysts with causes.  The designers of a system do not necessarily share this confusion, but the political decision-makers and owners of the systems purposely confuse the general public when catastrophes strike.  Two psychological biases are at the source of confusing catalysts with causes:

The first bias is the illusion of our capacity to control volatility in man-made complex systems. For example, we focus on the “normal working” of a system and delete from our analyses and reports the minor fluctuations or rare events that occasionally occur; in a sense, if there are no variations, there is no information worth controlling.  This tendency to feel comfortable dealing only with a “stable” system leads to forgetting the consequences of calamitous rare events (a short sketch after the second bias below makes this concrete).

The second psychological bias is the illusion that acting on a factor is better than doing nothing and letting the system work out its fluctuations.  For example, authorities think, or are pressured into thinking, that they were elected or appointed to act and react to any variation, instead of doing nothing when fluctuations are within the norm.  Consequently, it is these very actions that usually exacerbate a system going bad, out of all proportion.  For example, Alan Greenspan, and later Ben Bernanke, lowered the central bank’s interest rates to almost negative levels in order to “stabilize” a fragile, faulty financial system that needed major redesign.
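The first of these biases is easy to make concrete with a hypothetical sketch (the loss figures are invented): once the rare extreme event is trimmed out of the reports, the average loss looks comfortably small and the tail risk vanishes from view.

    # Ten periods of losses, in arbitrary units; 500 is the rare calamity.
    losses = [1, 2, 1, 3, 2, 1, 2, 1, 2, 500]
    trimmed = [x for x in losses if x < 100]  # "delete the outlier" from the report

    print(f"mean loss, everything kept    : {sum(losses) / len(losses):.1f}")    # 51.5
    print(f"mean loss, rare event removed : {sum(trimmed) / len(trimmed):.1f}")  # 1.7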

The Fukushima disaster, the meltdown of three nuclear reactors, is a typical example.  It is NOT the earthquake and the tsunami that were the causes of the meltdown:  they were the catalysts.  The cause was a faultily designed system for generating electricity, one that is highly dangerous and was built in a region frequently exposed to severe earthquakes and tsunamis.  The economic risk trade-off covered only the normal functioning of a nuclear plant, and the consequences of a serious event striking were swept under the carpet for three decades.

The owner of the nuclear power plant and the government blamed natural phenomena as the causes and downplayed the lethal exposure to radiation for over a month.  Why?  It is better not to scare the people! What?  It is better to let people die peacefully than to give them the proper information to decide on their own plans of action?

It is normal for mankind to be wary of the volatile aspects of life.  In the past, mankind managed to block drastic fluctuations out of consciousness in order to survive:  people figured out the major trends in hazards in order to foresee events and adopt simple models they could control for administering and managing their lives and the survival of the community.

A behavioral model should allow for normal fluctuations in behavior, so that people can react within normal realities.

Simple models have been replaced by complex models, but within the old linear mentality and comprehension.  You may understand a few interactions among three main variables, but once man-made designs insert hundreds of volatile factors into a system, we should no longer expect to have total control over the complex system.
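One back-of-the-envelope way to see why linear intuition breaks down (my illustration, not the author’s): even counting only pairwise interactions, the number of couplings grows quadratically with the number of variables, and higher-order interactions grow faster still.

    # Pairwise interactions among k variables: k * (k - 1) / 2
    for k in (3, 10, 100, 500):
        print(f"{k:>3} variables -> {k * (k - 1) // 2:>7,} pairwise interactions")
    # 3 -> 3, 10 -> 45, 100 -> 4,950, 500 -> 124,750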

If we are not ready to design reasonable fluctuations into a system, to take seriously the problems of rare occurrences, and to be trained to react to calamitous rare events, then it is wiser to stick to simple systems that individual operators can understand and control.

A man-made system should not be designed to eliminate all the faults, ill behaviors, and limitations of mankind, but to factor them in, with operators trained to react adequately to these variations:  the operator has to be constantly motivated to learn, to stay vigilant to minor fluctuations, and to comprehend the main interactions.

Note 1: Nassim Taleb, a mathematician, was a trader and worked for 20 years as a consultant to large investment banks in New York and London. He created Empirica LLC for trading.  He is a professor of risk engineering at the Polytechnic Institute of New York University.  Taleb published “Fooled by Randomness” (“Le Hasard Sauvage” in French) and “The Black Swan:  The Impact of the Highly Improbable.”

Note 2: Mark Blyth is a Scottish professor of international political economy at Brown University (Rhode Island).  He published “Great Transformations: Economic Ideas and Institutional Change in the Twentieth Century”.  A new book is about to be released: “Austerity: The History of a Dangerous Idea”.

