Risk and rationality

20th February 2020


What is rationality? What is rational and what is irrational? Philosophers can debate this for eternity. The ongoing SARS-CoV-2 pandemic is forcing millions of people to rethink what is rational and what is not. Is the lockdown of cities and the strict quarantine a rational response? Some think it is not, or at least they thought so until they didn’t.

"Survival comes first, truth, understanding, and science later"

Nassim N. Taleb

There is a certain subjectivity attached to the notion of rational behaviour. What is rational can depend on situational conditions. Take, for example, drinking a couple of glasses of wine: if you are relaxing at home it is fine, even great and rather rational, but if you are about to drive your car it is not so fine, not at all great and rather irrational. This is often characterised as risky behaviour. Evidently, ‘risk’ and ‘rationality’ are strongly associated. However, risk and risk management are studied nowadays as practical skills in schools of business and finance, while rationality is studied in schools of philosophy. This duality is often the cause of much confusion, particularly for passionate advocates of one school of thought who largely ignore or undervalue the teachings of the other. It is often the case that the typical academic understands risk very well in theory but has never practised the art, or techne, of risk taking. Then there are many practitioners of risk, such as traders or insurers, who have a good grasp of the practicalities of risk but often lack the theoretical background to grasp risk beyond what their software output suggests. There are of course a few in both groups who have both extensive theoretical knowledge and a solid understanding of risk from personal experience in risk-taking activities. Those few, however, are the exception and not the rule.

People often confuse rational and irrational behaviour due to optimistic views, hidden risks or asymmetries of information. As Donald Rumsfeld suggested, there are known knowns, known unknowns, and unknown unknowns. Much of the scientific research taking place in research institutions around the world is concerned with investigating known unknowns; that is, the things we know that we do not yet know. Typically, theories will form testable hypotheses. Findings from these tests will fall within a range of known possibilities that will be interpreted by the theorists. However, cognitive biases play an important role in the interpretation of findings, particularly when the findings are somehow surprising; that is, when there were unknown unknowns affecting the outcome.

"There are known knowns. There are things we know that we know. There are known unknowns. That is to say, there are things that we now know we don't know. But there are also unknown unknowns. There are things we do not know we don't know"

Donald Rumsfeld

What is more, people with incomplete information will not only form erroneous opinions, but they will be more confident in those opinions than others with more complete information. Daniel Kahneman calls this “What You See Is All There Is”, or WYSIATI for short. He explains that “Much of the time, the coherent story we put together is close enough to reality to support reasonable action. However, I will also invoke WYSIATI to help explain a long and diverse list of biases of judgment and choice”. Kahneman and his colleague Amos Tversky spent years running experiments to finally conclude that the average person has a weak grasp of probabilities. He also suggests that people will tend to either ignore highly unlikely events or overweight them in their decision making. He explains that people’s decisions are often influenced by emotions and vivid descriptions, which result in overestimation or underestimation of the probabilities of rare events. “The probability of a rare event”, he explains, “is most likely to be overestimated when the alternative is not fully specified”. Additionally, he shows that people will erroneously think that a conjunction of events has a higher probability of occurring than one of the events alone. Robert Shiller has shown that even trained risk practitioners and company managers often do not perform any better than the average person.
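To make the conjunction point concrete, here is a minimal sketch in Python (the probabilities are invented purely for illustration): the joint probability of two events can never exceed the probability of either event alone, yet the more detailed, conjoint scenario often feels more plausible.

```python
# Illustrative sketch of the conjunction point; the probabilities are invented.
p_flood = 0.05               # P(A): a severe flood next year (assumed figure)
p_quake_given_flood = 0.10   # P(B given A): an earthquake causing that flood (assumed figure)

# P(A and B) = P(A) * P(B given A), so it can never exceed P(A).
p_conjunction = p_flood * p_quake_given_flood

print(f"P(flood)                = {p_flood:.3f}")
print(f"P(flood and earthquake) = {p_conjunction:.3f}")
assert p_conjunction <= p_flood   # the conjunction is always the less probable event
```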

Nassim N. Taleb has written extensively about risk-taking and uncertainty. One of his mantras is that when ruin (e.g. financial ruin or death) is a possibility, there is no meaningful cost-benefit analysis of risk taking, since repeated exposure to such risk will eventually end in ruin. Hence, taking such a risk would be deemed irrational. He gives an example to show the difference between smoking one cigarette and smoking repeatedly. He suggests that “Smoking a single cigarette is extremely benign, so a cost-benefit analysis would deem one irrational to give up so much pleasure for so little risk!”, explaining that “it is the act of smoking that kills, with a certain number of packs per year, tens of thousands of cigarettes – in other words, repeated serial exposure”. Ruin, he claims, is “indivisible and invariant to the source of randomness that may cause it” and “all it takes is a single event for the irreversible exit from among us”.
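Taleb’s argument reduces to a simple calculation: if every exposure carries even a tiny probability of ruin, the probability of surviving the whole series shrinks towards zero as exposures accumulate. Here is a minimal sketch, using an assumed per-exposure ruin probability rather than any real estimate:

```python
# Minimal sketch of the repeated-exposure argument.
# The per-exposure ruin probability is an assumed figure, not an estimate.
p_ruin_per_exposure = 0.001

for n in (1, 100, 1_000, 10_000, 100_000):
    # Surviving n exposures means avoiding ruin on every single one of them.
    p_survive = (1 - p_ruin_per_exposure) ** n
    print(f"exposures = {n:>7}   P(no ruin so far) = {p_survive:.4f}")

# The survival probability decays towards zero, which is why a cost-benefit
# analysis of a single exposure says nothing useful about the whole series.
```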

"Is selective paranoia “irrational” if those individuals and populations who don’t have it end up dying or extinct, respectively?"

Nassim N. Taleb

So how can we use this theoretical framework to examine the situation the world is facing with the SARS-CoV-2 pandemic? Looking at the illustration, I define four areas as A, B, C and D. Our everyday life takes place in area A, where most of what happens is likely or at least probable, and not so unthinkable. There are some moderate surprises and risks, but nothing too extreme. Most things happening in A will be well factored into the economy, politics and the financial markets. Almost everything happening in A would be rational and not so unexpected. If we increase the number of improbable events taking place in this world of our illustration, we begin to move from A into the territory of area B. As the number of improbable events increases, the riskiness of decision making, as well as the risk attached to those decisions, also increases. The probability of an event ranges from 0 to 1, where 1 denotes the certain event and 0 the impossible one. However, improbable does not equate with unthinkable. This is because if we can assign a probability to an event, even if this probability is 0, then we have at least thought of such an event occurring. In practice, a risk assessment will attempt to create scenarios for events of various probabilities, from low to high, and set out the response actions for each event occurring. The unthinkable event, on the other hand, lies beyond such calculations and is not part of the assessment. Essentially, as we move from A into the territory of area C, our thought process increasingly becomes the subject of philosophy and fiction, unless experience teaches us otherwise.
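In practice, that scenario exercise often takes the shape of a risk register like the toy example below (the events, probabilities and responses are invented for illustration). The key limitation is exactly the one described above: the truly unthinkable, area-D event never appears as a row, because nobody thought to write it down.

```python
# Toy risk register; every event, probability and response here is invented.
risk_register = [
    # (event,                    probability, planned response)
    ("minor supply delay",        0.30,       "use buffer stock"),
    ("key supplier fails",        0.05,       "switch to backup supplier"),
    ("regional power outage",     0.01,       "run on-site generators"),
    ("site destroyed by flood",   0.001,      "invoke disaster-recovery plan"),
]

for event, probability, response in risk_register:
    print(f"{event:<26} p = {probability:<6} -> {response}")

# Note what is missing: the area-D event has no row at all. No response exists
# for it, not because its probability is low, but because it was never imagined.
```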

"to preserve the distinction which has been drawn between the measurable uncertainty and an unmeasurable one we may use the term “risk” to designate the former and the term “uncertainty” for the latter."

Frank H. Knight

Things get really complicated when we move into the territory of area D. Events in D are characterised by high-risk unknown unknowns, and in D anything is possible. D is the area where ‘black swans’ live; the area where we do not think an event can occur until it does. An example of such an extreme situation is the 9/11 attack on the World Trade Center. That was an unthinkable event. Such an event was not part of any set of response actions or countermeasures (e.g. there are no anti-aircraft missile systems installed on the rooftops of skyscrapers). One would say that of course such military installations would be irrational in Manhattan. On the other hand, if you look at pictures from war-zone areas you might notice anti-aircraft missile systems on the rooftops. For these areas the unthinkable becomes thinkable, and the response of installing such systems on rooftops becomes rational. What is more, the 9/11 Commission Report cites a "failure of imagination" as a reason precautionary measures were not taken against such a destructive event, however unexpected or unlikely it may have seemed.

"Across the government, there were failures of imagination, policy, capabilities, and management."

The 9/11 Commission Report

Reality is essentially what forms and informs our rational behaviour. It is what creates the subjective dichotomy between what is rational and what is irrational. In the case of the SARS-CoV-2 pandemic one can see how the judgement of what was labelled irrational shifted to become rational as the extremity of the pandemic unfolded. Closed stores, home quarantine, millions working from home, mass graves, and so on. The unthinkable became thinkable and probabilities began to be assigned to different scenarios. The two-metre distance from others becomes the new normal, as does wearing a mask. But was the SARS-CoV-2 pandemic truly a so-called ‘black swan’? Not so much. A few researchers attempted to alert the scientific community to the dangers of such a pandemic as far back as 2007. For example, Cheng, Lau, Woo and Yuen wrote in a 2007 article published in Clinical Microbiology Reviews about the risk of a reemergence of SARS and other novel viruses, calling for the need to prepare for it. Health institutions and governments around the world chose to ignore or downplay these risks. This resonates with Kahneman’s finding that people will tend either to ignore highly unlikely events or to overweight them. In the case of the SARS-CoV-2 pandemic one can see this dual ignoring-overweighting effect when looking at the timeline of the spread.

"The possibility of the reemergence of SARS and other novel viruses from animals or laboratories and therefore the need for preparedness should not be ignored"

Cheng, Lau, Woo and Yuen (2007)

However, our rationality-irrationality dichotomy shifts with changes in our environment. Today we see people shouting at those who do not stay home as the quarantine dictates, or who do not keep the two-metre distance and come too close to others. What is normal in one setting might not be normal in another setting. If anything, what is normal is what others around us do. Yet that which was normal yesterday is not normal today, and what is irrational today may well dictate the normality of tomorrow. Essentially, as we shift to a new reality and understanding, area A expands towards infinity in both directions to incorporate increasingly more of areas B, C and D. This has been happening since the beginning of time; that is, what sits in A today was not always there. In eighteenth-century Britain, for example, it was unthinkable for common people to vote in elections. And until recently it was unthinkable for women to drive in some parts of the world. Think of the exploration of space, which went from observing the night sky to landing spacecraft on Mars, or of the high probability, a hundred years ago, of dying from illnesses that are now treatable or preventable by vaccination.

Events like wars and pandemics force us to stretch our imagination, to reassess the risks of improbable events and to reconsider definitions of normality. What is certain is that the economy will need to adjust and adapt to a new normal. Businesses will need to adapt as well. The great shift towards online commerce and online social space now seems more inevitable than ever. But while extreme conditions create hazards for most, they also force greater efforts at innovation and invention. A rethinking of what is of value and what is not becomes necessary, and a rethinking of models, concepts and theories becomes essential.

"Uncle Zeke was known in my Kentucky home town for his wisdom. One day a young friend asked him,

'Uncle Zeke, how come you’re so wise?'

'Because I’ve got good judgment,' the old man replied.

'And where does good judgment come from?' the young friend asked again.

'Good judgment comes from experience, and experience — well, that comes from poor judgment!'"

References

Cheng, V. C., Lau, S. K., Woo, P. C., & Yuen, K. Y. (2007). Severe acute respiratory syndrome coronavirus as an agent of emerging and reemerging infection. Clinical Microbiology Reviews, 20(4), 660-694.

Kahneman, D., (2011). Thinking, fast and slow. Macmillan.

Knight, F. H. (1921). Risk, Uncertainty and Profit. Reprinted 1994, Augustus M. Kelley, New York.

Shiller, R. J. (2001). Irrational Exuberance. Princeton University Press.

Taleb, N. N. (2017). How to Be Rational about Rationality.

Taleb, N. N. (2016). The Logic of Risk Taking.

The 9/11 Commission Report, (2004). https://govinfo.library.unt.edu/911/report/911Report_Exec.htm
