Sunday 23 February 2014

WEF 2013 Report on Global Risks: A Different View


The World Economic Forum 2013 report (available here) discusses a wide variety of global risks, their likelihood and their potential impact. Risk, however, is a problematic concept. It is not related to any physical quantity and does not follow any laws of physics. It is a highly subjective entity, built on another, even more slippery idea: probability. The most popular definition of risk is this:

Risk = Probability of an event X Consequences

The problem with this definition is twofold:

  • Probability is evaluated based either on ensembles of past events or on simulations.
  • The consequences of an event are extremely difficult to estimate.
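
To make the ambiguity concrete, here is a minimal sketch (the numbers are invented, not taken from the WEF report) showing how two very different events can carry exactly the same "risk" under this definition:

```python
# Toy illustration: a frequent/mild event and a rare/severe event
# can yield the same Risk = Probability x Consequences.
event_a = {"probability": 0.8, "consequences": 100}   # frequent, mild
event_b = {"probability": 0.1, "consequences": 800}   # rare, severe

def risk(event):
    return event["probability"] * event["consequences"]

print(risk(event_a))  # 80.0
print(risk(event_b))  # 80.0 -> identical risk, very different events
```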
But even if we have a "perfect" value of probability, its meaning is still difficult to grasp. Imagine two events, A and B. Imagine that the probability of A occurring is 80% and that of B is 70%. What does that mean? What does it really mean? Does it mean that A will occur before B? Does it mean that the consequences of A will be more severe than those of B? Absolutely not. In actual fact, nobody knows what it means. A probability gives no clue as to when, why, or with what consequences an event will happen. Bertrand Russell said in 1927:

"Probability is amongst the most important science, not least because no one understands it"

As to the consequences of adverse events, the situation is similar. Suppose there will be flooding in a certain country next autumn due to heavy rain. Suppose we know it will happen, so the probability is 100%. What will the consequences be? How many homes will be lost? For how long will the territory be without electricity? How many families will need to be relocated? Ten thousand, fifty thousand, half a million? It depends. It depends on so many factors that any guess is as good as any other guess. So, what is the risk? A billion, two billion? How do you make contingency plans for risks which have unknown consequences and which occur with a probability that is, fundamentally, a concept nobody understands? How well such contingency plans work is obvious: every time we witness natural disasters or humanitarian crises, the keywords are impotence, inefficiency, slow response, angered populations, etc. So much for modern Risk Management.
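
A quick Monte Carlo sketch makes the point. Even with the probability pinned at 100%, sampling plausible consequence estimates (all figures below are invented: number of families relocated and a hypothetical cost per family) produces "risk" values spanning two orders of magnitude:

```python
import random

random.seed(42)
probability = 1.0  # "suppose we know it will happen"

# Invented ranges: 10k-500k relocated families, 20k-60k cost per family.
risks = [probability
         * random.uniform(10_000, 500_000)    # families relocated
         * random.uniform(20_000, 60_000)     # cost per family
         for _ in range(10_000)]

print(f"min risk: {min(risks)/1e9:6.2f} billion")
print(f"max risk: {max(risks)/1e9:6.2f} billion")
# roughly 0.2 to 30 billion: "any guess is as good as any other guess"
```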

The WEF produces interesting maps of potential risks, such as this one:


"Probability is amongst the most important science, not least because no one understands it".

Read more: http://www.physicsforums.com
As the WEF report says, "The Global Risks Report 2013 analyses 50 global risks in terms of impact, likelihood and interconnections, based on a survey of over 1000 experts from industry, government and academia". In other words, the maps are based on the subjective views of experts in their respective fields, each using their own established models of analysis, simulation, etc. Clearly, subjective opinions lead to subjective results and conclusions.

A different approach is to adopt an objective, model-free Quantitative Complexity Analysis, using real, objective data from sources such as the CIA World Factbook or the World Bank. Processing such data provides something like this:


The above is a Complexity Map, which relates the various parameters (risks/events) to each other in a format that is easy to grasp and analyze. In fact, the map is also interactive and may be navigated dynamically. Understanding the various relationships and dependencies between parameters is key to understanding how the underlying system really works. This is what structure is all about. Knowledge. No structure, no knowledge, just sparse conclusions.
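
Ontonix's technology is patented and proprietary, so the following is only a crude stand-in for the general idea: extract structure directly from raw data by linking parameters whose pairwise relationship is strong. The indicator names and data below are invented placeholders.

```python
import numpy as np

# Toy "complexity map": link parameters whose pairwise correlation in the
# raw data exceeds a threshold. This is NOT Ontonix's patented method,
# merely a sketch of extracting structure from data without a model.
rng = np.random.default_rng(0)
names = ["GDP growth", "unemployment", "debt/GDP", "energy price", "inflation"]
data = rng.normal(size=(50, len(names)))                    # placeholder data
data[:, 4] = 0.7 * data[:, 3] + 0.3 * rng.normal(size=50)  # inject one link

corr = np.corrcoef(data, rowvar=False)  # variables are columns
for i in range(len(names)):
    for j in range(i + 1, len(names)):
        if abs(corr[i, j]) > 0.5:       # arbitrary threshold
            print(f"{names[i]} <-> {names[j]}  (|r| = {abs(corr[i, j]):.2f})")
```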

However, the most important result is the actual measure of resilience/robustness of the system (as well as its complexity). In the above case we're talking of just over 50%, a mere two stars. The curious thing is that this measure is very much in line with the resilience of the economy, which today is between 50 and 60% - in other words, very fragile.

An equally important result of a Quantitative Complexity Analysis is the ranking of each of the parameters/risks in terms of their footprint (i.e. objective weight) on the system as a whole. In the case in question it looks like this:


In other words, the ranking of parameters is not based on subjective opinions and surveys, nor on statistical or mathematical models; it is based on real, raw data.
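
Continuing the toy sketch above (and, again, this is not the measure Ontonix actually computes), a crude "footprint" ranking can be obtained by scoring each parameter by the total strength of its links to all the others:

```python
# Crude data-driven ranking: a parameter's "footprint" is the sum of the
# absolute correlations linking it to every other parameter. Illustrative only.
footprint = {names[i]: sum(abs(corr[i, j]) for j in range(len(names)) if j != i)
             for i in range(len(names))}
for name, score in sorted(footprint.items(), key=lambda kv: -kv[1]):
    print(f"{name:15s} {score:.2f}")
```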

Our objective is not to analyze Global Risks in detail. What we wish to point out is that when things get really complex, a thousand experts can deliver a thousand opinions, all of which may seem credible and fit the real picture.

The words "resilience", "complexity", "systemic risks" and "systems thinking" are increasingly popular. There are numerous studies and publications on these subjects. This is good. See, for example, the WEF's page on national resilience. However, what these studies have in common is the lack of a quantitative perspective. Complex? How complex? Resilient? How resilient? 10%? 30%? If we don't incorporate a quantitative dimension into these analyses, which are unquestionably valuable, they will inevitably remain in the sphere of subjectivity.

Let us recall the Principle of Fragility, coined by Ontonix in 2005:

Complexity X Uncertainty = Fragility

While we have often applied the principle to businesses and business processes, it can also be applied to the analysis of Global Risks. Clearly, the problem at hand is highly complex. We also agree that every expert has his own opinion: as we have said, a highly complex scenario may be interpreted in a plethora of ways, and depending on which expert we talk to, the answer will be different. So the choice of experts is crucial. Combining, therefore, the complexity of the underlying problem with the uncertainty originating from a multitude of different and subjective opinions, what we ultimately obtain is a fragile result. Handle with care.
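
Read numerically, with both factors normalized to an arbitrary [0, 1] scale (our choice here, purely for illustration), the principle can be sketched as follows:

```python
# Minimal reading of the Principle of Fragility:
# Complexity x Uncertainty = Fragility.
# The [0, 1] normalization and the sample values are illustrative.
def fragility(complexity: float, uncertainty: float) -> float:
    return complexity * uncertainty

# A very complex problem (0.9) read through widely diverging expert
# opinions (0.8) yields a fragile result:
print(fragility(0.9, 0.8))  # 0.72 -> handle with care
```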



www.ontonix.com




"Probability is amongst the most important science, not least because no one understands it"

Read more: http://www.physicsforums.com

Monday 10 February 2014

Solving Extreme Problems.


Extreme problems are very large-scale, multi-disciplinary problems, involving thousands of variables, which cannot be solved using conventional technology. In such situations it is impossible to determine the cause of the problem, not only because of its sheer size but, most importantly, because it is frequently perceived through conventional eyes and distorted by narrow, linear thinking. It is not a matter of computing power or sophisticated mathematical modelling - some things just cannot be modelled.

Examples of extreme problems:

  • Unexpected collapses of critical systems or infrastructures (markets, transportation systems, IT networks, large corporations, etc.)
  • Prolonged states of crisis, inefficiency or frequent system failures (process plants, transportation systems, economies, telephone networks, etc.)
  • Sudden catastrophic collapse (spacecraft, aircraft, software systems, ecosystems, etc.)


Clearly, extreme problems cause extreme consequences and losses.

When it comes to man-made systems, bad design is often the cause. The inability of conventional science to embrace a systems perspective on the one hand, and its neglect of complexity on the other, form an effective barrier to solving extreme problems.

Because in the majority of cases it is excessive and uncontrolled complexity that leads to severe consequences and extreme problems, Ontonix attacks them with its patented, model-free Quantitative Complexity Management technology. In collaboration with supercomputer centers, Ontonix provides radically innovative means of formulating and solving extreme problems. We actually measure complexity, identify its sources, and perform multi-level Complexity Profiling of systems and sub-systems until a solution is found. In huge and complex systems things often go wrong not because some parameters have the wrong value, but because of the countless interactions that may develop. The higher the complexity, the more such interactions may emerge; in other words, a highly complex system possesses a marked capacity to produce surprises.
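
The combinatorics behind that claim are easy to make concrete: the number of potential pairwise interactions grows quadratically with the number of variables, so interaction-driven failure modes quickly outnumber anything that can be inspected by hand. A small illustration:

```python
from math import comb

# Potential pairwise interactions among n variables: n * (n - 1) / 2.
# (Higher-order interactions grow faster still.) Illustrative only.
for n in (10, 100, 1_000, 10_000):
    print(f"{n:6d} variables -> {comb(n, 2):12,d} potential pairwise interactions")
```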

Extreme problems not only pose new challenges; they also stimulate innovative business models. In fact, when Ontonix takes on an extreme problem, it follows this scheme:

  • The client and Ontonix analyze the problem together.
  • The consequences and losses incurred by the client are quantified.
  • As much data as possible is gathered about the problem.
  • Ontonix employs its best efforts to solve the problem.
  • In case of loss reduction/elimination, a percentage of the client's gains is paid to Ontonix.




For more information e-mail us.


www.ontonix.com