Sunday 23 February 2014

WEF 2013 Report on Global Risks: A Different View


The World Economic Forum 2013 report (available here) discusses a wide variety of global risks, their likelihood and potential impact. Risk, however, is a problematic concept. It is not related to any physical quantity and does not follow any laws of physics. It is a highly subjective entity based on another, even more slippery idea: probability. The most popular definition of risk is this:

Risk = Probability of an event X Consequences

The problem with this definition is twofold:

  • Probability is evaluated either based on ensembles of past events or simulations.
  • The consequences of an event are extremely difficult to estimate.
But even if we have a "perfect" value of probability, its meaning is still difficult to grasp. Imagine two events, A and B. Imagine that the probability of A occurring is 80% and that of B is 70%. What does that mean? What does it really mean? Does it mean that A will occur before B? Does it mean that the consequences of A will be more severe than those of B? Absolutely not. In actual fact, nobody knows what it means. A probability gives no clue as to when, why and with what consequences an event will happen. Bertrand Russell said in 1927:

"Probability is amongst the most important science, not least because no one understands it"

As to the consequences of adverse events, the situation is similar. Suppose there will be flooding in a certain country next autumn due to heavy rain. Suppose we know it will happen, so the probability is 100%. What will the consequences be? How many homes will be lost? For how long will the territory be without electricity? How many families will need to be relocated? Ten thousand, fifty thousand, half a million? It depends. It depends on so many factors that any guess is as good as any other. So, what is the risk? A billion, two billion? How do you make contingency plans for risks which have unknown consequences and which occur with a probability that is, fundamentally, a concept nobody understands? How well these contingency plans work is obvious. Every time we witness, for example, natural disasters or humanitarian crises, the keywords are impotence, inefficiency, slow response, angered populations, etc. So much for modern Risk Management.
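The point about unknown consequences can be made concrete with a toy simulation. In the sketch below (with entirely made-up numbers) the flood is taken as certain, yet equally plausible guesses for its cost produce wildly different "risk" figures:

```python
import random

random.seed(42)

# Hypothetical flood scenario: the event is certain (probability = 1.0),
# but the consequences are deeply uncertain. We sample the cost (in
# billions) from a wide but plausible range and watch how far apart
# equally defensible "risk" figures can land.
probability = 1.0
N = 10_000
estimates = [probability * random.uniform(0.5, 5.0) for _ in range(N)]

lo, hi = min(estimates), max(estimates)
print(f"Risk estimates span {lo:.2f} to {hi:.2f} billion "
      f"(a factor of {hi / lo:.1f}x)")
```

With identical assumptions about the event itself, the "risk" varies by an order of magnitude purely because the consequence term is guesswork.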

The WEF produces interesting maps of potential risks, such as this one:


"Probability is amongst the most important science, not least because no one understands it".

Read more: http://www.physicsforums.com
As the WEF report says, "The Global Risks Report 2013 analyses 50 global risks in terms of impact, likelihood and interconnections, based on a survey of over 1000 experts from industry, government and academia". In other words, the maps are based on subjective views of individuals who are experts in their respective fields, who use their own established models of analysis, simulation, etc. Clearly, subjective opinions lead to subjective results and conclusions.

A different approach is to adopt an objective model-free Quantitative Complexity Analysis, using real, objective data from sources such as CIA World Fact Book or the World Bank. Processing such data provides something like this:


The above is a Complexity Map, which relates the various parameters (risks/events) to each other in a format that is easy to grasp and analyze. In fact, the map is also interactive and may be navigated dynamically. Understanding the various relationships and dependencies between parameters is key to understanding how the underlying system really works. This is what structure is all about. Knowledge. No structure, no knowledge, just sparse conclusions.

However, the most important result is the actual measure of resilience/robustness of the system (as well as its complexity). In the above case we're talking of just over 50%, a mere two stars. The curious thing is that this measure is very much in line with the resilience of the economy, which today is between 50 and 60% - in other words, very fragile.

An equally important result of a Quantitative Complexity Analysis is the ranking of each of the parameters/risks in terms of their footprint (i.e. objective weight) on the system as a whole. In the case in question it looks like this:


In other words, the ranking of parameters is not based on subjective opinions and surveys, it is not based on statistical or math models, it is based on real and raw data.

Our objective here is not to analyze Global Risks in detail. What we wish to point out is that when things get really complex, a thousand experts can deliver a thousand opinions, all of which may seem credible and fit the real picture.

The words "resilience", "complexity", "systemic risks", "systems thinking" are increasingly popular. There are numerous studies and publications on these subjects. This is good. See, for example, the WEF's page on national resilience. However, what these studies have in common is lack of a quantitative perspective. Complex? How complex? Resilient? How resilient? 10%, 30%. If we don't incorporate a quantitative dimension into these analyses, which are unquestionably valuable, they will inevitably remain in the sphere of subjectivity.

Let us recall the Principle of Fragility, coined by Ontonix in 2005:

Complexity X Uncertainty = Fragility

While we have often applied the principle to businesses and business processes, it can also be applied to the analysis of Global Risks. Clearly, we are facing a highly complex situation. We also agree that every expert has his own opinion. As we have said, a highly complex scenario may be interpreted in a plethora of ways. Depending on which expert we talk to, the answer will be different. So, the choice of experts is crucial. Combining, therefore, the complexity of the underlying problem with the uncertainty originating from a multitude of different and subjective opinions, what we ultimately obtain is a fragile result. Handle with care.
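As a toy illustration of the principle, the sketch below multiplies a complexity score by the dispersion of expert opinions, treating disagreement as a proxy for uncertainty. The 0-1 scales and all the numbers are our assumptions for illustration, not the Ontonix metric:

```python
from statistics import pstdev

def fragility(complexity: float, expert_opinions: list[float]) -> float:
    """Toy reading of Complexity x Uncertainty = Fragility.

    complexity: a score on an assumed 0-1 scale.
    expert_opinions: each expert's assessment on the same 0-1 scale;
    their dispersion (standard deviation) stands in for uncertainty.
    """
    uncertainty = pstdev(expert_opinions)
    return complexity * uncertainty

# Experts broadly agree -> low fragility despite high complexity.
print(fragility(0.9, [0.50, 0.52, 0.48, 0.51]))
# Experts disagree wildly -> the same complexity yields a fragile result.
print(fragility(0.9, [0.10, 0.90, 0.30, 0.70]))
```

The same complex problem becomes fragile or manageable depending entirely on how much the experts diverge, which is the point of the principle.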



www.ontonix.com




"Probability is amongst the most important science, not least because no one understands it"

Read more: http://www.physicsforums.com

Monday 10 February 2014

Solving Extreme Problems


Extreme problems are very large-scale multi-disciplinary problems, involving thousands of variables, which cannot be solved using conventional technology. In such situations, it is impossible to determine the cause of the problem not only because of its sheer size but, most importantly, because it is frequently perceived through conventional eyes and distorted by narrow and linear thinking. It is not a matter of computing power or sophisticated math modelling - some things just cannot be modelled.

Examples of extreme problems:

  • Unexpected collapses of critical systems or infrastructures (markets, transportation systems, IT networks, large corporations, etc.)
  • Prolonged states of crisis, inefficiency or frequent system failures (process plants, transportation systems, economies, telephone networks, etc.)
  • Sudden catastrophic collapse (spacecraft, aircraft, software systems, ecosystems, etc.)


Clearly, extreme problems cause extreme consequences and losses.

When it comes to man-made systems, bad design is often the cause. The inability of conventional science to embrace the systems perspective on the one hand, and its neglect of complexity on the other, form an effective barrier to solving extreme problems.

Because in the majority of cases it is excessive and uncontrolled complexity that leads to severe consequences and extreme problems, Ontonix attacks them with its patented model-free Quantitative Complexity Management technology. In collaboration with supercomputer centers, Ontonix provides radically innovative means of formulating and solving extreme problems. We actually measure complexity, identify its sources, and perform multi-level Complexity Profiling of systems and sub-systems until a solution is found. In huge and complex systems things often go wrong not because some parameters have the wrong value but because of the countless interactions that may develop. The higher the complexity, the more such interactions may emerge. What this means is that a highly complex system possesses a marked capacity to produce surprises.
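The actual Ontonix measure is patented and not reproduced here; a crude, model-free stand-in for the idea is to count, directly from data, how densely the variables of a system interact. The sketch below (with synthetic data and hypothetical variable names) does exactly that:

```python
import math
import random

random.seed(0)

def pearson(x, y):
    """Plain Pearson correlation between two equal-length series."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

def interaction_density(series: dict, threshold: float = 0.5) -> float:
    """Fraction of variable pairs with a strong mutual dependence -- a crude,
    model-free stand-in for the proprietary complexity measure in the text."""
    names = list(series)
    pairs = [(a, b) for i, a in enumerate(names) for b in names[i + 1:]]
    strong = sum(1 for a, b in pairs
                 if abs(pearson(series[a], series[b])) > threshold)
    return strong / len(pairs)

# Three tightly coupled variables and one independent one (made-up data).
t = [random.gauss(0, 1) for _ in range(200)]
data = {
    "gdp":       [v + random.gauss(0, 0.2) for v in t],
    "exports":   [v + random.gauss(0, 0.2) for v in t],
    "credit":    [v + random.gauss(0, 0.2) for v in t],
    "unrelated": [random.gauss(0, 1) for _ in range(200)],
}
print(interaction_density(data))
```

The coupled trio contributes three strong pairs out of six, so the density lands at 0.5; the denser the web of interactions, the more channels through which surprises can propagate.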

Extreme problems not only pose new challenges. They also stimulate innovative business models. In fact, when Ontonix takes on an extreme problem, the following scheme is followed:

  • The client and Ontonix analyze the problem together.
  • The consequences and losses incurred by the client are quantified.
  • As much data about the problem is gathered as possible.
  • Ontonix employs its best efforts to solve the problem.
  • In case of loss reduction/elimination, a percentage of the client's gains is paid to Ontonix.




For more information e-mail us.


www.ontonix.com


 

Friday 24 January 2014

Health of the EU Economy - Stagnation or Recovery?


Our quarterly analysis of Eurostat's macroeconomic data of the Eurozone for Q3 2013 has now been completed and published. The interactive Business Structure Maps of each member state may be navigated here.

Italy, the UK, Sweden and France have the highest resilience (approximately 80%), while Belgium, Ireland and Spain score a low 60%.

It is interesting to note that, when it comes to the entire region, there are clear indications of a slow but consistent recovery. The evolution of complexity (its increase) in the plot below shows that clearly.




However, complexity remains dangerously close to critical complexity, denoting alarmingly high fragility. This means that the system is very much exposed and incapable of dealing with intense shocks, financial contagion or extreme events. Nevertheless, it is also evident how, based on the available data, we hit the bottom around Q4 2011, i.e. approximately two years ago. As of today, the situation in terms of complexity is comparable to that of Q3 2010. In essence, the overall situation of the Eurozone has not evolved over the last three years. This is in line with an evident lack of reforms and lack of leadership at both EU and country level.

The evolution of resilience follows a similar trend, although it remains alarmingly low.




Finally, it is interesting to note how, in terms of recovery, the 15 core Eurozone states are outpacing the 13 new member states.




While the crisis peaked in Western Europe in Q4 2007, it climaxed approximately one year later in Central and Eastern Europe. In terms of recovery things are different. While the EU15 group touched the bottom in Q1 2011, the EU13 did so in Q3 2012, i.e. 18 months later. What is also clear is that the complexity gradient (higher complexity means a more lively economy) in the case of the EU13 group is substantially lower than that of the EU15. This means that, based on the currently available data, recovery in Central and Eastern Europe will be significantly slower.
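The "complexity gradient" can be read as the slope of a least-squares line fitted through the quarterly complexity values. The sketch below uses made-up numbers purely to illustrate the comparison; the real series comes from the Eurostat-based analysis:

```python
def slope(ys):
    """Least-squares slope of a series sampled at equally spaced quarters."""
    n = len(ys)
    xs = range(n)
    mx, my = (n - 1) / 2, sum(ys) / n
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = sum((x - mx) ** 2 for x in xs)
    return num / den

# Hypothetical complexity readings per quarter (illustrative only).
eu15 = [44.0, 44.6, 45.4, 46.1, 47.0]
eu13 = [38.0, 38.2, 38.5, 38.7, 39.0]

print(f"EU15 gradient: {slope(eu15):.2f} per quarter")
print(f"EU13 gradient: {slope(eu13):.2f} per quarter")
```

Under this reading, a steeper positive gradient means complexity is growing faster, i.e. a more lively economy; the shallower EU13 slope is what suggests a slower recovery.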



www.ontonix.com                                       www.rate-a-business.com






Thursday 23 January 2014

Creating Fragile Monsters and When Failure Is An Option


Imagine the World as one big corporation, offering all sorts of products and services. If one observes the World, and recognises that most (if not all) things tend toward states of greater chaos and fragmentation, it may be difficult to reconcile the two images. For example, the number of countries is increasing. Look at what happened to Yugoslavia, Czechoslovakia or the Soviet Union. Belgium and Spain will probably be next. Even the European Union itself is being questioned by growing numbers of its disillusioned citizens. The bottom line is that the number of players is increasing, and their demands and conflicting interests are going to be very difficult to deal with. There are centrifugal forces everywhere. Well, almost everywhere. In fact, as countries and societies tend to break up, there is an equally clear trend in the opposite direction in the corporate world. Consolidations are creating super-huge conglomerates of corporations and super-banks.

Let's look at consolidation in the US over the last two decades. First the media industry.






And the banking industry.





Such super-huge companies were dubbed "Too Big To Fail" by Stewart McKinney when he served on the Banking, Finance and Urban Affairs Committee in 1984. He was wrong. Super-huge companies and banks have failed, and without early warning. Size, in this case, doesn't matter. The problem, in fact, is not so much size as complexity. Excessive complexity, to be precise. The new paradigm is "Too Complex To Survive". The enemy is excessive complexity. And why?


  • Highly complex systems are intrinsically hazardous systems.

  • Highly complex systems run in degraded mode.

  • Catastrophe is always just around the corner.


If we allow this mega-monster corporation to emerge, we need to be well aware of the three keywords appearing above: hazardous, degraded, catastrophe. Is this the world we want?


Today, complexity can be measured and managed. It is a fundamental and strategic Key Performance Indicator of any modern business. Ontonix is the first and only company to measure and manage complexity. Rationally. Serious science starts when you begin to measure.



www.ontonix.com







Tuesday 21 January 2014

Just How Good Are Minimum-Complexity Portfolios?


In order to showcase the performance of minimum-complexity portfolios versus highly complex ones, two such portfolios have been built with stocks from the Dow Jones Index. A nasty period, which included the Internet Bubble, was chosen: 2000-2004. We compared the performance of both portfolios with that of the index itself over four distinct periods. Here are the results.
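The exact construction used by Assetdyne is proprietary; as a rough stand-in, a "minimum-complexity" basket can be sketched as the subset of stocks with the weakest mutual interdependence. The tickers and return series below are entirely hypothetical:

```python
import itertools
import math
import random

random.seed(1)

def corr(x, y):
    """Pearson correlation between two return series."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    return cov / math.sqrt(sum((a - mx) ** 2 for a in x) *
                           sum((b - my) ** 2 for b in y))

def avg_abs_corr(returns, picks):
    """Average absolute pairwise correlation -- our crude complexity proxy."""
    pairs = list(itertools.combinations(picks, 2))
    return sum(abs(corr(returns[a], returns[b])) for a, b in pairs) / len(pairs)

# Hypothetical daily returns for five tickers: AAA and BBB move together,
# the other three are independent.
common = [random.gauss(0, 0.01) for _ in range(250)]
returns = {
    "AAA": [c + random.gauss(0, 0.002) for c in common],
    "BBB": [c + random.gauss(0, 0.002) for c in common],
    "CCC": [random.gauss(0, 0.01) for _ in range(250)],
    "DDD": [random.gauss(0, 0.01) for _ in range(250)],
    "EEE": [random.gauss(0, 0.01) for _ in range(250)],
}

# Exhaustive search over 3-stock baskets: keep the one with the lowest
# average absolute pairwise correlation.
best = min(itertools.combinations(returns, 3),
           key=lambda picks: avg_abs_corr(returns, picks))
print("lowest-interdependence basket:", sorted(best))
```

Under this proxy, any basket holding both of the coupled stocks scores badly, which captures the intuition that a tightly interdependent portfolio behaves like fewer, bigger bets.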




In terms of numbers we have the following results:




While the Dow has reported a total loss of 4.2% during the entire 4-year period, the high-complexity portfolio produced gains of 6.3% and the low-complexity one an impressive 24.1%. In turbulence, simpler is better.

Minimum-complexity portfolios may be obtained at www.assetdynex.com







Sunday 12 January 2014

Just How Healthy is the US Economy?


We know that 2013 has been a great year for stock markets, US markets in particular. People are openly talking of a "stock market recovery". We also know that the FED has been pumping paper into the system. But what has this really done to the economy, apart from increasing the values of market indices?

Since Nature offers no free lunch (the economy probably doesn't either), printing money must have its consequences. If you make markets rally on steroids, you inevitably end up paying for it somewhere. We claim that such policies create fragility. Hidden fragility. Well, hidden to conventional pre-crisis analytics technology and to those who are concerned with numbers and numbers alone.

Assetdyne analyzes the major US markets every two weeks and publishes the results here. The focus of the analyses is resilience - the capacity to resist impacts, shocks, contagion, extreme events and, ultimately, sustained turbulence. The results are far from exciting, revealing mediocre levels of resilience. Here they are:


NASDAQ 100 - NDX

S&P 100 - OEX

Dow Jones Composite Average - DJA

Dow Jones Industrial Average - DJI

PHLX Semiconductor - SOX


A two to three-star resilience rating. Nothing to celebrate. The S&P 100, in particular, has an alarming two-star (64%) rating. We leave the comments to the readers.

Navigate Interactive Complexity Maps of the indices here. Just click on an index and move the mouse. More soon.



www.assetdyne.com






Monday 6 January 2014

Complexity Science Helps in Early-detection of Fibrillations and Tachyarrhythmia


The main goal of ONTONET-CRT™ is to reduce detection times as well as unnecessary ICD shocks. ONTONET-CRT™ adopts a new model-free technology which does not rely on traditional math models. Instead of conventional analysis, ONTONET-CRT™ processes the EGM and computes its complexity. Sudden fluctuations in complexity generally anticipate events such as fibrillations or tachycardias.

ONTONET-CRT™ processing of EGM data indicates that fluctuations of complexity generally precede tachycardias or fibrillations. This means that it is possible to gain precious time in properly detecting and classifying the event and even preventing it altogether.

Analysis of EGMs shows that in over 80% of cases ONTONET-CRT™ is able to anticipate the commencement of tachycardias and fibrillations by a significant number of heart beats. This opens new avenues in terms of dealing with these events even before they commence.
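The complexity measure inside ONTONET-CRT™ is not public; the sketch below uses a plain rolling standard deviation as a generic stand-in to show the detection logic: track an irregularity index over a sliding window and raise an alarm when it jumps well above its baseline. The synthetic "EGM", the onset point and the thresholds are all our assumptions:

```python
import math
import random

random.seed(7)

def rolling_irregularity(signal, window=50):
    """Rolling standard deviation -- a generic stand-in for the unpublished
    complexity measure computed on the EGM in the text."""
    out = []
    for i in range(window, len(signal) + 1):
        w = signal[i - window:i]
        m = sum(w) / window
        out.append(math.sqrt(sum((v - m) ** 2 for v in w) / window))
    return out

# Synthetic signal: a regular rhythm that turns erratic at sample 600,
# standing in for the onset of a tachyarrhythmia.
signal = [math.sin(0.3 * i) + random.gauss(0, 0.05) for i in range(600)]
signal += [random.gauss(0, 2.0) for _ in range(200)]

irr = rolling_irregularity(signal)
baseline = sum(irr[:500]) / 500  # baseline from the regular-rhythm stretch

# Alarm at the first window whose irregularity jumps well above baseline.
alarm = next(i for i, v in enumerate(irr) if v > 1.5 * baseline)
print("alarm raised near sample", alarm + 50)  # +50 accounts for the window
```

Because only part of the window needs to turn erratic before the index crosses the threshold, the alarm fires shortly after onset, which is the sense in which a complexity-style index can buy detection time measured in beats.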

Below is an example of how a sudden increase in complexity precedes a Ventricular Tachycardia.


Read more here.