Friday 7 March 2014

Innovation in Finance is Possible




The rating, a parameter that reflects the state of health of a company, occupies a central position in the world economy. Users of ratings include private and institutional investors, brokers, traders, analysts and, lately, even politicians. Ratings, which are used essentially to decide which companies to invest in, are free and are published in newspapers and on the web. The process leading to the definition of a company's rating is long and complex, and it has long been the focus of discussion and controversy since it is often paid for by the very companies being rated. Although the conflict of interest is obvious, the largest international agencies - Moody's, Standard & Poor's and Fitch - continue to hold a rating monopoly. Whoever controls the rating agencies - largely companies that manage huge investment funds - holds immense power. Everybody knows that. Everybody continues to use ratings.

We have always maintained that the rating process should be more transparent, objective and, above all, accessible even to the smallest of businesses. With this goal in mind we have launched the world's first 'self-rating' system - Rate-A-Business - which allows anyone to upload data from the financial statements of a company and obtain, in a few seconds, a measure of its state of health. The system works for listed companies as well as for those not present on stock exchanges. In essence, the tool moves the rating process from the agencies to the Internet, making it more "democratic", fast and easily accessible to investors and even to smaller companies. In other words, the rating is transformed from a luxury into a commodity. More than that, it becomes a useful tool in managing a business. This is the philosophy that inspires the Rate-A-Business platform.

How does Rate-A-Business work?

To obtain a rating with Rate-A-Business, whether for a listed company or not, one must have quarterly data from the company's income statement, cash flow statement, balance sheet or ratios. It is advisable to use data for the last 12 quarters (three years). A small example is shown below (the data is fictitious).


 
Once the data has been uploaded, the system processes the numbers, establishes the inter-dependencies between the various entries and measures the overall amount of "chaos" (uncertainty) contained within the data. Data entries with a highly random or chaotic evolution reflect a business that is not predictable and therefore difficult to manage, as in the example below.
 

Clearly, the more entries that behave chaotically, the more vulnerable the company in question. Additionally, if these entries are related to each other, the company is heavily exposed, since a problem with any one of them can spread rapidly throughout the system. The degree of chaos that a company is able to withstand is called "critical complexity". Near this threshold the trends of the various data entries are so uncertain and unpredictable that the company is virtually uncontrollable. If, for example, the balance sheet entries have evolved chaotically, it is easy to imagine similar behaviour in terms of sales, production and, eventually, across the entire enterprise. So, the farther a company functions from its "critical complexity" threshold, the healthier it is. This, in summary, is the spirit of the new rating system available on the Rate-A-Business platform: measure the distance that separates a company from a state of "total chaos".
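To make the idea concrete, here is a toy sketch of such a score computed from a table of quarterly entries: each entry's "chaos" is proxied by the variability of its quarter-to-quarter changes, and the couplings between entries by absolute correlations. The formula and the synthetic data are purely illustrative assumptions, not Ontonix's actual metric.

```python
import numpy as np

def toy_complexity(data):
    """Toy complexity score for a table of quarterly entries.
    data: 2-D array, rows = quarters, columns = financial entries.
    NOT Ontonix's proprietary measure -- an illustrative stand-in."""
    diffs = np.diff(data, axis=0)
    u = diffs.std(axis=0)                        # per-entry "chaos" proxy
    if u.max() > 0:
        u = u / u.max()                          # scale to [0, 1]
    c = np.abs(np.corrcoef(data, rowvar=False))  # coupling strengths
    n = data.shape[1]
    # sum the couplings, weighted by the chaos of the entries they connect
    return sum(c[i, j] * np.sqrt(u[i] * u[j])
               for i in range(n) for j in range(i + 1, n))

rng = np.random.default_rng(0)
quarters, entries = 12, 5
smooth = np.cumsum(np.ones((quarters, entries)), axis=0)  # steady trends
chaotic = rng.normal(size=(quarters, entries))            # erratic entries
print(toy_complexity(smooth) < toy_complexity(chaotic))   # True
```

Feeding in twelve quarters of real financial statement entries instead of the synthetic arrays above would give a comparable, if crude, ranking of companies by their distance from order.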

How does Assetdyne work?
In the case of listed companies, the Assetdyne platform is connected in real time to different stock markets and collects the closing values of different securities. Using a technique similar to auto-correlation, the system assembles a table with those values going back a number of days. The system then measures the distance of these data from a state of "total chaos" (i.e. critical complexity). The more complex the data in question, the more fragile and unpredictable the evolution of the price of those securities. Operating the system is very easy - one enters a ticker symbol and in a matter of seconds the system provides a measure of the complexity and resilience of its daily evolution.
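A minimal sketch of the lag-table step described above, paired with a crude complexity proxy (the entropy of the eigenvalues of the lag-correlation matrix: an ordered series concentrates the spectrum in one eigenvalue, a chaotic series spreads it out). Both the proxy and the synthetic price series are assumptions for illustration, not Assetdyne's actual method.

```python
import numpy as np

def lag_table(prices, lags=5):
    """Assemble a table whose column k is the close series shifted by k
    days -- the 'technique similar to auto-correlation' from the text."""
    n = len(prices) - lags
    return np.column_stack([prices[k:k + n] for k in range(lags + 1)])

def eigen_entropy(table):
    """Crude complexity proxy: Shannon entropy of the normalized
    eigenvalues of the lag-correlation matrix (illustrative only)."""
    w = np.linalg.eigvalsh(np.corrcoef(table, rowvar=False))
    p = np.clip(w, 1e-12, None)
    p = p / p.sum()
    return float(-(p * np.log(p)).sum())

rng = np.random.default_rng(1)
trend = np.linspace(100.0, 120.0, 200)            # smooth, predictable closes
noisy = 100.0 + rng.normal(scale=5.0, size=200)   # erratic closes
print(eigen_entropy(lag_table(trend)) < eigen_entropy(lag_table(noisy)))  # True
```

With real closing prices in place of `trend` and `noisy`, the same proxy would rank tickers from orderly to chaotic daily evolution.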

As mentioned, high complexity implies chaos. As complexity approaches "critical complexity", the dynamics of the price per share become uncertain and therefore less resilient. To give an example, think of cholesterol, whose value should be kept at a certain distance from a maximum established by our physician. Near this threshold, our health is at risk. The situation with stocks and their dynamics is similar. The more complex (chaotic) the trend, the greater the possibility of surprises and, therefore, the level of exposure. Complexity therefore provides a new measure of volatility (variability) which is not based on the conventional concepts of normality or linear correlation, both of which are questionable in a highly turbulent regime.


Resilience, on the other hand, which measures the ability to absorb shocks and extreme events, ranges from 0% to 100%. The closer it gets to 100%, the more predictable and stable the situation. Low values of resilience point to situations which are chaotic and hence difficult to predict. Two examples of complexity and resilience ratings of systemic European banks are illustrated below:


Intesa Sanpaolo (ISPM.MI)
previous close: 1.83
stock complexity: 12.48
stock resilience: 85.12%


 
Credit Suisse Group (CS)
previous close: 31.05
stock complexity: 28.91
stock resilience: 73.78%


 

Portfolio ratings
Assetdyne’s rating platform is also applicable to portfolios of securities. Given that the computation of portfolio complexity is based on end-of-day closing prices, its value changes on a daily basis. It is, ultimately, a high-frequency portfolio rating system. It should be remembered that traditional ratings are generally issued once a year, when companies publish their consolidated financial statements. Given the speed of today's economy, this may be inadequate.

One of the objectives of Assetdyne’s rating system is to indicate which securities or products make a portfolio highly complex and should therefore be avoided by less experienced investors. The surprising thing is that, although we all know that highly complex products are risky, no one has ever measured their actual complexity. Assetdyne does just that - for the first time, we measure the complexity of stocks, portfolios of shares and other financial products.

As a final comment, one might conclude that stock markets constitute a huge social network in which millions of people participate in a global "game" called trading. One of the results of this game is the real-time share price of every listed company, which immediately affects the world’s economy. It is important, therefore, that every participant be aware of those financial products which hide high complexity, the most formidable source of fragility and risk.

*Assetdyne has developed a new rating system for listed companies, seeking to make ratings as objective as possible by addressing the problem of potentially unreliable financial statements. This system is based on the concept of 'crowd-rating' and has its roots in the stock market.
The value of the shares of a company is the result of a complex interaction of millions of traders, analysts, investors, trading robots, etc. Ultimately, it is a reflection of the reputation and the perceived value of a company and is the result of a collective and 'democratic' process. Clearly, the value of a security is also driven by market trends, industry analysis, rumors, insider trading and other illegal practices and, of course, by the rating agencies. However, undeniably, it is the millions of investors who ultimately drive the price and the dynamics of the securities in accordance with the fundamental principles of supply and demand.

Assetdyne uses information about the daily value and the dynamics of a company's stock price to actually calculate its rating. A rating calculated in this manner does not reflect the probability of default (i.e. the probability of bankruptcy) of a particular company - this is what a traditional rating produces, an AAA, BBB or CCC for instance - it reflects the complexity (degree of chaos) of the dynamics of its stock. This is very important for a number of reasons. Stocks with very complex dynamics are far more unpredictable than those with simpler dynamics. Highly complex and volatile dynamics are able to surprise investors, very often at the worst moment in time. It so happens that our economy and stock markets are not only very turbulent and chaotic, but also extremely complex. A rating based on the complexity of the stocks of listed companies reflects, therefore, the hallmark of our times - complexity.
 
 
 
 
 
 

Sunday 23 February 2014

WEF 2013 Report on Global Risks: A Different View


The World Economic Forum 2013 report (available here) discusses a wide variety of global risks, their likelihood and potential impact. Risk, however, is a problematic concept. It is not related to any physical quantity and does not follow any laws of physics. It is a highly subjective entity based on another, even more slippery idea: probability. The most popular definition of risk is this:

Risk = Probability of an event X Consequences

The problem with this definition is twofold:

  • Probability is evaluated either based on ensembles of past events or simulations.
  • The consequences of an event are extremely difficult to estimate.
But even if we have a "perfect" value of probability, its meaning is still difficult to grasp. Imagine two events, A and B. Imagine that the probability of A occurring is 80% and that of B is 70%. What does that mean? What does it really mean? Does it mean that A will occur before B? Does it mean that the consequences of A will be more severe than those of B? Absolutely not. In actual fact, nobody knows what it means. A probability gives no clue as to when, why or with what consequences an event will happen. Bertrand Russell said in 1927:

"Probability is amongst the most important science, not least because no one understands it"

As to the consequences of adverse events, the situation is similar. Suppose there will be flooding in a certain country next autumn due to heavy rain. Suppose we know it will happen, so the probability is 100%. What will the consequences be? How many homes will be lost? For how long will the territory be without electricity? How many families will need to be relocated? Ten thousand, fifty thousand, half a million? It depends. It depends on so many factors that any guess is as good as any other. So, what is the risk? A billion, two billion? How do you make contingency plans for risks which have unknown consequences and which occur with a probability that is, fundamentally, a concept nobody understands? How well these contingency plans work is obvious. Every time we witness natural disasters or humanitarian crises, the keywords are impotence, inefficiency, slow response, angered populations, etc. So much for modern Risk Management.
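The argument can be made concrete with a small hypothetical simulation: two events engineered to have essentially the same textbook risk value (probability times consequences) can still have radically different worst-case outcomes, so the single risk number hides most of what matters.

```python
import random

random.seed(42)

def simulate(prob, loss_draw, trials=100_000):
    """Mean and worst-case simulated loss for an event occurring with
    probability `prob`; `loss_draw` samples a consequence when it does."""
    losses = [loss_draw() if random.random() < prob else 0.0
              for _ in range(trials)]
    return sum(losses) / trials, max(losses)

# Event A: 80% probability, consequence known exactly (risk = 0.8 x 100 = 80)
mean_a, worst_a = simulate(0.8, lambda: 100.0)
# Event B: 70% probability, consequence highly uncertain, but with the same
# textbook risk on average (0.7 x 114.3 = 80)
mean_b, worst_b = simulate(0.7, lambda: random.uniform(0.0, 228.6))

print(round(mean_a), round(mean_b))  # nearly identical "risk"...
print(worst_a, round(worst_b))       # ...very different worst cases
```

Both events score the same under Risk = Probability x Consequences, yet a contingency plan sized for event A would be overwhelmed by event B's worst case.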

The WEF produces interesting maps of potential risks, such as this one:


As the WEF report says, "The Global Risks Report 2013 analyses 50 global risks in terms of impact, likelihood and interconnections, based on a survey of over 1000 experts from industry, government and academia". In other words, the maps are based on the subjective views of individuals who are experts in their respective fields and who use their own established models of analysis, simulation, etc. Clearly, subjective opinions lead to subjective results and conclusions.

A different approach is to adopt an objective model-free Quantitative Complexity Analysis, using real, objective data from sources such as CIA World Fact Book or the World Bank. Processing such data provides something like this:


The above is a Complexity Map, which relates the various parameters (risks/events) to each other in a format that is easy to grasp and analyze. In fact, the map is also interactive and may be navigated dynamically. Understanding the various relationships and dependencies between parameters is key to understanding how the underlying system really works. This is what structure is all about: knowledge. No structure, no knowledge, just sparse conclusions.

However, the most important result is the actual measure of resilience/robustness of the system (as well as its complexity). In the above case we're talking of just over 50%, a mere two stars. The curious thing is that this measure is very much in line with the resilience of the economy, which today is between 50 and 60% - in other words, very fragile.

An equally important result of a Quantitative Complexity Analysis is the ranking of each of the parameters/risks in terms of their footprint (i.e. objective weight) on the system as a whole. In the case in question it looks like this:


In other words, the ranking of parameters is not based on subjective opinions and surveys, it is not based on statistical or math models, it is based on real and raw data.

Our objective is not to analyze Global Risks in detail. What we wish to point out is that when things get really complex, a thousand experts can deliver a thousand opinions, all of which may seem credible and fit the real picture.

The words "resilience", "complexity", "systemic risks" and "systems thinking" are increasingly popular. There are numerous studies and publications on these subjects. This is good. See, for example, the WEF's page on national resilience. However, what these studies have in common is the lack of a quantitative perspective. Complex? How complex? Resilient? How resilient? 10%? 30%? If we don't incorporate a quantitative dimension into these analyses, which are unquestionably valuable, they will inevitably remain in the sphere of subjectivity.

Let us recall the Principle of Fragility, coined by Ontonix in 2005:

Complexity X Uncertainty = Fragility

While we have often applied the principle to businesses and business processes, it can also be applied to the analysis of Global Risks. Clearly, we are facing a highly complex situation, and every expert has his own opinion. As we have said, a highly complex scenario may be interpreted in a plethora of ways: depending on which expert we talk to, the answer will be different, so the choice of experts is crucial. Combining the complexity of the underlying problem with the uncertainty originating from a multitude of different and subjective opinions, what we obtain is, ultimately, a fragile result. Handle with care.
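The principle can be written down as a one-line function; the normalization of both factors to [0, 1] is an illustrative convention of this sketch, not part of the principle itself.

```python
def fragility(complexity, uncertainty):
    """Ontonix Principle of Fragility: Fragility = Complexity x Uncertainty.
    Both inputs are assumed normalized to [0, 1] (an illustrative choice)."""
    if not (0.0 <= complexity <= 1.0 and 0.0 <= uncertainty <= 1.0):
        raise ValueError("inputs are assumed normalized to [0, 1]")
    return complexity * uncertainty

# A very complex problem (0.9) analyzed through many conflicting expert
# opinions (uncertainty 0.8) yields a fragile result:
print(round(fragility(0.9, 0.8), 2))  # 0.72
```

Note that fragility vanishes if either factor does: a simple problem tolerates conflicting opinions, and a complex one tolerates consensus.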



www.ontonix.com





Monday 10 February 2014

Solving Extreme Problems.


Extreme problems are very large-scale, multi-disciplinary problems, involving thousands of variables, which cannot be solved using conventional technology. In such situations it is impossible to determine the cause of the problem, not only because of its sheer size but, most importantly, because it is frequently perceived through conventional eyes and distorted by narrow, linear thinking. It is not a matter of computing power or sophisticated mathematical modelling - some things just cannot be modelled.

Examples of extreme problems:

  • Unexpected collapses of critical systems or infrastructures (markets, transportation systems, IT networks, large corporations, etc.)
  • Prolonged states of crisis, inefficiency or frequent system failures (process plants, transportation systems, economies, telephone networks, etc.)
  • Sudden catastrophic collapse (spacecraft, aircraft, software systems, ecosystems, etc.)


Clearly, extreme problems cause extreme consequences and losses.

When it comes to man-made systems, bad design is often the cause. The inability of conventional science to embrace the systems perspective on the one hand, and its neglect of complexity on the other, form an efficient barrier to solving extreme problems.

Because in the majority of cases it is excessive and uncontrolled complexity that leads to severe consequences and extreme problems, Ontonix attacks them with its patented model-free Quantitative Complexity Management technology. In collaboration with supercomputer centers, Ontonix provides radically innovative means of formulating and solving extreme problems. We actually measure complexity, identify its sources and perform multi-level Complexity Profiling of systems and sub-systems until a solution is found. In huge and complex systems, things often go wrong not because some parameters have the wrong value but because of the countless interactions that may develop. The higher the complexity, the more such interactions may emerge. This means that, thanks to high complexity, the system in question possesses a marked capacity for producing surprises.

Extreme problems not only pose new challenges. They also stimulate innovative business models. In fact, when Ontonix takes on an extreme problem, the following scheme is followed:

  • The client and Ontonix analyze the problem together.
  • The consequences and losses incurred by the client are quantified.
  • As much data about the problem is gathered as possible.
  • Ontonix employs its best efforts to solve the problem.
  • In case of loss reduction/elimination, a percentage of the client´s gains are paid to Ontonix.




For more information e-mail us.


www.ontonix.com


 

Friday 24 January 2014

Health of the EU Economy - Stagnation or Recovery?


Our quarterly analysis of Eurostat's macroeconomic data of the Eurozone for Q3 2013 has now been completed and published. The interactive Business Structure Maps of each member state may be navigated here.

Italy, together with the UK, Sweden and France have the highest resilience (approximately 80%), while Belgium, Ireland, and Spain score a low 60%.

It is interesting to note that, when it comes to the entire region, there are clear indications of a slow but consistent recovery. The evolution of complexity (its increase) in the plot below shows that clearly.




However, complexity remains dangerously close to critical complexity, denoting alarmingly high fragility. This means that the system is very much exposed and incapable of dealing with intense shocks, financial contagion or extreme events. Nevertheless, it is also evident that, based on the available data, we hit bottom around Q4 2011, i.e. approximately two years ago. As of today, the situation in terms of complexity is comparable to that of Q3 2010. In essence, the overall situation of the Eurozone has not evolved over the last three years. This is in line with an evident lack of reforms and lack of leadership at both EU and country level.

The evolution of resilience follows a similar trend, although it is still alarmingly low.




Finally, it is interesting to notice how, in terms of recovery, the 15 core Eurozone states are outpacing the 13 new member states.




While the crisis peaked in Western Europe in Q4 2007, it climaxed approximately one year later in Central and Eastern Europe. In terms of recovery things are different. While the EU15 group touched the bottom in Q1 2011, the EU13 did so in Q3 2012, i.e. 18 months later. What is also clear is that the complexity gradient (higher complexity means a more lively economy) in the case of the EU13 group is substantially lower than that of the EU15. This means that, based on the currently available data, recovery in Central and Eastern Europe will be significantly slower.



www.ontonix.com                                       www.rate-a-business.com






Thursday 23 January 2014

Creating Fragile Monsters and When Failure Is An Option


Imagine the World as one big corporation, offering all sorts of products and services. If one observes the World and recognises that most (if not all) things tend toward states of greater chaos and fragmentation, it may be difficult to reconcile the two images. For example, the number of countries is increasing. Look at what happened to Yugoslavia, Czechoslovakia or the Soviet Union. Belgium and Spain will probably be next. Even the European Union itself is being questioned by growing numbers of its disillusioned citizens. The bottom line is that the number of players is increasing, and their demands and conflicting interests are going to be very difficult to deal with. There are centrifugal forces everywhere. Well, almost everywhere. In fact, while countries and societies tend to break up, there is an equally clear trend in the opposite direction in the corporate world. Consolidations are creating super-huge conglomerates of corporations and super-banks.

Let's look at consolidation in the US over the last two decades. First the media industry.






And the banking industry.





Such super-huge companies were famously labelled "Too Big To Fail" by Stewart McKinney when he served on the Banking, Finance and Urban Affairs Committee in 1984. He was wrong. Super-huge companies and banks have failed, and without early warning. Size, in this case, doesn't matter. The problem, in fact, is not so much size as complexity - excessive complexity, to be precise. The new paradigm is "Too Complex To Survive". The enemy is excessive complexity. And why?


  • Highly complex systems are intrinsically hazardous systems.

  • Highly complex systems run in degraded mode.

  • Catastrophe is always just around the corner.


If we allow this mega-monster corporation to emerge, we need to be well aware of the three keywords appearing above: hazardous, degraded, catastrophe. Is this the world we want?


Today, complexity can be measured and managed. It is a fundamental and strategic Key Performance Indicator of any modern business. Ontonix is the first and only company to measure and manage complexity. Rationally. Serious science starts when you begin to measure.



www.ontonix.com







Tuesday 21 January 2014

Just How Good Are Minimum-Complexity Portfolios?


In order to showcase the performance of minimum-complexity portfolios versus highly complex ones, two such portfolios have been built with stocks from the Dow Jones Index. A nasty period, which included the Internet Bubble, was chosen: 2000-2004. We compared the performance of both portfolios with that of the index itself over four distinct periods. Here are the results.




In terms of numbers we have the following results:




While the Dow has reported a total loss of 4.2% during the entire 4-year period, the high-complexity portfolio produced gains of 6.3% and the low-complexity one an impressive 24.1%. In turbulence, simpler is better.

Minimum-complexity portfolios may be obtained at www.assetdynex.com







Sunday 12 January 2014

Just How Healthy is the US Economy?


We know that 2013 has been a great year for stock markets, US markets in particular. People are openly talking of a "stock market recovery". We also know that the FED has been pumping paper into the system. But what has this really done to the economy, apart from increasing the values of market indices?

Since Nature offers no free lunch (the economy probably doesn't either), printing money must have its consequences. If you make markets rally on steroids, you inevitably end up paying for it somewhere. We claim that such policies create fragility. Hidden fragility - hidden, that is, from conventional pre-crisis analytics technology and from those who are concerned with numbers and numbers alone.

Assetdyne analyzes the major US markets every two weeks and publishes the results here. The focus of the analyses is resilience - the capacity to resist impacts, shocks, contagion, extreme events and, ultimately, sustained turbulence. The results are far from exciting, revealing mediocre levels of resilience. Here they are:


NASDAQ 100 - NDX

S&P 100 - OEX

Dow Jones Composite Average - DJA

Dow Jones Industrial Average - DJI

PHLX Semiconductor - SOX


A two to three-star resilience rating. Nothing to celebrate. The S&P 100, in particular, has an alarming two-star (64%) rating. We leave the comments to the readers.

Navigate Interactive Complexity Maps of the indices here. Just click on an index and move the mouse. More soon.



www.assetdyne.com






Monday 6 January 2014

Complexity Science Helps in Early-detection of Fibrillations and Tachyarrhythmia


The main goal of ONTONET-CRT™ is to reduce detection times as well as unnecessary ICD shocks. ONTONET-CRT™ adopts new model-free technology which does not rely on traditional mathematical models. Instead of conventional analysis, ONTONET-CRT™ processes the EGM and computes its complexity. Sudden fluctuations of this complexity generally anticipate events such as fibrillations or tachycardias.

ONTONET-CRT™ processing of EGM data indicates that fluctuations of complexity generally precede tachycardias or fibrillations. This means that it is possible to gain precious time in properly detecting and classifying the event and even preventing it altogether.

Analyses of EGMs show that in over 80% of cases ONTONET-CRT™ is able to anticipate the onset of tachycardias and fibrillations by a significant number of heart beats. This opens new avenues in terms of dealing with these events even before they commence.
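The mechanism can be illustrated on synthetic data. The rolling complexity proxy below (standard deviation of the absolute beat-to-beat differences in a sliding window) is a crude stand-in, not ONTONET-CRT™'s proprietary measure, and the "EGM" is a simulated signal: a regular rhythm that develops a small irregularity shortly before degenerating into a chaotic episode.

```python
import numpy as np

def rolling_complexity(signal, window=50):
    """Crude complexity proxy: std of the absolute first differences
    in a sliding window (illustrative stand-in only)."""
    d = np.abs(np.diff(signal))
    return np.array([d[i:i + window].std() for i in range(len(d) - window)])

rng = np.random.default_rng(7)
t = np.arange(1200)
egm = np.sin(2 * np.pi * t / 60.0) + 0.05 * rng.normal(size=t.size)  # regular rhythm
egm[1000:] = rng.normal(scale=2.0, size=200)       # chaotic episode (the 'event')
egm[900:1000] += 0.4 * rng.normal(size=100)        # irregular prelude before it

c = rolling_complexity(egm)
quiet = c[:800].mean()       # regular rhythm
prelude = c[880:950].mean()  # windows covering the pre-event irregularity
event = c[1000:].mean()      # the event itself
print(quiet < prelude < event)  # complexity rises BEFORE the event
```

The ordering of the three means shows the point being made in the text: the complexity signal climbs during the prelude, giving warning before the event proper.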

Below is an example of how a sudden increase in complexity precedes a Ventricular Tachycardia.


Read more here.









Sunday 5 January 2014

Casino Capitalism: Legitimizing the Derivatives Soup.




In 2000, the Commodity Futures Modernization Act (CFMA) was passed, legitimizing swap agreements and other hybrid instruments - a massive move towards deregulation that ended regulatory oversight of derivatives and leverage and turned Wall Street, more than ever, into a casino. At the same time, the first Internet-based commodities transaction system was created to let companies trade energy and other commodity futures unregulated, effectively licensing pillage and fraud. (Enron took full advantage of this until it all imploded.) Further, it launched a menu of options, binary options, forwards, swaps, warrants, leaps, baskets, swaptions and unregulated credit derivatives like the now infamous credit default swaps, facilitating out-of-control speculation.

This deregulatory madness caused unprecedented fraud, insider trading, misrepresentation, Ponzi schemes, false accounting, obscenely high salaries and bonuses, bilking investors, customers and homeowners, as well as embezzling and other forms of theft, including loans designed to fail, clear conflicts of interest, lax enforcement of remaining regulatory measures, market manipulation and fraudulent financial products and massive public deception.

This slicing and dicing of supposedly risk-reducing derivative securities is still going on, creating a time bomb waiting to explode with catastrophic consequences. According to the latest BIS statistics on OTC derivatives markets, a whopping $693 trillion was outstanding at the end of June 2013. That is roughly ten times the GDP of the entire world, and equivalent to about $100,000 for each of the 7 billion inhabitants of our planet.
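The quoted figures are easy to sanity-check; the world GDP value below is an assumed round number of roughly the right magnitude for 2013, not a BIS figure.

```python
# Back-of-the-envelope check of the BIS notional figure quoted above.
notional = 693e12      # $693 trillion OTC notional, end of June 2013 (BIS)
population = 7e9       # ~7 billion people
world_gdp = 70e12      # assumed rough world GDP for 2013

print(round(notional / population))    # 99000 -- about $100,000 per person
print(round(notional / world_gdp, 1))  # roughly ten times world GDP
```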

The complexity and high potential risk associated with derivatives require innovative risk assessment procedures and strong technical knowledge. There are tools to measure and monitor the complexity of these financial products. One can be found here.

With this innovative tool you can classify, rank and rate the complexity and resilience of derivatives, and establish maximum allowable levels of complexity and minimum allowable resilience. Products with low resilience contribute to making the system (economy) more fragile.  Once the most complex (dangerous) derivatives have been identified, they should be withdrawn progressively from circulation.


Submitted by Hans van Hoek



www.assetdyne.com
 

 
 
 

Saturday 4 January 2014

NASDAQ 100 Resilience Rating Analysis - January 2014, (1/2)

The first of two fortnightly NASDAQ 100 Resilience Rating Analysis reports in January 2014 is now available for downloading here.

The second January 2014 report will be available after 15 January.

The NASDAQ 100 Resilience Rating Analysis provides a ranking of the 100 stocks composing the index based on stock complexity and resilience. The report is offered free of charge.

Reports can be generated on a daily basis or in real-time. For more information contact us.








Which is the Most Reliable and Trustworthy Rating Agency?


One can never really trust a third party 100%. A lot has been written about the unreliability, lack of transparency and conflicts of interest of the Big Three Credit Rating Agencies. And yet, the entire economy uses and depends on ratings. It's a bit like those who smoke knowing that smoking causes cancer.

Even though the rating agencies have been said to be the "key enablers of the financial meltdown" ratings are necessary. Sure, ratings are necessary but they must be reliable and trustworthy. Because nobody is really 100% transparent and 100% independent, the term "reliable rating agency" sounds like an oxymoron. A radically new approach is needed.

The only person you trust 100% is yourself. So, if you want a 100% reliable rating, you must do it yourself. This is why we have built the "Rate-A-Business" platform, so that you can rate any business yourself. This is how it works:

1. If you want to rate a publicly listed company, you download its financials from its website and you process them at www.rate-a-business.com

2. If you want to rate your own business, you already have the financials. You use data you trust. You trust the result.


In the first case we still have the problem of trusting the financials that public companies post on their Investor Relations pages. But, at least, the mechanism Rate-A-Business uses to rate those numbers remains the same. For everyone. All the time.

Ratings must be democratised. This means they must be in the hands of those who use them. They must become a commodity. Not a means of deception.



www.ontonix.com




Thursday 2 January 2014

Manipulation



Wall Street claims markets move randomly, reflecting the collective wisdom of investors. The truth is quite the opposite. The government’s visible hand and insiders control them, manipulating them up or down for profit - all of them, including stocks, bonds, commodities and currencies. The public is none the wiser.

It’s brazen financial fraud, like the pump-and-dump practice, defined as “artificially inflating the price of a stock or other security through promotion, in order to sell at the inflated price, then profit more on the downside by short-selling”. This practice is illegal under securities law, yet it is particularly common, and in today’s volatile markets it occurs daily to one degree or another. My career on Wall Street started out like this, in the proverbial "boiler room".

A company’s stock price and its true worth can be highly divergent. In other words, healthy or sick firms may be way over- or undervalued depending on market and economic conditions and on how manipulative traders wish to price them, short or longer term. During a trading frenzy a stock price increases, and so the capitalization of a company is suddenly greater than it was just a few minutes or hours before. What nonsense that is!

The idea that equity prices reflect true value, or that markets move randomly (up or down), is nonsense. They never have and, more than ever, don't now. It is therefore crucial to circumvent the usual analysis hype, look at a company and use its risk and complexity as a primary analysis tool. There is no manipulation here: the data gives the company, stock or portfolio a face, and it is not a poker face. The system developed by Assetdyne allows users to compute the Resilience Rating and Complexity of single stocks, stock portfolios, derivatives and other financial products.

Hans van Hoek
Partner at Assetdyne



www.assetdyne.com