Friday, 13 December 2013
Monday, 9 December 2013
Assetdyne offers an online tool which allows users to measure the complexity and resilience of a single security or of an entire portfolio. The tool is connected in real time to the US markets and allows users to monitor any security listed there.
The tool allows investors to answer the following questions:
- How complex is a portfolio?
- How complex is a financial product, such as a derivative?
- What is the maximum complexity a portfolio can reach?
- How resilient is it? How well can it resist market turbulence?
- What does the portfolio structure look like?
- How interdependent are the stocks composing the portfolio?
- Which stocks actually dominate portfolio dynamics?
- How well can the dynamics of the portfolio be predicted?
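As an illustration of the kind of analysis behind questions like these, a simple correlation-based portfolio map can be sketched as follows. The actual Assetdyne metrics are proprietary; everything below - the tickers, the synthetic returns, the 0.5 correlation threshold - is an assumed, made-up example, not the tool's method.

```python
import numpy as np

rng = np.random.default_rng(7)

# Synthetic daily returns for 6 hypothetical tickers (252 trading days).
# The first three stocks load strongly on a shared market factor, the
# last three only weakly - so only the first three become interdependent.
tickers = ["AAA", "BBB", "CCC", "DDD", "EEE", "FFF"]
loadings = np.array([1.5, 1.5, 1.5, 0.3, 0.3, 0.3])
common = rng.normal(0, 0.01, 252)                      # shared market factor
returns = common[:, None] * loadings + rng.normal(0, 0.01, (252, 6))

# A crude "portfolio map": link two stocks when |correlation| > 0.5
corr = np.corrcoef(returns.T)
adj = (np.abs(corr) > 0.5) & ~np.eye(6, dtype=bool)

n = len(tickers)
density = adj.sum() / (n * (n - 1))    # fraction of possible links present
degree = adj.sum(axis=0)               # how interdependent each stock is
dominant = tickers[int(degree.argmax())]
print(f"map density: {density:.0%}, most interconnected stock: {dominant}")
```

The map's density and node degrees give rough answers to "how interdependent are the stocks" and "which stocks dominate portfolio dynamics"; a real Complexity Map replaces the correlation threshold with Ontonix's own complexity measure.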
See examples of portfolios (and their complexities) - click on a portfolio to open an interactive Portfolio Complexity Map:
Oil & gas
Tuesday, 3 December 2013
OntoMed, a privately-owned company developing leading-edge complexity-based technology and solutions for applications in medicine, releases OntoNet-Cardio, an advanced algorithm which processes EGMs (electrograms) and provides early warnings of imminent tachycardia and/or fibrillation. "The algorithm does not detect tachycardias or fibrillations; it anticipates them by identifying pre-event conditions as early as 15-20 seconds before they actually happen," said Dr. J. Marczyk, the CEO of OntoMed. "The principle on which OntoNet-Cardio functions has been verified in a multitude of fields and applications and is based on the sudden variations of complexity which precede traumatic events," he added. "We are open to partnerships and collaborations with ICD/pacemaker manufacturers who are interested in incorporating our technology in their products," he concluded.
The image below illustrates how the system functions.
Read full Press Release here.
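The general idea - a sudden variation of a complexity measure preceding a traumatic event - can be sketched on a synthetic signal. OntoNet-Cardio's actual measure is proprietary; here a rolling standard deviation of signal increments stands in for it, and all signal parameters are assumed for illustration only.

```python
import numpy as np

rng = np.random.default_rng(1)
fs = 100                       # assumed sampling rate, samples per second
t = np.arange(0, 60, 1 / fs)

# Synthetic "electrogram": a regular rhythm that destabilises at t = 40 s
signal = np.sin(2 * np.pi * 1.3 * t)
signal[t >= 40] += rng.normal(0, 0.8, (t >= 40).sum())

# Stand-in complexity measure: rolling mean of absolute increments
win = 2 * fs                   # 2-second window
inc = np.abs(np.diff(signal))
measure = np.convolve(inc, np.ones(win) / win, mode="valid")

# Raise an alarm when the measure jumps well above a baseline
# learned on the first 20 seconds of the recording
baseline = measure[: 20 * fs]
threshold = 3 * baseline.mean()
alarm_idx = int(np.argmax(measure > threshold))
t_alarm = (alarm_idx + win) / fs
print(f"alarm raised at t = {t_alarm:.1f} s")
```

The alarm fires shortly after the destabilisation begins; in the real system the claim is that complexity shifts 15-20 seconds *before* the arrhythmia itself, which this toy signal does not model.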
Monday, 25 November 2013
Models are simplified representations or emulators of reality. They are typically based on empirical data collected by conducting experiments. Numerous techniques exist for building models from raw field data. An example of a linear model, built from 2-dimensional data, is illustrated below.
The model in this simple case is a straight line. It may be used, for example, to compute the value of one variable when the other is given. This is particularly useful for values which fall between the raw data points used to build the model.
A more interesting case is illustrated below. Here too the data lie on a straight line. However, there is a void between the two groups of data points, and this poses a problem. In cases such as this one, a single model is typically built, passing through both groups of points. In other words, a most dangerous assumption is made: that of continuity. This is a very common mistake. The mere fact that all the points in question lie on a straight line does not guarantee that the line still constitutes a valid model in the void between the two groups.
The mistake often proves fatal, and all the more so when the domain one wishes a particular model to embrace is more intricate. In the case above, there should be two local models, not one.
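The trap is easy to reproduce. In the sketch below (entirely synthetic data, with an assumed true relationship y = 2x + 1), a single global line fits two separated groups of points almost perfectly - yet any prediction inside the void between them rests purely on the continuity assumption:

```python
import numpy as np

rng = np.random.default_rng(42)

# Two groups of points on the same straight line, separated by a void
# between x = 4 and x = 8 where no data were ever collected.
x = np.concatenate([np.linspace(0, 4, 20), np.linspace(8, 12, 20)])
y = 2.0 * x + 1.0 + rng.normal(0.0, 0.05, x.size)

# A single global model fits the observed data extremely well...
slope, intercept = np.polyfit(x, y, 1)
max_residual = np.abs(y - (slope * x + intercept)).max()
print(f"slope={slope:.2f} intercept={intercept:.2f} "
      f"max residual={max_residual:.3f}")

# ...yet not a single observation supports the model inside the void:
# using it there is an act of faith in continuity, not validation.
in_void = (x > 4) & (x < 8)
print("observations inside the void:", int(in_void.sum()))
```

The goodness-of-fit statistics say nothing about the void; only data collected there could validate (or refute) the single-model assumption.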
Models should be used with caution and ONLY in the domains where they have been validated. Extending the use of a model beyond such a domain has consequences which grow with the model's complexity. Moreover, when a model's applicability is stretched, the effects are very seldom actually measured. The equations might still work, but that does not guarantee that the model isn't violating some basic laws or rules.
The bottom line:
- Models should be used only in the domain where they have been validated.
- More complex models require more assumptions and therefore induce more uncertainty into the original problem than simpler models.
- The more complex a domain (in a topological sense) that a model has to embrace, the easier it is to violate some basic laws with it, and the harder such violations are to spot.
- The most frequent and deadly malpractice when models are involved is the assumption of continuity. Nature is not "smooth and differentiable".
- The additional risk deriving from the fact that a surrogate of reality is used is rarely measured.
One more rule: the most important things in a model are those it doesn't contain.
Sunday, 10 November 2013
Traditional rating of corporations is based on the analysis of financial statements such as Balance Sheets, Cash Flow Statements and Income Statements. Traders, on the other hand, look at stock performance, analyze trends and make projections. Often there is no time to look at Balance Sheets, as things happen at Internet speed. While trading takes place a thousand times per second, traditional rating is performed once a year, when the Consolidated Balance Statement is published.
Recently, London-based Assetdyne has established a new form of Resilience Rating based exclusively on the performance of a company's stock - a radically innovative high-frequency rating mechanism.
Ontonix has recently come up with a new rating methodology which brings together both worlds - Integrated Resilience Rating. Basically, we blend quarterly financial statements with quarterly stock market performance. This is what it looks like in the form of an integrated Business Structure Map:
The first 13 nodes correspond to stock market-specific information; the remainder come from a Balance Statement. The Complexity Profile below - which provides a quantitative and natural ranking of the relevance of each parameter in the map - shows that, in the case in question, "Dividend" is as important as, for example, "Assets" - both have a footprint of approximately 7% on the Integrated Resilience Rating.
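The notion of a parameter's "footprint" can be sketched with a toy proxy: each node's share of the total coupling strength in the map. The real Complexity Profile uses Ontonix's proprietary measure, and the five parameters and coupling weights below are invented for illustration:

```python
import numpy as np

# Hypothetical coupling strengths between 5 parameters of a Business
# Structure Map (symmetric matrix, zero diagonal) - toy numbers only.
params = ["Dividend", "Assets", "Revenue", "Debt", "Cash Flow"]
W = np.array([
    [0.0, 0.6, 0.2, 0.3, 0.4],
    [0.6, 0.0, 0.4, 0.2, 0.3],
    [0.2, 0.4, 0.0, 0.5, 0.6],
    [0.3, 0.2, 0.5, 0.0, 0.4],
    [0.4, 0.3, 0.6, 0.4, 0.0],
])

# Footprint proxy: each parameter's share of total coupling, in percent.
footprint = 100 * W.sum(axis=1) / W.sum()
for name, f in sorted(zip(params, footprint), key=lambda p: -p[1]):
    print(f"{name:10s} {f:5.1f}%")
```

In this toy map "Dividend" and "Assets" end up with identical footprints, mirroring the kind of ranking the Complexity Profile produces (with only 5 parameters the percentages are of course much larger than the ~7% seen in a real 30-odd-node map).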
More examples of Integrated Resilience Ratings are available here.
Quarterly Integrated Resilience Rating is now offered to public companies on a subscription basis. For more information contact us.
Sunday, 3 November 2013
Which system of banks is more robust, the US one or the European one? A recent analysis performed by ASSETDYNE using stock market data reveals that the US system is far less vulnerable in case of contagion. Here are the results.
Top US Banks.
A 4-star rating (84%) points to a resilient situation.
Top EU Banks.
Again, a 4-star rating (85%) reveals high resilience. So why is the US system potentially less vulnerable? Look at the map densities. The US Complexity Map has a density of 32%; that of the EU banks is 76%, far more than double. This means that in case of financial contagion, the EU system of banks is far more vulnerable than the American one. This is also evident if one examines the size of the nodes in each map. In the case of the US, there are 3-4 dominant nodes (hubs), while in the EU almost all banks have an equal footprint. Of course, we're not talking about revenue - by footprint we mean the impact of each bank on the overall resilience of the system. The above results also mean that the EU system is far more difficult to reform than the US banking system. It is also much more complex (39.37 versus 25.28) - provided, of course, that one wants to reform it, and that the financial lobbies allow it.
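Map density itself is a simple graph-theoretic quantity: the fraction of possible links actually present. The sketch below contrasts a hub-dominated system with a fully interconnected one on toy adjacency matrices (the bank maps above are not reproduced here; the 6-node graphs are assumed examples):

```python
import numpy as np

def map_density(adj: np.ndarray) -> float:
    """Fraction of possible links present in a symmetric adjacency matrix."""
    n = adj.shape[0]
    return adj.sum() / (n * (n - 1))

n = 6
# Hub-dominated system: one dominant node connected to all others
star = np.zeros((n, n), dtype=int)
star[0, 1:] = star[1:, 0] = 1
# Densely coupled system: every node connected to every other
full = np.ones((n, n), dtype=int) - np.eye(n, dtype=int)

d_star, d_full = map_density(star), map_density(full)
print(f"hub-dominated density: {d_star:.0%}, fully coupled: {d_full:.0%}")
```

In the dense graph a shock at any node reaches every other node in one step, which is why, other things being equal, a denser map points to greater contagion vulnerability.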
From Wikipedia: "The Ulam spiral, or prime spiral (in other languages also called the Ulam Cloth) is a simple method of visualizing the prime numbers that reveals the apparent tendency of certain quadratic polynomials to generate unusually large numbers of primes. It was discovered by the mathematician Stanislaw Ulam in 1963..."
How complex is it? Here is the Complexity Map of the above image:
These are the corresponding complexity measures:
The high robustness of the image - 76.8% - means that its structure is relatively strong. It also means, for example, that the image may be de-focused and still transmit most of the information it contains.
The amount of information the image transmits is nearly 154 bits.
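The spiral itself is easy to generate, and a rough information measure can be attached to it. Ontonix's complexity and information figures come from its own algorithm; as an assumed stand-in, the sketch below builds a 101x101 Ulam spiral and computes the Shannon entropy of its pixel distribution:

```python
import numpy as np

def is_prime(n: int) -> bool:
    if n < 2:
        return False
    if n % 2 == 0:
        return n == 2
    f = 3
    while f * f <= n:
        if n % f == 0:
            return False
        f += 2
    return True

def ulam_spiral(size: int) -> np.ndarray:
    """Binary Ulam spiral: 1 marks a prime. `size` should be odd."""
    grid = np.zeros((size, size), dtype=np.uint8)
    x = y = size // 2                      # start at the centre with n = 1
    moves = [(1, 0), (0, -1), (-1, 0), (0, 1)]   # right, up, left, down
    n, step, d = 1, 1, 0
    while True:
        for _ in range(2):                 # two legs per step length
            dx, dy = moves[d % 4]
            for _ in range(step):
                if 0 <= x < size and 0 <= y < size:
                    grid[y, x] = is_prime(n)
                n += 1
                x += dx
                y += dy
            d += 1
        step += 1
        if n > size * size:
            return grid

grid = ulam_spiral(101)
p = grid.mean()                            # fraction of prime pixels
H = -(p * np.log2(p) + (1 - p) * np.log2(1 - p))   # bits per pixel
print(f"prime density: {p:.3f}, entropy: {H:.2f} bits/pixel")
```

About 12% of the pixels are prime, giving an entropy of roughly half a bit per pixel; this per-pixel Shannon figure is only an illustrative proxy, not the structural information measure quoted above.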