Sunday 29 December 2013

NASDAQ 100 Resilience Rating Analysis


During 2014 Assetdyne will perform a Resilience Rating analysis of all the NASDAQ 100 stocks. The report will be offered free of charge and produced twice a month.

In addition, Portfolio Complexity Maps will be produced and made available for interactive navigation.

The first report is available here.

For more information on Assetdyne, visit the website.



www.assetdyne.com



Saturday 28 December 2013

Complexity and Battle Management


Modern battle scenarios involve a huge amount of data and information. According to Wikipedia:

"Network-centric warfare, also called network-centric operations or net-centric warfare, is a military doctrine or theory of war pioneered by the United States Department of Defense in the 1990s.
It seeks to translate an information advantage, enabled in part by information technology, into a competitive advantage through the robust networking of well informed geographically dispersed forces. This networking—combined with changes in technology, organization, processes, and people—may allow new forms of organizational behavior.
Specifically, the theory contains the following four tenets in its hypotheses:
  • A robustly networked force improves information sharing;
  • Information sharing enhances the quality of information and shared situational awareness;
  • Shared situational awareness enables collaboration and self-synchronization, and enhances sustainability and speed of command; and
  • These, in turn, dramatically increase mission effectiveness."
Now that complexity can be measured in real-time using the QCM engine OntoNet, we can take things to the next level: Complexity-Centric Warfare. The first step is to map the entire information flow obtained from a multitude of sensors onto a Complexity Map (before the enemy can trigger an EMP!). The map evidently changes in time as the battle evolves. The concept is illustrated below.
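The QCM engine itself is proprietary, but the idea of such a map can be sketched with ordinary tools. The fragment below is a minimal, hypothetical stand-in: it links sensor channels whose readings are strongly correlated, which is only a rough proxy for the dependence structure OntoNet actually extracts.

import numpy as np

def complexity_map(data, threshold=0.6):
    """Approximate a Complexity Map as an adjacency matrix.

    data      : (n_samples, n_channels) array of sensor readings
    threshold : minimum |correlation| for two channels to be linked

    This is a toy stand-in for the proprietary QCM engine: it simply
    links channels whose Pearson correlation is strong.
    """
    corr = np.corrcoef(data, rowvar=False)      # channel-to-channel correlations
    adjacency = (np.abs(corr) >= threshold).astype(int)
    np.fill_diagonal(adjacency, 0)              # no self-links
    return adjacency

# Example: 500 samples from 6 hypothetical battlefield sensor channels
rng = np.random.default_rng(0)
readings = rng.normal(size=(500, 6))
readings[:, 1] += 0.8 * readings[:, 0]          # induce one dependency
print(complexity_map(readings))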



Sensors gather data about all the forces involved in a particular scenario. The combined map, showing two opposing forces, is illustrated below (an extremely simple example). Experiments in Air Traffic Control conducted by Ontonix show that it is possible to track hundreds of airborne objects using radar, in real time. A Massively Parallel Processing version of OntoNet (currently under development) will make it possible to process thousands of objects.


Once the maps are established, two issues of tactical character become evident:

  • Concentrate firepower on enemy hubs. 
  • Protect your own hubs.
Hubs are easily identified once a Complexity Map is available. A more sophisticated target ranking approach is based on battle Complexity Profiling, which ranks the various actors based on their footprint on the entire scenario. Clearly, just as a Complexity Map changes with time, so will the Complexity Profile.
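How the footprint is computed inside OntoNet is not public; as a rough illustration, the sketch below ranks the nodes of a map by their share of all links, so the hubs surface at the top of the profile.

import numpy as np

def complexity_profile(adjacency, labels):
    """Rank the nodes of a Complexity Map by a simple footprint measure.

    The footprint of a node is taken here as its share of all links in the
    map (degree / total degree) - an assumption standing in for the
    proprietary Complexity Profiling measure.
    """
    degree = adjacency.sum(axis=1).astype(float)
    footprint = degree / degree.sum()
    return sorted(zip(labels, footprint), key=lambda item: -item[1])

labels = ["C2-node", "radar-1", "radar-2", "tank-bn", "uav-swarm"]   # hypothetical actors
adj = np.array([[0, 1, 1, 1, 1],
                [1, 0, 1, 0, 0],
                [1, 1, 0, 0, 0],
                [1, 0, 0, 0, 1],
                [1, 0, 0, 1, 0]])
for name, share in complexity_profile(adj, labels):
    print(f"{name:10s} footprint = {share:.0%}")      # the hub appears first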

And now to strategic issues. How does one manage a battle using complexity? Simple. Fast scenario simulation technology provides numerous options to choose from. And how do you choose between, say, two very similar options? You take the one with lower complexity. In other words, you try to steer the conflict in a way that reduces its complexity. The concept is illustrated below.
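In code, the decision rule is almost trivial; the sketch below assumes each simulated course of action already carries a QCM-style complexity value (the names and numbers are made up).

def pick_course_of_action(options):
    """Among courses of action with comparable expected outcomes,
    prefer the one with the lowest measured complexity."""
    return min(options, key=lambda opt: opt[1])

options = [("flanking manoeuvre", 41.2), ("frontal engagement", 44.0)]   # (name, complexity)
print(pick_course_of_action(options))    # -> the less complex of two similar options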



A less complex battle scenario is easier to manage. It is easier to comprehend. It allows for faster decision-making. It is easier to be efficient in a less complex situation than in a highly complex one. Finally, highly complex situations have the nasty habit of suddenly delivering surprising behavior, and at the worst possible moment. Sounds like one of Murphy's laws.




www.ontonix.com




Tuesday 24 December 2013

Amazing What Complexity Technology Can Do for Medicine




Even though so-called “complexity science” has been around for a few decades, it has failed to produce workable definitions and metrics of complexity. In fact, complexity is still seen today as a series of phenomena of un-orchestrated self-organization (e.g. swarms of starlings) and emergence, whose complexity is never actually measured. In early 2005 the first Quantitative Complexity Theory (QCT) was established by J. Marczyk. According to this theory, complexity is no longer seen as a process but as a new physical property of systems. Complexity, therefore, just like energy for example, is an attribute of every system. In nearly a decade, QCT has found numerous applications in diverse fields. One of them is medicine.

Because modern science lacks a holistic perspective, favouring super-specialization, a patient is rarely seen and treated as a multi-organ dynamic system of systems. Due to this cultural limitation and because of the overwhelming complexity of the human body, only on rare occasions is medical science quantitative...



Read the full White Paper.


Click here for an Interactive Complexity Map of an EEG.


www.ontomeds.com




Thursday 19 December 2013

How to Dismantle the Derivatives Time Bomb?



From an article on financial modeling:


"Modeling derivatives is of particular importance due to the relative size of the derivative market when compared to the real economy. If we examine the Bank of International Settlements (BIS) estimate of “Amounts outstanding of over-the-counter (OTC) derivatives” in December 2010, this amounted to US$601,046 billion. To the World Bank estimate of World Gross Domestic Product (GDP) for 2010, the volume of financial transactions in the global economy is 73.5 times higher than nominal world GDP.




In 1990, this ratio amounted to “only” 15.3. Transactions of stocks, bonds, and foreign exchange have expanded roughly in tandem with nominal world GDP. Hence, the overall increase in financial trading is exclusively due to the spectacular boom of the derivatives markets. In the final analysis, the mathematical modeling of this system of obligations is an imperative for the world economy’s well-being."

Basically, what this means is that derivatives have engulfed our economy and our very existence relies on trading robots, stochastic integrals and Brownian motion. A very fragile situation which nobody today is able to grasp or control. The system is running on autopilot and nobody knows how the autopilot works.

Now, we all know that one salient characteristic of derivatives is their high complexity. This is because they have deliberately been designed to be complex. There are numerous reasons why one would want to design a very complex financial product. One of them is to fool investors. However, there are very important implications deriving from the introduction of highly complex financial products into the global economy:

1. Highly complex products have highly complex dynamics which are difficult to capture with conventional mathematical methods. Monte Carlo, VaR, etc. are techniques that not only belong to the past; they have contributed significantly to the crisis. This means they cannot be used to find a cure. If smoking causes cancer, smoking more will not make it go away.

2. A product may be said to be complex, at a given moment in time, but, precisely because of the complex dynamics of derivatives, this complexity is never constant. It changes.

3. If a product is said to be complex, it means that someone must have measured its complexity. Otherwise, how can such a claim be sustained? Serious science starts when you begin to measure.

4. The biggest problem with derivatives is that of their rating. Since their real dynamics are essentially unknown (or deliberately masked), attempting to rate them is futile. This is where the Credit Rating Agencies failed when they triggered the financial meltdown. On the one hand they assigned investment-grade ratings to products which were known to be toxic; on the other hand, their outdated methods of rating were simply not applicable to super-complex financial products.

This brings us to the main point of this article. Our economy looks more or less like this:



and we need to fix it before the system collapses. As you read this short article, every minute billions are being traded in hyperspace and the pile in the picture is growing. What can be done? There is no simple recipe. However, what must be done is this:

1. Start to measure and monitor the real complexity of the financial products that are out there. There exist tools today to do this. One is here.

2. Classify, rank and rate the complexity and resilience of derivatives. Establish maximum allowable levels of complexity and minimum allowable resilience of financial products. Products with low resilience contribute to making the system (economy) more fragile.

3. Once the most complex (dangerous) derivatives have been identified, they should be withdrawn progressively from circulation. 
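As a purely illustrative sketch of steps 1-3, the fragment below flags products whose complexity exceeds an allowable ceiling or whose resilience falls below an allowable floor; the thresholds and the figures in the example are hypothetical, not measured values.

def screen_products(products, max_complexity=80.0, min_resilience=0.50):
    """Flag products exceeding an allowable complexity level or falling
    below an allowable resilience level (illustrative thresholds only)."""
    return {name: (c, r) for name, (c, r) in products.items()
            if c > max_complexity or r < min_resilience}

products = {
    "plain-vanilla swap": (22.0, 0.81),     # (complexity, resilience) - made-up figures
    "synthetic CDO-squared": (95.5, 0.31),
}
print(screen_products(products))            # -> the CDO-squared is flagged for withdrawal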


More soon.



www.assetdyne.com














Tuesday 17 December 2013

Superior-Performance Portfolio Design via Complexity Theory


Assetdyne LLC is a privately held company founded in 2013. Assetdyne has developed the Complexity Portfolio Theory (CPT) and offers an exclusive and sophisticated system which measures the complexity and resilience of stocks and stock portfolios and which introduces the concept of complexity to portfolio theory and design.

While conventional portfolio design often follows Modern Portfolio Theory (MPT), which identifies optimal portfolios via minimization of the total portfolio variance, the technique developed by Assetdyne designs portfolios based on the minimization of portfolio complexity. The approach rests on the fact that excessively complex systems are inherently fragile.
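The exact complexity measure behind the Complexity Portfolio Theory is not published; the sketch below is only a conceptual analogue, minimizing a crude proxy for portfolio complexity (the weighted sum of absolute pairwise correlations) instead of the variance minimized by MPT.

import numpy as np
from scipy.optimize import minimize

def min_complexity_weights(returns):
    """Toy 'minimum-complexity' portfolio.

    Minimizes w' |C| w, where |C| is the matrix of absolute pairwise
    correlations with a zeroed diagonal - a crude proxy for portfolio
    complexity, not Assetdyne's actual CPT measure.
    """
    corr = np.abs(np.corrcoef(returns, rowvar=False))
    np.fill_diagonal(corr, 0.0)
    n = corr.shape[0]
    res = minimize(lambda w: w @ corr @ w,
                   np.full(n, 1.0 / n),
                   bounds=[(0.0, 1.0)] * n,
                   constraints=({"type": "eq", "fun": lambda w: w.sum() - 1.0},))
    return res.x

rng = np.random.default_rng(1)
rets = rng.normal(0.0, 0.01, size=(250, 4))     # hypothetical daily returns of 4 stocks
print(np.round(min_complexity_weights(rets), 3))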


See how low-complexity portfolios perform better.

Download the full presentation here.



www.assetdyne.com



Friday 13 December 2013

First Complexity-based Portfolio Design System



Assetdyne introduces Quantitative Complexity Science to portfolio analysis and design. Disruptive innovation in finance in its purest form. Check out the new website.


See interactive examples of stock portfolios.




Monday 9 December 2013

Some Financial Products Are Said to be Complex. But How Complex is That?





Assetdyne offers an on-line tool which allows users to measure the complexity and resilience of a single security or a portfolio. The tool is connected in real-time to US markets and allows users to monitor any security listed there.

The tool allows investors to answer the following questions:

  • How complex is a portfolio? 
  • How complex is a financial product, such as a derivative?
  • What is the maximum complexity a portfolio can reach?
  • How resilient is it? How well can it resist market turbulence?
  • What does the portfolio structure look like?
  • How interdependent are the stocks composing the portfolio?
  • Which stocks actually dominate portfolio dynamics?
  • How well can the dynamics of the portfolio be predicted?
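A few of these questions can be sketched in code from nothing more than daily returns. The fragment below is a toy illustration in which interdependence is read as the density of a correlation graph and the "dominant" stocks are those with the largest total absolute correlation; both are assumptions standing in for Assetdyne's proprietary measures.

import numpy as np

def portfolio_snapshot(returns, tickers, threshold=0.5):
    """Toy answers to two of the questions above, from daily returns alone."""
    corr = np.corrcoef(returns, rowvar=False)
    n = corr.shape[0]
    links = (np.abs(corr) >= threshold) & ~np.eye(n, dtype=bool)
    density = links.sum() / (n * (n - 1))              # share of possible links
    dominance = np.abs(corr).sum(axis=1) - 1.0         # exclude self-correlation
    ranking = [tickers[i] for i in np.argsort(-dominance)]
    return density, ranking

rng = np.random.default_rng(2)
rets = rng.normal(0.0, 0.01, size=(250, 3))            # hypothetical returns, 3 tickers
print(portfolio_snapshot(rets, ["AAA", "BBB", "CCC"]))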

See examples of portfolios (and their complexities) - click on a portfolio to open an interactive Portfolio Complexity Map:

Automotive

Steel

Gold Mining

Oil & gas

Pharmaceutical

IT Industry

EU Banks

US banks

Dow Jones



www.assetdynex.com




Tuesday 3 December 2013

OntoMed Launches OntoNet-CARDIO for Real-Time Anticipation of Tachycardias and Fibrillations


OntoMed, a privately-owned company developing leading-edge complexity-based technology and solutions for applications in medicine, releases OntoNet-Cardio, an advanced algorithm which processes EGMs (electrograms), providing early warnings of imminent tachycardia and/or fibrillation. "The algorithm does not detect tachycardias or fibrillations; it anticipates them by identifying pre-event conditions as early as 15-20 seconds before they actually happen," said Dr. J. Marczyk, the CEO of OntoMed. "The principle on which OntoNet-Cardio functions has been verified in a multitude of fields and applications and is based on the sudden variations of complexity which precede traumatic events," he added. "We are open to partnerships and collaborations with ICD/pacemaker manufacturers who are interested in incorporating our technology in their products," he concluded.
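The actual OntoNet-Cardio measure is proprietary; the sketch below only illustrates the general principle of watching a windowed complexity proxy (here, the Shannon entropy of each window's amplitude histogram) and raising a warning when it jumps suddenly. Sampling rate, window length and threshold are all assumed values.

import numpy as np

def early_warnings(signal, fs, window_s=5.0, jump=0.30):
    """Flag times at which a windowed complexity proxy rises abruptly.

    A warning fires when the entropy of a window's amplitude histogram
    exceeds the previous window's value by more than `jump` (relative).
    This illustrates the principle, not the OntoNet-Cardio algorithm.
    """
    w = int(window_s * fs)
    edges = np.histogram_bin_edges(signal, bins=32)     # common bins for all windows
    warnings, prev = [], None
    for start in range(0, len(signal) - w + 1, w):
        hist, _ = np.histogram(signal[start:start + w], bins=edges)
        p = hist[hist > 0] / hist.sum()
        entropy = -(p * np.log2(p)).sum()               # complexity proxy for this window
        if prev is not None and entropy > prev * (1.0 + jump):
            warnings.append(start / fs)                 # time (s) of the suspected pre-event state
        prev = entropy
    return warnings

fs = 250                                                # assumed sampling rate (Hz)
t = np.arange(0, 60, 1 / fs)
egm = np.sin(2 * np.pi * 1.3 * t)                       # synthetic, regular rhythm
egm[int(40 * fs):] += np.random.default_rng(3).normal(0.0, 0.8, size=len(t) - int(40 * fs))
print(early_warnings(egm, fs))                          # -> warning near t = 40 s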

The image below illustrates how the system functions.



Read full Press Release here.



www.ontomeds.com



Monday 25 November 2013

Modeling Risk and Model-Induced Risk


Models are simplified representations or emulators of reality. They are typically based on empirical data collected by conducting experiments. There exist numerous techniques for establishing models based on raw field data. An example of a linear model, built from 2-dimensional data, is illustrated below.


The model in this simple case is a straight line. It may be used, for example, to compute the value of one variable when the other is given. This is particularly useful for values which fall between the raw data points used to build the model.

A more interesting case is illustrated below. Here too the data lie on a straight line. However, there is a void between the two groups of data points. This poses a problem. In cases such as this one a single model is built, passing through both groups of points. In other words, a most dangerous assumption is made: that of continuity. This is a very common mistake. The mere fact that all the points in question lie on a straight line does not guarantee that in the void between the two groups the line still constitutes a valid model.





The mistake often proves fatal, all the more so when the domain one wishes a particular model to embrace is more articulated. In the above case, there should be two local models, not one.
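A minimal sketch of the idea, assuming the void can be detected from the spacing of the data alone: fit one line per cluster and record the domain on which each local model is valid, instead of forcing a single line across the gap.

import numpy as np

def fit_local_models(x, y, gap_factor=3.0):
    """Fit separate linear models to clusters of points separated by a void.

    A void is declared where the spacing between consecutive (sorted) x values
    exceeds `gap_factor` times the median spacing - a simple heuristic, not a
    general-purpose clustering method.
    """
    order = np.argsort(x)
    xs, ys = x[order], y[order]
    cuts = np.where(np.diff(xs) > gap_factor * np.median(np.diff(xs)))[0] + 1
    models = []
    for seg_x, seg_y in zip(np.split(xs, cuts), np.split(ys, cuts)):
        slope, intercept = np.polyfit(seg_x, seg_y, 1)
        models.append((seg_x.min(), seg_x.max(), slope, intercept))   # validity domain + line
    return models

x = np.concatenate([np.linspace(0, 1, 20), np.linspace(4, 5, 20)])    # two groups, a void between
y = 2.0 * x + 1.0 + np.random.default_rng(4).normal(0.0, 0.05, x.size)
for lo, hi, a, b in fit_local_models(x, y):
    print(f"valid on [{lo:.1f}, {hi:.1f}]:  y = {a:.2f} x + {b:.2f}")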

Models should be used with caution and ONLY in the domains where they have been validated. Extending the usage of a model beyond such a domain has consequences which grow with the model's complexity. Moreover, when a model's applicability is stretched, the effects are very seldom actually measured. The equations might still work, but that does not guarantee that the model isn't violating some basic laws or rules.

The bottom line:

  1. Models should be used only in the domain where they have been validated.
  2. More complex models require more assumptions and therefore induce more uncertainty into the original problem than simpler models.
  3. The more complex a domain (in a topological sense) that a model has to embrace, the easier it is to violate some basic laws with it, and the more difficult it is to spot the violations.
  4. The most frequent and deadly malpractice when models are involved is the assumption of continuity. Nature is not "smooth and differentiable".
  5. The additional risk deriving from the fact that a surrogate of reality is used is rarely measured.

One more rule: the most important things in a model are those it doesn't contain.




www.ontonix.com





Sunday 10 November 2013

Rating a Company Based on Its Balance Sheet AND Stock Markets


Traditional rating of corporations is based on the analysis of financial statements such as Balance Sheets, Cash Flows and Income Statements. Traders, on the other hand, look at stock performance, analyze trends and make projections. Often there is no time to look at Balance Sheets, as things happen at Internet speeds. While trading takes place a thousand times per second, traditional rating is performed once a year, when the Consolidated Balance Statement is published.

Recently, London-based Assetdyne has established a new form of Resilience Rating based exclusively on the performance of a company's stock - a radically innovative high-frequency rating mechanism.

Ontonix has recently come up with a new rating methodology which brings together both worlds - Integrated Resilience Rating. Basically, we blend quarterly financial statements with quarterly stock market performance. This is what it looks like in the form of an integrated Business Structure Map:


The first 13 nodes correspond to stock market-specific information. The remainder come from a Balance Statement. The Complexity Profile below - which provides a quantitative and natural ranking of the relevance of each parameter in the map - shows that, in the case in question, "Dividend" is as important as, for example, "Assets" - both have a footprint of approximately 7% on the Integrated Resilience Rating.
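A toy version of the blending step: stack market-derived and balance-sheet time series into a single data matrix and rank each parameter by its footprint on the combined map. Here the footprint is each variable's share of total absolute correlation, a stand-in for the Complexity Profile produced by OntoNet; all series in the example are random placeholders.

import numpy as np

def integrated_footprints(market_series, balance_series, names):
    """Blend market and balance-sheet series into one matrix and rank each
    parameter by a simple footprint (share of total absolute correlation)."""
    data = np.column_stack(market_series + balance_series)     # quarters x parameters
    corr = np.abs(np.corrcoef(data, rowvar=False))
    np.fill_diagonal(corr, 0.0)
    footprint = corr.sum(axis=1)
    footprint /= footprint.sum()
    return sorted(zip(names, footprint), key=lambda item: -item[1])

rng = np.random.default_rng(5)
quarters = 12
market = [rng.normal(size=quarters) for _ in range(2)]          # e.g. price, volatility
balance = [rng.normal(size=quarters) for _ in range(3)]         # e.g. assets, debt, dividend
names = ["price", "volatility", "assets", "debt", "dividend"]
for name, share in integrated_footprints(market, balance, names):
    print(f"{name:10s} {share:.0%}")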




More examples of Integrated Resilience Ratings are available here.


Quarterly Integrated Resilience Rating is now offered to public companies on a subscription basis. For more information contact us.









Sunday 3 November 2013

Top EU Versus US Banks and Which System is More Vulnerable.


Which system of banks is more robust, that of the US or the European one? A recent analysis performed by Assetdyne using stock market data reveals that the US system is far less vulnerable in case of contagion. Here are the results.

Top US Banks.


A 4-star rating (84%) points to a resilient situation.

Top EU Banks.


Again, a 4-star rating (85%) reveals high resilience. So why is the US system potentially less vulnerable? Look at the map densities. The US Complexity Map has a density of 32%; that of the EU banks is 76%, far more than double. This means that in case of financial contagion, the EU system of banks is far more vulnerable than the American one. This is also evident if one examines the size of the nodes in each map. In the case of the US there are 3-4 dominant nodes (hubs), while in the EU almost all banks have an equal footprint. Of course, we're not talking of revenue - by footprint we mean the impact of each bank on the overall resilience of the system. The above results also mean that the EU system is far more difficult to reform than the US banking system. It is also much more complex (39.37 versus 25.28). Provided, of course, that one wants to reform it, and that the financial lobbies allow it.
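Map density itself is easy to reproduce: it is the fraction of possible links that are actually present. The sketch below computes it for two toy 5-bank maps, one hub-dominated and one in which every bank is coupled to every other; the maps are invented, not the ones analysed above.

import numpy as np

def map_density(adjacency):
    """Density of a Complexity Map: actual links divided by possible links."""
    n = adjacency.shape[0]
    links = np.count_nonzero(np.triu(adjacency, k=1))
    return links / (n * (n - 1) / 2)

hub_dominated = np.array([[0, 1, 1, 1, 1],          # one hub linked to everyone
                          [1, 0, 0, 0, 0],
                          [1, 0, 0, 0, 0],
                          [1, 0, 0, 0, 0],
                          [1, 0, 0, 0, 0]])
fully_coupled = np.ones((5, 5), dtype=int) - np.eye(5, dtype=int)
print(map_density(hub_dominated), map_density(fully_coupled))   # 0.4 vs 1.0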



www.assetdyne.com



How Complex is an Ulam Spiral?


From Wikipedia: "The Ulam spiral, or prime spiral (in other languages also called the Ulam Cloth) is a simple method of visualizing the prime numbers that reveals the apparent tendency of certain quadratic polynomials to generate unusually large numbers of primes. It was discovered by the mathematician Stanislaw Ulam in 1963..."
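Generating the spiral itself is straightforward; the sketch below lays the integers out in a square spiral from the centre and marks the primes, which is enough to reproduce the kind of image whose complexity is measured below.

import numpy as np

def ulam_spiral(size):
    """Return a size x size array with 1 where the spiral cell holds a prime.

    Integers 1, 2, 3, ... are laid out in a square spiral from the centre
    (step lengths 1, 1, 2, 2, 3, 3, ... with a 90-degree turn after each),
    then the primes are marked. Works best with an odd `size`.
    """
    def is_prime(n):
        if n < 2:
            return False
        return all(n % d for d in range(2, int(n ** 0.5) + 1))

    grid = np.zeros((size, size), dtype=int)
    x = y = size // 2                 # start at the centre
    dx, dy = 1, 0                     # first step to the right
    n, step = 1, 1
    while 0 <= x < size and 0 <= y < size:
        for _ in range(2):            # two sides share the same step length
            for _ in range(step):
                if 0 <= x < size and 0 <= y < size:
                    grid[y, x] = int(is_prime(n))
                n += 1
                x, y = x + dx, y + dy
            dx, dy = -dy, dx          # turn 90 degrees
        step += 1
    return grid

print(ulam_spiral(11))                # 1s mark primes along the spiral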

How complex is it? Here is the Complexity Map of the above image:


These are the corresponding complexity measures:





The high robustness of the image - 76.8% - means that its structure is relatively strong. This means, for example, that the image may be de-focused and still transmit most of the information it contains.

The amount of information the image transmits is nearly 154 bits.



www.ontonix.com



Monday 21 October 2013

Resilience Ratings of 28 EU Member States Q2/2013



The updated EU28 Resilience Ratings are now available. Just click on the corresponding icon and navigate the interactive Business Structure Map.

The Robustness (Resilience) ranking is as follows:





Data source: Eurostat.


Resilience Ratings performed using www.rate-a-business.com



www.ontonix.com