Thursday, October 09, 2008

Why Does an AAA Business Suddenly Fail? The Lehman Brothers Case.


Recent developments in the international financial arena show how conventional risk-rating and risk-management tools and techniques prove ineffective in a highly turbulent and globalized economy. The sudden default of AAA-rated companies has been observed on more than one occasion in the past few months. The cause of these failures may be attributed to a substantial increase in complexity without the management's knowledge. Excessive complexity can in fact "drown" an unprepared business; it is the ultimate source of risk. It is therefore mandatory to measure and manage complexity. The Lehman Brothers bank is an emblematic case. The bank did not know it was going to default, and yet rating agencies gave it a very high rating - below is the rating reported on their website (as of January 2008):

So why does a bank fail without early warning? The answer is hidden complexity. Complexity measures the amount of structured information in a system. In practice, it quantifies how intricate and noisy the business is; applied to accounting numbers, it directly measures how difficult the business is to run. Variables are interconnected (not always in a nice, linear fashion) and the interconnections are fuzzy, not crisp. Now, what happens when complexity increases and the accountants (and managers) don't know about it? They keep on using the same models (often based on a bell-curve perspective), which prove less and less effective. Their predictions become less credible, and it gets increasingly difficult to reach goals and make the numbers. As if that were not enough, banks trade extremely sophisticated derivative-based financial instruments whose tentacles reach out to the most remote corners of the global economy. Often, the bank doesn't know how much it owes and how much others owe it. Entropy (uncertainty) is everywhere, and no Black-Scholes model can get it right. The world is irrational and definitely not Gaussian. Our complexity metric captures all of this in a clean and natural fashion because the method we use is model-free. There are no assumptions to make, no exotic models to build. We extract the hidden complexity that sits in the real numbers reflecting the actual business.
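
This post does not spell out how OntoSpace actually computes complexity, entropy or robustness, so the Python sketch below is purely illustrative: a minimal, model-free proxy that rank-transforms a set of accounting variables, keeps the pairwise dependencies above a threshold, and weights each retained link by a binary-entropy term so that fuzzy links contribute more than crisp ones. The function name, the 0.6 threshold and the crude "robustness" formula are assumptions made for this example, not Ontonix's definitions.

import numpy as np

def complexity_proxy(data, threshold=0.6):
    # data: 2-D array, rows = periods (e.g. quarters), columns = accounting
    # variables (revenues, leverage, trading assets, ...).
    # Returns (complexity, mean_entropy, robustness) - illustrative proxies only.

    # Rank-transform each column so the measure is model-free:
    # no linearity or Gaussian assumption on the raw numbers.
    ranks = np.argsort(np.argsort(data, axis=0), axis=0).astype(float)

    # Spearman-style correlation matrix on the ranks.
    corr = np.corrcoef(ranks, rowvar=False)

    n = corr.shape[0]
    complexity, link_entropies = 0.0, []
    for i in range(n):
        for j in range(i + 1, n):
            c = abs(corr[i, j])
            if c < threshold:                      # weak links treated as noise
                continue
            # Binary-entropy weight: fuzzy links (c near the threshold) carry
            # more uncertainty than crisp ones (c near 1).
            p = min(max(c, 1e-12), 1.0 - 1e-12)
            h = -p * np.log2(p) - (1.0 - p) * np.log2(1.0 - p)
            link_entropies.append(h)
            complexity += h                        # structure weighted by uncertainty

    mean_entropy = float(np.mean(link_entropies)) if link_entropies else 0.0
    robustness = 1.0 - mean_entropy                # crude "state of health" stand-in
    return complexity, mean_entropy, robustness

# Tiny synthetic demo: 16 quarters, 6 variables, one deliberately strong but
# noisy dependency between the first two columns.
rng = np.random.default_rng(0)
X = rng.normal(size=(16, 6))
X[:, 1] = 0.9 * X[:, 0] + 0.2 * rng.normal(size=16)
print(complexity_proxy(X))

Running a proxy of this kind on successive fiscal years would, in spirit, produce trajectories like the ones discussed below; the actual OntoSpace maps are of course computed with our own methodology.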

Of course, the increase in complexity does have real causes - human behavior and habits, for example (see here). The point, however, is to provide corporations with an effective early-warning capability that really works. Our complexity-based system OntoSpace is a great way to start. We have used it (in a post-mortem fashion) on the sub-prime crisis and, recently, on the Lehman Brothers case. The results are striking. As expected, complexity and entropy rose quickly over the period 2004-2007, as illustrated in the two Complexity & Risk Maps below:

2004 Complexity & Risk Map

2007 Complexity & Risk Map

Over a period of four years, complexity more than doubles. Entropy nearly doubles, while robustness (the system's state of health) drops from 82% to 72%. The evolution of complexity, entropy and robustness is depicted below. The increase in entropy is dramatic: the system builds up "hidden energy" which it must dump, sooner or later. Conventional tools are unable to predict such phenomena.


Evidently, contemporary risk models are ineffective in a turbulent economy. The answer is not more sophisticated mathematics, which inevitably adds unnecessary uncertainty to an already uncertain situation; it is a matter of changing tools and methodologies. Because hidden complexity is the manager's main enemy, complexity must be measured and managed. In the near future, risk management will become complexity management. No doubt.


