PRACTICAL COMPLEXITY MANAGEMENT, Part II

Jacek Marczyk, PhD

ONTONIX PUBLICATIONS
First published in 2012
ISBN 978-88-97260-06-6

Copyright © 2012 Jacek Marczyk. No part of this publication may be reproduced, stored in a retrieval system, or transmitted in any form or by any means without the prior written permission of the author. Write to [email protected] for further information or visit www.ontonix.com
ABOUT THE AUTHOR

Dr. Marczyk has thirty years of experience in large-scale simulation in diverse sectors of industry. He holds an MS in Aeronautics Engineering (Politecnico di Milano), an MS in Aerospace Engineering (Politecnico di Torino) and a Ph.D. in Civil Engineering (Universidad Politecnica de Catalunya). He has pioneered innovative methodologies for uncertainty and complexity management, publishing seven books on stochastics, nonlinear mechanics and complexity management. In 2005 he founded Ontonix, the world's first company to develop and commercialize a software system for quantitative complexity management. In 2009 he founded US-based OntoMed, a company which uses complexity technology to measure the fragility and stability of hospitalized patients. Holder of a US patent, Dr. Marczyk has developed award-winning software tools such as ST-ORM, MSC Robust Design, OntoSpace and OntoTest. During his career he has worked for EADS/CASA, BMW AG, Centric Engineering Systems, ESI, Silicon Graphics, Tecnomare, EASI, and MSC Software.
ABOUT THE BOOK

"Practical Complexity Management, Part II" is an inter-disciplinary account of how a recently developed concept of complexity may be used in diverse fields, ranging from economics to engineering, from social sciences to medicine. The text is unique in that it provides numerous and concrete examples of actually measuring and rationally managing complexity for generic systems. The concepts conveyed in the book have been developed by Ontonix, a company pioneering a quantitative approach to complexity and its effective management. The book is a collection of blogs published on the corporate website www.ontonix.com.
CONTENTS

INTRODUCTION
1. Global Warming and the Meltdown of the Economy
2. The Present and Future of the Concept and Value of Ratings
3. Complexity: The Fifth Dimension
4. A New Theory of Risk
5. If You Really Need to Optimize ....
6. "Don't Cross a River Because it is On Average 4 feet Deep"
7. Complexity: The Fifth Dimension (II)
8. Stability, Not Growth
9. The Crisis Will End in 2010. The FED Says.
10. Beyond the Concept of Risk. New Paradigms in Turbulent Times
11. Correlation, Regression and how to Destroy Information
12. Toxic Financial Assets are Inevitable in a Complex Economy
13. TBTF - Too Big To Fail
14. More on Correlations and Causality
15. How Nature Works. Patterns, Not Details
16. How do You Measure the Impact of a Company on a Market?
17. On Critical Complexity
18. Do We Really Understand Nature?
19. Measuring the State-of-Health of Corporations - On New Rating Schemes
20. The Five Dimensions of Business
21. The Corporate Complexity Profile - Closing the Loop on Complexity
22. Rating Markets and Economies
23. Stress Testing in Real-Time Gross Settlement (RTGS)
24. Stress-Testing and Crisis Anticipation Using Complexity
25. Recovering from the Crisis with Conventional Techniques. An Oxymoron?
26. Eco-friendly Products and Product Complexity
27. Entropy and the Weather
28. Is it Progress if a Cannibal Uses Knife and Fork?
29. Survival is the Only Measure of Success
30. Recovery? The Economy Will Never Be the Same Again
31. Another Lehman Collapse?
32. Man-made Crises Outpacing Our Ability to Deal With Them
33. A Second Sub-prime Bubble Approaching?
34. Rating Stratification: How Many Rating Classes?
35. Is Risk Management a Source of Risk?
36. The True Source of Risk
37. Lies, Damn Lies, Statistics & Risk Management
38. A Risk Management Strategy Cannot be Verified
39. Are Risk Strategies Verifiable?
40. The Principle of Fragility
41. Turbulence, the New World Order
42. Measuring the Magnitude of a Crisis
43. The Present and Future of Risk Management and Rating
44. How to Make the Global Economy Less Fragile
45. New book: "Interdependency"
46. Will Making the Supply Chain Leaner Actually Reduce Complexity?
47. Taking a Holistic Look at Markets, Funds and Portfolios
48. How do You Rate the Structure of a Business?
49. Why is Sensitivity Analysis Dangerous?
50. Turbulence and Economic Cycles. An Oxymoron?
51. Credit Rating Agencies Under Fire
52. The Enemy is Not Powerpoint. It is Complexity
53. What is the Invoice Price of a AAA rating?
54. How Fragile is the Structure of Euroland's Economy?
55. Rating the Ratings
56. Democratizing Ratings
57. Is it Possible to Make Predictions?
58. High Complexity and Fragility
59. Could Euroland's Economic Woes Have Been Anticipated?
60. Too Big to Fail? No, Too Complex to Survive.
61. When Has the 2007 Crisis Really Begun?
62. Which EU Countries Are the Most Complex?
63. Measuring Intangibles. An Oxymoron? Not Really.
64. Huge Projects: Too Big To Fail or Too Complex To Succeed?
65. Optimal Does NOT Mean Best
66. The Fragility of the UE Economy Rated VERY HIGH
67. Will the Euro Survive? Will the EU Survive?
68. Complexity: A Link Between Science and Art?
69. A Global First - Web-based Self-Rating For Businesses
70. Beyond Pre-Crisis Analytics
71. Making Predictions Based on Murphy's Laws
72. Complexity - A Meta-KPI
73. How Robust is the Current Geopolitical Scenario?
74. Complexity - A Critical Success Factor in Running a Business
75. Is the EU Economy Running At Two Speeds?
76. More Trouble for Greece?
77. Driving Complexity: Application to Portfolio Design
78. Measuring Improvement & Benchmarking Images
79. Benchmarking Air Traffic
80. How Do You Cook a Global Economic Crisis?
81. An Objective Look at the EU Economy. What Are Rating Agencies Up To?
82. Is the Chinese Economy Really So Healthy?
83. Can Simulation Technology Really Help Companies Do Business?
84. The World is Becoming Not Only More Complex But Also Significantly More Uncertain
85. Are Complex Businesses More Fragile?
86. Is This Really an Economic Crisis?
87. US Downgraded by ONE Agency Out of Three: What Does This Say About Ratings?
88. Traditional Ratings, Traditional Risk Management: Is it Over?
89. What is Critical Complexity?
90. Is the Economic Crisis Turning Into a Social Crisis?
91. Can Increasingly Complex Systems Be Engineered Without Taking Complexity Into Account?
92. How Robust is the Structure of the EU Economy Today?
93. What Is The Relevance of Probability?
94. Reality is Made of Loops But All We See Is Straight Lines
95. Ratings - From an Opinion to Science
96. Rating the Rating Agencies - We've Rated Moody's
97. Democratizing Ratings
98. Keynes vs Friedman - What is the Mix Today?
99. In Math We Trust. That is Precisely the Problem.
100. A Structured Look at Cellular Automatons
101. EU Resilience Rated by Ontonix at "Very Low" in November 2011
102. When Will the Euro Collapse? Around Q3 2013
103. The EU Running At Two Speeds?
104. The Collapse of the Euro: Conspiracy or Bad Design?
105. Complexity? It's All Relative
106. In a Global Economy Meltdown Can There be AAA-rated Countries?
107. Can Simulation Technology Really Help Companies Do Business?
108. What is the Real Cause of Europe's Troubles?
109. Identifying Hidden Structure (in Data) and Computing Knowledge
110. How To Avoid the Next (Worse) Crisis. Democratize Ratings
111. Should Europe Have Its Own Rating Agency?
112. The Dynamic Properties of Complexity and Business Resilience
113. Does the US Deserve the Downgrade? Is It Really a AAA Country?
114. How to Check If a Company Is Investment-Grade
115. When Will the USA Collapse? Around 2018
116. Does a Nobel in Economics Still Make Sense?
117. The EU and US Are Headed Towards Collapse. And China, Russia?
118. How Much Globalization Can The World Afford?
119. Conventional Ratings Versus Resilience Rating
120. The Fragmentation (and Crisis) of Business Analytics
121. High Complexity. Is it Good or Bad?
122. Black Tuesday (and Friday) in EU Stock Markets and the Ontix Meta-Index
123. Things That Cannot Be Modelled Shouldn't be Modelled
124. Measuring the Impact of Employees On the Balance Sheet
125. Measuring Stress and Alignment In Your Organization
126. A Closer Look at the PIIGS
127. Why Risk Management Needs a Big Overhaul
128. PIIGS or PIGS?
129. Beyond Risk Management. What Are the Alternatives?
130. High-frequency Rating: An Early-Warning System for a Turbulent Economy
131. High-Frequency Rating: Measuring Resilience in Real-Time
132. Germany and Greece - Different Economies, Same Resilience
133. What Future For The Euro And For Europe?
134. Coping With Turbulence: More Theories and Math?
135. Probability of Default Versus the Principle of Incompatibility
136. Does Optimal Mean "Best"? Not Really
137. The US Resilience Rating At 1-Star
INTRODUCTION

Since the publication of Part I of Practical Complexity Management in 2009, much has changed in the international arena. The world, as a system, is becoming dangerously complex and increasingly uncertain, and the solution to the global economic crisis is still far from imminent. However, even though complexity and complexity management are becoming more and more fashionable, complexity is still being treated as something elusive, difficult to capture, impossible to measure and even to define. A multitude of consulting firms offer "complexity management" services without actually measuring it. How they do it is a mystery. It is like going on a diet without being able to measure your weight in order to verify the diet's effectiveness. And yet, serious science starts when you begin to measure.

Ontonix is the first, and today the only, firm offering technology and solutions for Quantitative Complexity Management. The company has developed a comprehensive suite of software products and services which actually measure complexity, identifying its sources and providing quantitative solutions for its control and rational management. Our approach is based on the recognition of complexity as an attribute of any system. For us, complexity is not a phenomenon that takes place spontaneously on the edge of chaos. It is precisely that approach to complexity, professed by so-called "complexity science", that makes it impossible to even define it, measure it or, for that matter, turn it into a profitable business. Numerous complexity centers around the world practice "complexity science" and speak of a "complexity theory" without ever having produced a measure thereof. This is most surprising given that complexity has become the major limiting factor of sustainable development and, at the same time, a formidable source of fragility of the global economy and of society. Its incorporation in management and governance is a must if we are to look to the future with optimism.

Conventional techniques of risk assessment and management, business intelligence and analytics in general prove inefficient in a turbulent regime, as reflected so eloquently in the current economic meltdown. Quantitative Complexity Management, on the other hand, is a ground-breaking and superior form of Business Intelligence which is particularly suited to uncertainty-dominated ecosystems in which extreme events are the rule rather than the exception. QCM technology becomes a new platform on which to build radically innovative approaches and solutions to a wide variety of problems, ranging from corporate strategy to the resilience rating of corporations, from business robustification to asset portfolio design or IT system management.

The scope of this book, which is a collection of blogs from our corporate website – www.ontonix.com/Blog – is to provide a provocative overview of the numerous applications QCM technology finds in a wide range of fields. The blog compilation spans the period 2009 – 2012.
Como, August 2012.
1. Global Warming and the Meltdown of the Economy
Finance gurus and economists were unable to predict the meltdown of the economy. It is obviously extremely difficult to predict that a single company will default, especially if it wants to hide the truth. However, in the case of the current financial and economic crisis we are talking of a devastating phenomenon of planetary proportions. We are talking of the entire global economy. Much has already been written about the multi-faceted and concurrent causes of the meltdown and we do not wish to add more argumentation to the discussion. The point is that banks, financial institutions, traders, brokers, economists, think-tanks, research centers, as well as academia, had all the available instruments, the brains, the risk models, the supercomputers, and yet nobody was able to hint at what was coming, when it was coming and what its proportions would be. Not to mention the consequences.

Let's draw a parallel. We are being told (mainly by actors and politicians) that global warming is actually taking place and that the irreversible consequences will be devastating. Can we believe this to be true? There certainly are many analogies between a global economy and a biosphere (of which the climate is one part) in constant evolution. A biosphere goes through periods of calm but, in general, it is punctuated by unique and non-repeatable catastrophic events, such as extinction spasms, earthquakes, tsunamis, floods or volcanic eruptions, none of which is, of course, predictable. The same may be said of the economy. It too suffers recessions, depressions, wars (on a local or global scale), crises, scandals, etc. None of these are predictable. In effect, in turbulent environments, governed by chaos and full of "vortices" (in a general meaning of the word), predictability doesn't exist. No model will ever be able to go beyond the capture of basic patterns. We understand the climate to a high degree, but we cannot predict whether it will rain tomorrow or not. It is one thing to understand the seasons; it is another to predict the weather. In economics one can make very similar statements. We understand the dynamics of global trade, finance or the stock markets, but nobody was able to predict the collapse of a giant such as Lehman Brothers. It is nobody's fault that similar events cannot be captured by statistical models. It is, however, very dishonest to say that mathematical (exotic) models can predict them.

So, the bottom line is: a turbulent atmosphere seems to share many analogies with a turbulent economy. Having drawn the parallel, one may ask, at this point, the following question: if humanity, with all its resources, has been unable to predict the coming of this huge financial and economic crisis, can anyone seriously claim that the current global warming (which remains to be verified) is human-induced? In other words:
- We are still not sure if there is on-going global warming.
- In case there is, are we sure it is human-induced?
In the case of global warming, there are at least two camps: those who support the idea of global warming and those who don't. In the case of the economic meltdown, practically nobody hinted at the crisis. There was essentially only one camp: all of us. So now we get climate gurus talking of global warming.

The driving idea of this short article is not to favour or to dismiss the idea of global warming. The point is to draw attention to how risky the usage of models is, especially in realms of instability, chaos and uncertainty (it is important to remark that chaos is a deterministic phenomenon and has nothing to do with randomness). An interesting article on global warming may be found here. The main points of the article are as follows (we report bullets 1, 2, 3 and 7):

"1. Contrary to the popular belief that glaciers all over the world are melting, in some places they are actually GROWING. In Iceland and Greenland, the first half of the twentieth century was warmer than the second half. In Iceland, most glaciers lost mass after 1930 because temperatures temporarily rose by .6 degrees Celsius. But since then the climate has gotten colder, and since 1970 the glaciers have been growing. Including eleven glaciers which are surging in size.

2. Contrary to popular belief, Antarctica is NOT melting. Only the Antarctic peninsula (a relatively small portion of the continent) is melting, but the continent as a whole is getting colder and the ice is growing thicker. In fact:

a. From 1986 to 2000 central Antarctic valleys cooled .7 degrees Celsius per decade with serious ecosystem damage from the cold. (Doran, P.T., Priscu, J.C., Lyons, W.B., Walsh, J.E., Fountain, A.G., McKnight, D.M., Moorhead, D.L., Virginia, R.A., Wall, D.H., Clow, G.D., Fritsen, C.H., Mckay, C.P. and Parson, A.N., 2002, "Antarctic climate cooling and terrestrial ecosystem response," Nature, 415: 517-20).

b. Side-looking radar measurements show West Antarctica ice is increasing at 26.8 gigatons/yr, reversing the melting trend of the last 6000 years. (Joughlin, I., and Tulaczyk, S., 2002, "Positive mass balance of the Ross Ice Streams, West Antarctica," Science 295: 476-80).

c. Antarctic sea ice has increased since 1979. (Liu, J., Curry, J.A., and Martinson, D.G., 2004, "Interpretation of recent Antarctic sea ice variability," Geophysical Research Letters 31: 10.1029/2003 GLO18732).

d. The greater part of Antarctica experiences a longer sea-ice season, lasting 21 days longer than it did in 1979. (Parkinson, C.L., 2002, "Trends in the length of the Southern Ocean sea-ice season, 1979-99," Annals of Glaciology 34: 435-40).

3. The arrival of global warming was announced dramatically in 1988 by James Hansen, a prominent climatologist. He predicted temperatures would rise by .35 degrees Celsius over the next ten years. The actual increase was .11 degrees Celsius (that's less than 1/10 of a degree folks). After ten years Hansen claimed that the forces which govern climate changes are so poorly understood that long-term prediction is impossible. Quote, "The forcings that drive long-term climate change are not known with an accuracy sufficient to define future climate change." His prediction was off by over 300 PERCENT, proving that scientists don't know what they're talking about when it comes to predictions in this field. (James E. Hansen, Makiko Sato, Andrew Lacis, Reto Ruedy, Ina Tegen, and Elaine Matthews, "Climate Forcings in the Industrial Era," Proceedings of the National Academy of Sciences 95 [October 1998]: 12753-58).

7. There are around 160,000 glaciers in the world. Only about 67,000 have been inventoried, and only a few studied with any care. There is mass balance data extending five years or more for ONLY 79 GLACIERS IN THE ENTIRE WORLD. No one knows whether they're all melting, or if even most of them are. (H. Kieffer, et al., 2000, "New eyes in the sky measure glaciers and ice sheets," EOS, Transactions, American Geophysical Union 81: 265, 270-71. Also R.J. Braithwaite and Y. Zhang, "Relationships between interannual variability of glacier mass balance and climate," Journal of Glaciology 45 [2000]: 456-62)."

In the case of climate change, there are two camps. Eventually, some will be right, some will not. As far as the pre-meltdown economy was concerned, nobody really got it right. We were all very wrong. What is the conclusion, then, that one could draw from this analogy? In our view, the situation, as far as our understanding of the economy is concerned, is quite discouraging. When, in society, there is only one camp, the chance that all lose is very high. When there are two or more camps, there will ultimately be losers but there will also be winners.

In a future impregnated by uncertainty and chaos, crises will inevitably become more frequent. Some may even be more severe than the current one. For most companies, survival will be a measure of success. It is paramount to realize that conventional risk models (this includes rating models, such as those issued by the famous rating agencies) do not apply in a turbulent economy. A rating, such as a triple A, estimates the probability of insolvency (Probability of Default) over periods of years. Now, in a world in which the future is constantly under construction, how meaningful and how reliable can such information be?
2. The Present and Future of the Concept and Value of Ratings
“Rating agencies? Do not speak evil of the dead” (Corriere della Sera, 15/1/2009). This is an extreme synthesis of the widespread opinion of, and the doubts placed on, rating agencies and the value of the concept of rating. The sub-prime crisis, Enron, Lehman Brothers and Parmalat are just a few eloquent examples of how a business or a financial product may collapse shortly after being awarded a very high investment-grade rating. But are these isolated cases, which expose the weaknesses of rating processes, or just the tip of an iceberg? Our intention is to provide a critical overview of the concept of rating and to question its conceptual validity and relevance. We believe that the increasing levels of global uncertainty and interdependency – complexity, in other words – as well as the socio-economic context of our times, will place under pressure and scrutiny not only the concept of rating, but all conventional and established risk management and Business Intelligence techniques.

Unquestionably, the concepts of risk and risk rating lie at the very heart of our troubled global economy. The question is to what extent rating contributed to the trouble and what, alternatively, can be done to improve or replace it altogether. The concept of rating is extremely attractive to investors and decision-makers in that it removes the burden of having to go over the books of a given company before deciding whether to invest in it or not. This job is delegated to specialized analysts whose work culminates in the rating. The rating, therefore, is an instrument of synthesis and it is precisely this characteristic that has made it so widespread. However, contrary to popular belief, a rating is not, by definition, a recommendation to buy or sell. Nor does it constitute any form of guarantee as to the credibility of a given company. A rating is, by definition, merely an estimate of the probability of insolvency or default over a certain period of time.

In order to assign a rating, rating agencies utilize statistical information and statistical models which are used in sophisticated Monte Carlo simulations. In other words, information on the past history of a company is taken into account (typically, ratings are assigned to corporations which have at least five years of certified balance sheets) under the tacit assumption that this information is sufficient to hint at what the future of the company will look like. In a “smooth” economy, characterized by prolonged periods of tranquility and stability, this makes sense. However, in a turbulent, unstable and globalized economy, in which the “future is under construction” every single day, this is highly questionable.

The process of rating, whether applied to a corporation or to a structured financial product, is a highly subjective one. The two main sources of this subjectivity are the analyst and the mathematical models employed to compute the probability of insolvency. It is in the mathematical component of the process that we identify the main weakness of the concept of a rating. A mathematical model always requires a series of assumptions or hypotheses in order to make its formulation possible. In practice this means that certain “portions of reality” have to be sacrificed. Certain phenomena are so complicated to model that one simply neglects them, hoping that their impact will be negligible. As already mentioned, this approach may work well in situations dominated by long periods of continuity. In a highly unstable economy, in which sudden discontinuities are around every corner, such an approach is doomed to failure. In fact, the usage of models under similar circumstances constitutes an additional layer of uncertainty, with the inevitable result of increasing the overall risk exposure.

But there is more. In an attempt to capture the discontinuous and chaotic economy, models have become more complex and, therefore, even more questionable. In the recent past we have observed the proliferation of exotic and elaborate computer models. However, a complicated computer model requires a tremendous validation effort which, in many cases, simply cannot be performed. The reason is quite simple. Every corporation is unique. Every economic crisis is unique. No statistics can capture this fact, no matter how elaborate. Moreover, the complexity and depth of recently devised financial products has greatly surpassed the capacity of any model to fully embrace their intricate and highly stochastic dynamics, creating a fatal spillover effect into the so-called real economy. We therefore stress the fact that the usage of models constitutes a significant source of uncertainty which is superimposed on the turbulence of the global economy, only to be further amplified by the subjectivity of the analyst. In a discontinuous and “fast” economy no modeling technique can reliably provide credible estimates of the probability of insolvency, not even over short periods of time. It is precisely because of this fundamental fact that the concept and value of rating become highly questionable. For most companies today, survival is going to be their measure of success.

For example, an AA and a BB rating indicate, respectively, a probability of insolvency of 1.45% and 24.57% within a period of 15 years. What is astonishing is not just the precision with which the probability is indicated, but the time span embraced by the rating. The degree of resolution – the number of rating classes – is unjustified in a turbulent economy. The fact that a rating agency can place a corporation in one of 25 or more classes implies that there is sufficient “precision in the economy”, and in the models, to justify this. Clearly, the economy is not that precise. Table 1 lists the rating classes defined by the three major rating agencies (the number in parentheses indicates the number of classes).

Table 1. Rating classes as defined by the three major rating agencies.
It is precisely this attempt to search for precision in a highly uncertain and unpredictable environment that casts many doubts on the concept, relevance and value of a rating. We basically maintain that the whole concept is flawed. We see the flaw in the fact that the process of assigning a rating is essentially attempting to do something "wrong" (compute the probability of insolvency) but in a very precise manner.

An interesting parallel may be drawn between rating and car crash testing. Just like corporations, cars are also rated: for crashworthiness. Just like in the case of rating agencies, there exist different bodies that are certified to issue car crash ratings. A car crash rating is expressed by a number of stars – 1 to 5 – 5 being the highest. A car with a 5-star rating is claimed to be safe. Where is the problem? A crash rating is obtained in a test lab, in clinical conditions. What happens on the road is a different story. A crash rating tells you what happens under very precise but unrealistic conditions. In reality, a car will collide with another car traveling at an unknown speed, of unknown mass, at an unknown angle, and not with a fixed flat cement wall at 55 kph and at 90 degrees. So, a crash rating attempts to convey something about the future it cannot possibly catch. Just like in the case of corporate rating, computer crash simulations use state-of-the-art stochastic models, are computationally very intensive and attempt to provide precise answers for unknown future scenarios.

In summary, rating is an instrument which, directly or indirectly, synthesizes and quantifies the uncertainty surrounding a certain corporation or financial product as it interacts with its ecosystem. This is not only extremely difficult but, most importantly, it misses the fundamental characteristic of our economy, namely its rapidly increasing complexity. This complexity, which today may actually be measured, contributes to a faster and more turbulent ecosystem with which companies will have to confront themselves in their struggle to remain in the marketplace. This new scenario suggests new concepts that go beyond the concept of rating. Corporate complexity, as will be shown, occupies a central position. Complexity, when referred to a corporation, can become a competitive advantage provided it is managed. However, we identify excessive complexity as the main source of risk for a business process. In particular, high complexity implies high fragility, hence vulnerability. In other words, excessively complex corporations are exposed to the Strategic Risk of not surviving in their respective marketplaces. Evidently, high fragility increases the probability of default or insolvency. The concept is expressed synthetically via the following equation:

C_corporation x U_ecosystem = Fragility    (1)

The significance of this simple equation is very clear: the fragility of a corporation is proportional to its complexity and to the uncertainty of the marketplace in which it operates. Complexity amplifies the effects of uncertainty and vice-versa. In practice, what the equation states is that a highly complex business can survive in an ecosystem of low uncertainty or, conversely, if the environment is turbulent, the business will have to be less complex in order to yield an acceptable amount of fragility (risk). High complexity implies the capacity to deliver surprises. In scientific terms, high complexity manifests itself in a multitude of possible modes of behavior the system can express and, most importantly, in the capacity of the system to spontaneously switch from one such mode to another without early warning. The fragility in the above equation is proportional to the strategic risk of being "expelled" from the marketplace.
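To fix ideas, here is a tiny numerical sketch of equation (1). The figures and the normalization of both quantities to the interval [0, 1] are invented for illustration; they are not the scales produced by OntoSpace.

```python
# Illustrative sketch of equation (1): Fragility = C_corporation x U_ecosystem.
# The numbers and the [0, 1] normalization are hypothetical and serve only to
# show the amplification effect described in the text.

def fragility(corporate_complexity: float, ecosystem_uncertainty: float) -> float:
    """Both inputs are assumed to be normalized to [0, 1]."""
    return corporate_complexity * ecosystem_uncertainty

scenarios = {
    "simple firm, calm market":       (0.3, 0.2),
    "simple firm, turbulent market":  (0.3, 0.8),
    "complex firm, calm market":      (0.9, 0.2),
    "complex firm, turbulent market": (0.9, 0.8),
}

for name, (c, u) in scenarios.items():
    print(f"{name:31s} -> fragility = {fragility(c, u):.2f}")
```

The same corporate complexity yields a fragility of 0.06 in a calm market and 0.24 in a turbulent one: complexity and uncertainty amplify each other, exactly as the equation states.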
The weakness of the concept of rating is rooted in the fact that it focuses exclusively on the uncertainty aspects of a corporation without taking complexity into account. Clearly, as equation 1 shows, uncertainty is only part of the picture. In more stable markets, neglecting complexity did not have major consequences. In a less turbulent economy, business fragility is indeed proportional to uncertainty. When turbulence becomes the salient characteristic of an economy, complexity must necessarily be included in the picture as it plays the role of an amplification factor. In fact, the mentioned turbulence is a direct consequence of high complexity (the capacity to surprise and switch modes).
The constant growth of complexity of our global economy has recently been quantified in an analysis of the GDP evolution of the world's major economies. It is interesting to note how, in the period 2004-2007, the global economy doubled its complexity and how half of this increment took place in 2007 alone, see Figure 1. Similarly, one may observe how the complexity of the US economy also grew, albeit in the presence of strong oscillations, denoting an inherently non-stationary environment.
Figure 1. Evolution of complexity (second curve from bottom) of the World and US economies in the period 2004-2007.

The evolution of complexity, and in particular its rapid changes, act as crisis precursors. Recent studies of the complexities of Lehman Brothers, Goldman Sachs and Washington Mutual, as well as of the US housing market, have shown how in all these cases a rapid increment of complexity took place at least a year before the news of their difficulties entered the public domain. Today it is possible to rationally and objectively measure the complexity of a corporation, a market, a financial product or any business process. Complexity constitutes an intrinsic and holistic property of a generic dynamic system, just like, for example, energy. Evidently, high complexity implies high management effort. It also implies the capacity to deliver surprises. This is why humans prefer to stay away from highly complex situations – they are very difficult to comprehend and manage. This is why, all things being equal, the best solution is the simplest that works.

But there is more. Every system possesses a so-called critical complexity – a sort of physiological limit which represents the maximum amount of complexity a given system may sustain. In the proximity of its critical complexity, a business process becomes fragile and exposed, hence unsustainable. It is evident that the distance from critical complexity is a measure of the state of health, or robustness. In 2005 Ontonix developed rational and objective measures of complexity and critical complexity, publishing templates for a quick on-line evaluation of both of these fundamental properties of a business process. The templates are based on financial highlights and standard balance sheet entries. The fundamental characteristic of the process of complexity quantification is that it does not make use of any mathematical modeling technique (stochastic, regression, neural networks, statistics, etc.). The method is in fact a so-called model-free technique. This allows us to overcome the fundamental limitation of any model which, inevitably, involves simplifications and hypotheses that, in most cases, are rarely verified. As a consequence, the method is objective. Data is analyzed as is, without making any assumptions as to its Gaussianity or continuity and without any pre-filtering or pre-conditioning. As a result, no further uncertainty, which would contaminate the result, is added.

At this point it becomes evident how complexity can occupy a central position in a new approach to the problem of rating. It is in fact sufficient to collect the necessary financial data, compute the complexity and the corresponding critical value, and to determine the state of health of the underlying business as follows:

State of health of corporation = Critical complexity - Current complexity    (2)

The closer a business is to its critical complexity, the more vulnerable it is. Simple and intuitive. A paramount property of this approach is that, unlike in the case of a rating, it does not have a probabilistic connotation. In other words, no mention is made of the future state of the corporation. All that is indicated is the current state of health; no prediction is advanced. The underlying concept is: a healthy organization can better cope with the uncertainties of its evolving ecosystem. The stratification of the state of health (or fragility) is carried out on five levels: Very Low, Low, Medium, High and Very High. In highly turbulent environments, attempting to define more classes of risk is of little relevance. It is impossible to squeeze precision out of inherently imprecise systems.

Let us illustrate the above concepts with an example of a publicly traded company. The computation of the state of health (rating) has been performed using fundamental financial parameters (see blue nodes in the graph in Figure 2) as well as certain macro-economic indicators (red nodes) which represent, albeit in a crude manner, the ecosystem of the corporation. Figure 2 illustrates the so-called Complexity & Risk Map of the corporation as determined using the on-line rating system developed by Ontonix (www.ontonix.com). The parameters of the business are arranged along the diagonal of the graph, while significant relationships between these parameters are represented by the connectors located away from the diagonal. Needless to say, the said relationships are determined by a specific model-free algorithm, not by analysts. The health rating of the company is "Very High" and corresponds, in numerical terms, to 89%. The map also indicates the so-called hubs, or dominant variables, indicated as nodes of intense red and blue color.
Figure 2. Complexity & Risk Map of a corporation, indicating the corresponding health rating (business robustness).
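As a minimal sketch of how equation (2) and the five-level stratification could be wired together (the class boundaries and the percentage normalization below are assumptions made for illustration, not the bands used by the Ontonix rating engine):

```python
# Sketch of equation (2): state of health = critical complexity - current complexity.
# The margin is expressed as a percentage of the critical value and mapped onto the
# five classes mentioned in the text; the band thresholds are hypothetical.

def health_rating(current_complexity: float, critical_complexity: float):
    margin = max(critical_complexity - current_complexity, 0.0)
    health_pct = 100.0 * margin / critical_complexity
    bands = [(80, "Very High"), (60, "High"), (40, "Medium"), (20, "Low"), (0, "Very Low")]
    for threshold, label in bands:
        if health_pct >= threshold:
            return health_pct, label

# A business operating well below its critical complexity (made-up numbers).
print(health_rating(current_complexity=2.1, critical_complexity=19.0))
# -> roughly (88.9, 'Very High'), comparable in spirit to the 89% rating of Figure 2.
```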
The conventional concept of rating – intended as a synthetic reflection of the probability of insolvency – has shown its inherent limitations in a global, turbulent and fast economy. The proof lies in the crippled economy. The usage of mathematical models, as well as the subjectivity of the rating process, adds a further layer of uncertainty which is invisible to the eyes of investors and managers. Under rapidly changing conditions and in the presence of high complexity, the concept of probability of default is irrelevant. Instead, a more significant rating mechanism may be established based on the instantaneous state of health of a corporation, understood as its capacity to face and counter the uncertainties of its marketplace. In other words, the (strategic) risk of not being able to survive in one's marketplace is proportional to the mentioned state of health. The capacity of a corporation to survive in its marketplace does not only depend on how turbulent the marketplace is but, most importantly, on how complex the corporation is. This statement assumes more importance in a highly turbulent economic climate. If corporations do not start to proactively control their own complexity, they will quickly contribute to further increasing the turbulence of the global marketplace, making survival even more difficult. Complexity, therefore, is not just the basis of a new rating mechanism; it establishes the foundations of a superior and holistic form of Business Intelligence.
3. Complexity: The Fifth Dimension
What do economics, medicine, engineering and social sciences have in common? Depending on the point of view, quite a lot. In reality, these fields are, at least in our every-day perception, quite distant. However, as our multi-disciplinary work shows, there exist features which are common to systems operating in all of these environments. A unifying dimension is complexity and its dynamics, and in particular certain patterns which characterize its evolution. Complexity, as we maintain, is not a process on the edge of chaos (such a "definition" is actually quite useless in that it doesn't really hint at any means of measuring complexity) but an intrinsic property of every dynamical system as described by its state vector. Therefore, it is logical to expect complexity to be a time-dependent quantity. In effect, this is indeed the case. Below we illustrate the evolution of the complexity of an eco-system.
States of high complexity correspond to periods of high "activity" - the creation of structure, for example - or the generation of entropy. Sudden kicks and jumps point to "traumatic events" or phase changes. Complexity, just like energy, for example, is a quantity which reflects much of what is going on inside a given system and, most importantly, how it fails.
There are numerous schools and centers offering courses on complex systems. We wonder in how many of these courses complexity is:

- Defined
- Measured
- Tracked over time
- Really managed (i.e. DELIBERATELY going from one value of complexity to another)
Galileo taught us that serious science starts when you begin to measure. Putting into one pot nonlinear differential equations, chaos theory, fractals, agent-based modeling, control theory, fuzzy sets, Monte Carlo methods, artificial neural networks, stochastic dynamics, statistics and the inverted pendulum control problem is NOT equivalent to teaching complexity science. Saying that "the whole is greater than the sum of the parts" is a "definition" of complexity is confusing, to say the least. The propagation of nice slogans is not serious science. Once measured, complexity reveals not only its fascinating nature, it also exposes numerous intimate properties of dynamical systems which remain invisible to traditional techniques. For example, the concept of mode emerges naturally from our formulation. Below we indicate a dynamical system which is described via a sixth-order state vector and for which the System Map is determined, as well as all five possible modes of behavior that the system may exhibit in its Fitness Landscape.
What this means is that the system in question can behave according to five different patterns, or modes, and can transition from one mode to another. As an example of what is meant by mode we may indicate the four seasons (Spring, Summer, Autumn and Winter) in a given climatic zone. In this particular case the sequence of mode transitions is clear: Spring -> Summer, Summer -> Autumn, etc. However, in a generic system, the order of modal transitions is by no means evident. There may exist very clear and stable modal transition sequences, as well as more "volatile" situations in which the system spontaneously jumps from one mode to another in a random manner and without any early warning.

The so-called System Map is an envelope of all the modal topologies and must not be mistaken for a conventional correlation map, in which all relationships between variables are seen as simultaneously taking place. This is clearly a simplistic view of things and can be tremendously misleading. One single correlation map conveys a very poor and empty message.
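The distinction can be made concrete with a toy experiment. Plain Pearson correlation is used below as a crude, assumed stand-in for the model-free relationship detection mentioned earlier (it is not the Ontonix algorithm): two variables are strongly coupled in each of two "modes", but with opposite sign, so a single map computed on the pooled data misses the link entirely, while the envelope of the per-mode maps retains it.

```python
# Toy illustration: envelope of per-mode maps vs. a single map on pooled data.
# Thresholded Pearson correlation is an assumed surrogate for the model-free
# relationship detection described in the text, not the actual OntoSpace method.
import numpy as np

rng = np.random.default_rng(0)

def significant_links(data: np.ndarray, threshold: float = 0.5) -> set:
    """Pairs of variables whose |correlation| exceeds the (arbitrary) threshold."""
    corr = np.corrcoef(data, rowvar=False)
    n = corr.shape[0]
    return {(i, j) for i in range(n) for j in range(i + 1, n) if abs(corr[i, j]) > threshold}

# Mode 1: variables 0 and 1 move together;  Mode 2: they move in opposition.
x1 = rng.normal(size=(500, 1))
mode1 = np.hstack([x1, x1 + 0.1 * rng.normal(size=(500, 1))])
x2 = rng.normal(size=(500, 1))
mode2 = np.hstack([x2, -x2 + 0.1 * rng.normal(size=(500, 1))])

envelope = significant_links(mode1) | significant_links(mode2)  # union of modal maps
pooled = significant_links(np.vstack([mode1, mode2]))           # one map on the mixed data

print("envelope of modal maps:", sorted(envelope))  # [(0, 1)]: the link is real in each mode
print("single pooled map:     ", sorted(pooled))    # []: mixing the modes hides it
```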
4. A New Theory of Risk
"What we need today is an adequate theory of risk", Giulio Sapelli, "La Crisi Economica Mondiale" ("The Global Economic Crisis", Bollati Boringhieri, 2008, Torino, Italy). We couldn't agree more. But if we redefine risk, we automatically need to take a different look at the concept of rating as well as at uncertainty itself. The cultural default to uncertainty is probability, which, in turn, constitutes an attempt to measure risk. The concept of probability, ultimately, lies at the very heart of the economy and not only of the economy. But even so, there is something wrong. Probability measures the chance of an outcome in an event involving elements of randomness (uncertainty). It is a controversial idea on which inferential statistics is based. Looking at the frequency of certain events - in a given historical period - and then assuming that this frequency will hold may work in a stationary realm but in a non-stationary environment it is deprived of significance. Nature doesn't work like that, even though inferential statistics has been hugely successful in helping analysts in drawing conclusions from data. In Nature, the concept of pattern is enormously more important than, for example, a correlation. Just like in the case of many other concepts, there are many misconceptions about probabilites. One example is the conjunction fallacy. The confusion between the significance of 'and' and 'or' determines in fact how many people (such as decision makers and managers) think and act. Moreover, in non-stationary environments, rich in entropy and dominated by dissipative and irreversible processes, the reliance upon the concept of causality (A causes B and B causes C therefore A causes C) may also lead to gross errors. However, it is not easy to eradicate these concepts as they are very deeply rooted and common error, when practiced by multitudes, quickly becomes law. The foundation of our criticism of the concept of probability (likelihood) lies in the fact that it is not a natural quantity. Quantities such as mass, energy, spin or velocity are not just concepts, they really DO exist and may be measured. Probability has a very "artificial" flavour. If you reverse the direction of time flow a large amount of physical laws still hold (like the law of gravity, or F=ma). What happens with probability is totally different. When you reverse time flow any value of probability changes to 1. Looking backward in time, all events that have happened share the probability of 1. For example, suppose that one asks a large number of politicians in 1938 as to the chances of a World War breaking out in 1939. Independently of what this probability may have been perceived to be (yes, perceived, since you cannot calculate it) the war DID actually break out. If an event effectively happens, it happens independently of whether the probability of its occurrence has been estimated to be 0.1 or 0.9. In actual fact, the number is irrelevant. A law which shows such a sharp onesidedness is probably not a good law and, in the least of cases, is of little use in making predictions or estimates of the future. Another example of such a law is: one day we will die. Certainly true (although Wittgenstein rightly said that the phrase "the Sun will rise tomorrow" is only a hypothesis, not a certainty). This fundamental and inexorable law which governs our existence is really of little practical use since it cannot be made to make predictions of one's lifespan. The reason is simple. 
We all live in our respective environments and we often move from one environment to another. A very healthy person may live in a healthy environment and, according to logic, will probably enjoy a long (and healthy) life. But if he starts to smoke, or moves to a highly toxic environment - none of these events can be predicted - his lifespan will be reduced. The law "one day we will die" is therefore of no value. It simply holds but, due to its very nature (it simply states the Second Law of Thermodynamics in an understandable fashion), it cannot produce any quantitative measure as to how and when the event of death will take place. It is not so much the system itself as its interaction with its environment that determines the degree of its evolutionary success and longevity. Since nothing really happens in isolation, statements involving probabilities of occurrence of discrete events are of very little value, in particular if non-stationary environments are involved. One could of course argue that "one day we will die" is 100% natural, and that there is nothing artificial about it. True, but it is merely a statement of the obvious, and this doesn't guarantee that the statement has any practical value, as no rational a-priori measure of lifespan exists.

The probability of insolvency over a certain period (e.g. ten years) is mapped onto a "scale" composed of symbols, such as AAA or AA+, and is called a rating. The rating supposedly hints at a measure of risk, and many important business decisions are based on the concept of rating. Humans need to "measure" risk in order to make decisions, even though risk cannot be measured, simply because probability cannot really be measured. The fact that out of 1000 people who climbed a certain mountain nobody got killed in the process doesn't mean that the 1001st will not. Each climber is different and, what is often ignored, the mountain ages. Every climber really gets to climb a different mountain.

Risk, when seen as "exposure to uncertainty", is often measured as the amount of loss one faces in the perspective of an unfavourable outcome. If you walk into a casino with $1000 you really are risking $1000. If you place $500 on the table then your exposure is $500. No matter what amount you play, that is the amount you are risking. Again, a statement of the obvious, of very little use. Evidently, no one can predict how much of the $1000 will be lost within the first 10 minutes at a roulette table, and with what probability. This, of course, is what a rating is attempting to convey, not in the case of a gambler but in that of a corporation. It is of no use to know how much a given gambler has lost in the past and with what game strategy. No statistic holds. Each event (game) is unique, every roulette table is unique. The same strategy will never lead to the same outcome in a random environment. The fact that you happen to live in a country where life expectancy is high is no guarantee that you will live a long life. Every statistic is a statement of the obvious and has little or no predictive value in non-stationary environments.

The true problem is that humans seek certainty in an uncertain world. But why insist on wanting certainty in a nondeterministic reality in the face of overwhelming evidence? Why do 31% of the American public believe in horoscopes even though there is no scientific foundation for doing so? Why do so many people smoke when it has been proven, beyond any doubt, that smoking is a deadly habit?
Why do we insist on not seeing the patterns that Nature so abundantly displays before our own eyes? Why do people want to know the future ahead of time, knowing that it cannot possibly be predicted? Why is it that the only thing we learn from history is that we don't learn from history? It is imperative that managers and decision-makers finally understand that the future cannot be predicted and that uncertainty cannot be controlled - it can only be managed. We need to re-think the concept of risk and to concentrate more on understanding the patterns of interplay that govern a dynamical system (e.g. a corporation) in its interaction with an uncertain environment (market) in a totally new light.

We believe that the very concept of risk, and in particular the way in which its quantification is attempted, lacks a solid foundation. It must be replaced with something different. That something is fragility. Business fragility, which can be measured based on the concepts of complexity and critical complexity, quantifies how far that business is from becoming impossible to manage. The equation is:

Fragility = C / C_cr

A second fundamental characteristic of a business is its stability. This too is easy to compute once a measure of complexity is available. The definition is:

Stability = dC / dt

In both equations C stands for the current value of complexity and C_cr is the corresponding critical value. These two equations establish a new, unconventional but natural look at "risk". More soon.
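A minimal sketch, using a made-up complexity history, of how the two quantities just defined could be tracked in practice. The numbers, the quarterly sampling and the finite-difference estimate of dC/dt are assumptions for illustration only.

```python
# Fragility = C / C_cr and Stability = dC / dt on a hypothetical complexity history.
# dC/dt is estimated with a simple finite difference; the flag threshold is arbitrary.
import numpy as np

c_critical = 25.0                                           # hypothetical critical complexity
c_history = np.array([12.0, 12.4, 13.1, 15.0, 18.6, 23.2])  # hypothetical C(t), one value per quarter

fragility = c_history / c_critical   # approaches 1 as the system nears its critical complexity
stability = np.gradient(c_history)   # rate of change of complexity, dC/dt

for t, (f, s) in enumerate(zip(fragility, stability)):
    flag = "  <- rapid complexity growth" if s > 2.0 else ""
    print(f"t={t}: fragility={f:.2f}  dC/dt={s:+.2f}{flag}")
```

Read together, the two numbers say different things: the first tells how close the business is to the point where it becomes impossible to manage, the second how quickly it is heading there.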
5. If You Really Need to Optimize ....
We maintain that optimal solutions are fragile and should generally be avoided. This unpopular statement enjoys substantial practical and philosophical argumentation and now, thanks to complexity, we can be even more persuasive. However, this short note is about making optimization a bit easier. If you really insist on pursuing optimality, there is an important point to keep in mind. Let us examine the case illustrated below: designing a composite conical structure in which the goal is to keep mass under control, as well as the fluxes and the axial and lateral frequencies. The System Map shown below reflects which (red) design variables (ply thicknesses) influence the performance of the structure in the nominal (initial) configuration, prior to optimization. In addition, the map also illustrates how the various outputs (blue nodes) relate to each other. In fact, one may conclude that, for example, the following relationships exist:
- t_20013 - Weight
- Weight - Axial Frequency
- Min Flux - Max Flux
- t_20013 controls the Lateral Frequency
- t_20013 also controls the Axial Frequency
- But the Lateral Frequency and the Axial Frequency are related to each other
- etc.
As one may conclude, the outputs are tightly coupled: if you change one, you cannot avoid changing the others. Let's first see how optimization is handled when one faces multiple - often conflicting - objectives:
minimize y = COST(y_1, y_2, ..., y_n)

where y_k stands for the k-th performance descriptor (e.g. mass, stiffness, etc.). In many cases weights are introduced as follows:

minimize y = COST(w_1 * y_1, w_2 * y_2, ..., w_n * y_n)

The fundamental problem with such a formulation (and all similar MDO-type formulations) is that the various performance descriptors are often dependent (just as the example above indicates) and the analyst doesn't know it. The cost function indicated above is a mathematical statement of a conflict, whereby the y's compete with one another. This competition is driven by an optimization algorithm which knows nothing of the structure of the corresponding System Map or of the existence of the relationships contained therein. Imagine, for example, that you are trying to reduce one variable (e.g. mass) and, at the same time, increase another (e.g. frequency). Suppose also that you don't know that these two variables are strongly related to each other: the relationship typically looks like this: f = SQRT(k/m). Here, f and m, outputs of the problem, are related - changing one modifies the other. This is inevitable. In a more intricate situation, in which hundreds of design variables are involved, along with tens or hundreds of performance descriptors, the problem really becomes numerically tough. The optimization algorithm has a very hard time. What is the solution? If you cannot avoid optimization, then we suggest the following approach:
- Define your baseline design.
- Run a Monte Carlo Simulation, in which you randomly perturb the design variables (inputs).
- Process the results using OntoSpace, obtaining the System Map.
- Find the INDEPENDENT outputs (performance descriptors) or, in case there aren't any, those outputs which have the lowest degree in the System Map. There are tools in OntoSpace that actually help to do this.
- Build your cost function using only those variables, leaving the others out.
This approach "softens" the problem from a numercial point of view and reduces the mentioned conflicts between output variables. Attempting to formulate a multi-disciplinary problem without knowing a-priori how the various disciplines interact (i.e. without the System Map) is risky, to use a euphemism.
6. “Don’t Cross a River Because it is On Average 4 feet Deep”
If that quote sounds like ancient Eastern mysticism, you would be surprised to know when it was made! In today's tough economic times, though, one wishes that the people put in charge of making decisions had heeded this simple wisdom. But this is precisely what was overlooked by the financial market gurus - decisions were made because the calculated average loss in an investment was only "4 feet deep"! The erstwhile Lehman Brothers remains a poster child for the fallacy of this sort of thinking. Conventional risk management, of the variety used by the likes of Lehman, involves calculating the probability of the maximum loss a given portfolio can experience over a specified time horizon. Let us unpack this statement for what it's worth:

1. We first need to specify a time horizon - fairly simple.
2. We need to compute the return of this portfolio over this time period - any financial website can feed a spreadsheet to do this.
3. Then we need to state the probability that the return (or loss) will be below (or above) a certain value.

The third step is where things become a slippery slope - because it rests on faith. That is correct - a belief. In mathematical parlance, it is called an "assumption". Almost all conventional financial theory makes the sinfully simple assumption that these returns are normally distributed. This innocuous assumption allows one to wield fancy and dauntingly complex mathematical equations to "calculate" the probabilities. Of course, anything sprinkled with mathematical holy water suddenly becomes untouchable to any degree of skepticism. To make matters even more rarefied, models are introduced. Let us take the case of the sub-prime mortgage mess. All the major investment banks had an extensive arsenal of models to figure out risk. Yet they failed. One then has to question either the models or the modelers themselves. Let us look at the models first. Fortune magazine identified three fatal flaws with these models: limited historical information, an unfortunate choice of time horizon to set up the assumptions, and finally - the kicker - the models were "incomplete"! The first flaw is (in hindsight) easy to recognize. There were limited sub-prime securities in the past and all previous models were constructed using prime mortgages. So now we have a situation of using the same yardstick to measure a piece of furniture and the wavelength of light. The second flaw was that these models were fed the (limited) sub-prime data at a time when sub-prime defaults actually dropped from 10% to 5%. This was of course during the rah-rah period of Greenspanian interest rate drops (2001-2005). So in effect the models resulted in a self-fulfilling prophecy - we start with an assumption of tightly distributed results and then we predict solid performance for the future. This brings us to the third flaw - the incompleteness situation. Once again hindsight tells us with mock arrogance that assuming the absence of network effects was a fatal mistake. The models basically assumed that the sub-prime mortgage market existed in a vacuum. They assumed no relationship to the broader financial market; heck, they even assumed that the sub-prime market was disconnected from the real estate market! Popular beliefs such as "all real estate is local" buttressed the idea that if the market was overheated in Las Vegas, this should not affect prices in Washington. Clearly in the short run this may have been true - but in the globalized, highly networked world of instantaneous information, what used to be a long run is now a short run. Evidently, the socio-economic clock speed has kept up with that of the chips Intel churns out. Now let's look at the modelers themselves. If hiding behind mathematical assumptions is like throwing a veil over something, models are like building a citadel and surrounding it with a crocodile-infested moat to safeguard those assumptions. The few who can cross the moat and scale the citadel are obviously wary of lifting the veil - naturally, because they don't want to have come that far only to find out that the veil hides something not quite as beautiful as they had imagined. The time has never been riper for a model-free world. Let us remind ourselves of a famous statistician's insight: all models are wrong, but some are useful. What is needed is a paradigm shift for risk management, where the focus is on using models to understand and explain the past, not on pretending to predict the future. The usefulness of models, in George E. P. Box's interpretation, was the insight one could get from them about the future, if and only if things behaved as they always did in the past. Management focus should be on balancing all the historical information to develop a pragmatic strategy. This challenge quickly becomes non-trivial when dealing with systems of high complexity. What is needed is a tool that can identify patterns and parameters which affect complexity, and shift management focus onto identifying sources of complexity in order to mitigate or eliminate them. What are not needed are more models that pretend to predict risk. And of course, some common sense is always helpful. So the question is: how do we identify sources of complexity? Can measuring complexity tell us how healthy a system really is? Why did Nobel-winning economists and star analysts fail to anticipate a crisis like the one we find ourselves in? (Actually there were some who did - but nobody paid attention to them. More next week ...) Now, if you are still curious about the headline quote, see here.
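To see why the third step is the slippery one, consider the following small Python sketch. It uses synthetic, hypothetical returns (a fat-tailed Student-t sample), not any bank's data: fitting a normal distribution to them and reading off the 99% Value-at-Risk understates the loss suggested by the model-free empirical quantile of the very same data, and the gap widens further out in the tail.

    import numpy as np

    rng = np.random.default_rng(0)

    # Hypothetical daily returns with fat tails (Student-t, 3 dof), scaled to ~1% daily volatility.
    returns = 0.01 * rng.standard_t(df=3, size=100_000) / np.sqrt(3.0)

    # "Conventional" 99% VaR: assume the returns are normally distributed.
    z_01 = -2.3263                                   # 1% quantile of the standard normal
    var_normal = -(returns.mean() + z_01 * returns.std())

    # Model-free check: the empirical 1% quantile of the same data.
    var_empirical = -np.quantile(returns, 0.01)

    print(f"99% VaR, normal assumption : {var_normal:.4f}")
    print(f"99% VaR, empirical quantile: {var_empirical:.4f}")   # noticeably larger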
7. Complexity: The Fifth Dimension (II)
When complexity is defined as a function of structure, entropy and granularity, examining its dynamics reveals its fantastic depth and phenomenal properties. The process of computing complexity materializes in a particular mapping of a state vector onto a scalar. What is surprising is how such a simple process can enshroud such an astonishingly rich spectrum of features and characteristics. Complexity does not possess the properties of an energy, and yet it expresses the "life potential" of a system in terms of the modes of behaviour it can deploy. In a sense, complexity, the way we measure it, reflects the degree of fitness of an autonomous dynamical system that operates in a given Fitness Landscape. This statement by no means implies that higher complexity leads to higher fitness. In fact, our research shows the existence of an upper bound on the complexity a given system may attain. We call this limit critical complexity. We know that in the proximity of this limit the system in question becomes delicate and fragile, and that operation close to this limit is dangerous. There surely exists a "good value" of complexity - corresponding to a fraction, ß, of the upper limit - that maximizes fitness: C_max-fitness = ß * C_critical. We don't know what the value of ß is for a given system and we are not sure how it may be computed. However, we think that the fittest systems are able to operate around a good value of ß. Fit systems can potentially deploy a sufficient variety of modes of behaviour so as to respond better to a non-stationary environment (ecosystem). The dimension of the modal space of a system ultimately equates to its degree of adaptability. Close to critical complexity the number of modes, as we observe, increases rapidly but, at the same time, the probability of spontaneous (and unwanted) mode transitions also increases quickly. This means the system can suddenly undertake unexpected and potentially self-compromising actions (just like adolescent humans). More to come.
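Purely as an illustration of the idea of mapping a state (data) vector onto a scalar that depends on both structure and entropy, the following Python sketch sums the mutual information carried by the "significant" links among a set of variables. It is a made-up proxy for the purposes of this note - the bin count, the threshold and the measure itself are assumptions - and it is emphatically not the metric computed by OntoSpace.

    import numpy as np

    def mutual_information(x, y, bins=10):
        # Shannon mutual information (in bits) estimated from a 2-D histogram.
        pxy, _, _ = np.histogram2d(x, y, bins=bins)
        pxy = pxy / pxy.sum()
        px = pxy.sum(axis=1, keepdims=True)
        py = pxy.sum(axis=0, keepdims=True)
        nz = pxy > 0
        return float((pxy[nz] * np.log2(pxy[nz] / (px @ py)[nz])).sum())

    def complexity_proxy(data, threshold=0.05):
        # data: (samples, variables). Sum the information carried by "significant"
        # links, i.e. a structure-times-entropy style scalar.
        n = data.shape[1]
        c = 0.0
        for i in range(n):
            for j in range(i + 1, n):
                mi = mutual_information(data[:, i], data[:, j])
                if mi > threshold:      # the link is part of the structure
                    c += mi             # and is weighted by its information content
        return c

    rng = np.random.default_rng(2)
    a = rng.normal(size=2000)
    data = np.column_stack([a, a + 0.1 * rng.normal(size=2000), rng.normal(size=2000)])
    print(round(complexity_proxy(data), 2))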
8. Stability, Not Growth
A recent article that appeared on cnn.com (February 26th, 2009) reports a record 34-billion-dollar loss at the Royal Bank of Scotland - the biggest in British corporate history. According to the management of the bank, they will now:
- focus on the UK (and less on global operations) - this means a smaller Complexity & Risk Map
- more on core business - identify the business hubs and focus on these
- restructuring and management changes - dumping entropy
- isolation of toxic assets (entropy reduction) - further entropy elimination
The above have a name: complexity reduction. And we know how to do it rationally. Corporations often grow via acquisitions, purchasing businesses they don't understand, simply to show growth and impress investors. And to get bigger bonuses. By doing so, they increase complexity quickly, which inevitably makes the management of the growing behemoth increasingly difficult. In times of turbulence it is more important to focus on business stability than on growth. The following equation explains why: Business Complexity × Turbulence of the Economy = Fragility
9. The Crisis Will End in 2010. The FED Says.
A stunning article appeared today (15/3/2009) on the CNN website. According to the FED, recovery from the crisis will begin in 2010. Not this year, not in 2011. It will be in 2010. A few comments that come to mind:
- The current crisis has come as a total surprise. Nobody has been able to predict it, because traditional math is simply unable to predict similar events. Each crisis is unique, which means that statistics is irrelevant here and has no predictive power.
- Nobody predicted when the crisis would come, and nobody warned of its proportions. Still today, nobody has a clue of the true depth and dimension of the crisis, because traditional math cannot help in this sense.
- The complexity of the crisis is unknown. One of the reasons is that the complexity of the toxic financial products that have contributed to the crisis is also unknown. Those who created these products have no clue as to the depth of the damage they have caused.
- How, then, can economists predict when the crisis will be over? The economy is still full of entropy-generating, metastasizing derivatives.
- On what math are the 2010 recovery statements based? Clearly it cannot be the same math that threw the global economy into this mess. The math that got us into trouble will not get us out of it. A new math is needed.
- If you SUDDENLY contract an unknown, terrible illness that has never been seen before, how can ANYBODY have a clue as to when you will recover if no known medicine exists?
10. Beyond the Concept of Risk. New Paradigms in Turbulent Times
The concept of risk lies at the very heart of the economy and, unquestionably, behind the majority of our actions and decisions. The management of risk is an integral and indispensable part of every serious business, bank or corporation. Most large companies now have a Chief Risk Officer. Risk rating is a fundamental instrument in the hands of investors and decision-makers. We strongly question not only the concept of risk rating, which we feel is probably the single most important cause behind the current economic crisis; we also maintain that the practice of risk management is outdated and must be replaced with a more modern and adequate management of complexity. Let us recall briefly the main points in Thomas Kuhn's book "The Structure of Scientific Revolutions":
- When you run out of ideas, fragmentation kicks in.
- Small incremental improvements replace innovation.
- This leads to a crisis.
- After a prolonged crisis we have a revolution.
- When the revolution is over, a new paradigm replaces the old one.
Revolutions are an important means of changing paradigms. We are now in the middle of a severe crisis which will, hopefully, revolutionize the economy (as well as lifestyles). Never waste a good crisis. We have a great opportunity to change things, to move forward, to innovate. Conventional risk rating is applicable in a "smooth" trauma-free economy, characterized by a low content of entropy and a moderate degree of connectivity. In a turbulent and globalized economy, which is entropy-rich and highly interconnected - i.e. highly complex - the concepts of risk and risk rating must be reviewed. A change of paradigm is necessary.
We now know that excessive complexity is the main source of vulnerability. We also know that rapidly increasing complexity undermines the stability (continuity) of a business process. We therefore propose the following new paradigm.
- Risk Management transitions into Complexity Management.
- Risk Rating - which measures the probability of insolvency over a period of time - is replaced by a measure of the current Vulnerability of a business.
To complete the new picture, a holistic approach to Business Intelligence should be taken, incorporating the interaction of a company with its dynamically changing ecosystem. Finally, concentrating more on global patterns and less on local details, as well as the trend away from statistics to model-free methods will help establish the foundations of a more modern, rational and resilient (global) economy.
11. Correlation, Regression and how to Destroy Information
(The above image is from an article by Felix Salmon - 23/2/2009.) When a continuous domain is transferred onto another continuous domain, the process is called a transformation. When a discrete domain is transferred onto another discrete domain, the process is called a mapping. But when a discrete domain is transferred onto a continuous domain, what is the process called? Not clear, but in such a process information is destroyed. Regression is an example. Discrete (often expensive to obtain) data is used to build a function that fits the data, after which the data is gently removed and life continues on the smooth and differentiable function (or surface), to the delight of mathematicians. Typically, democratic-flavoured approaches such as Least Squares are adopted to perpetrate the crime. The reason we call Least Squares (and other related methods) "democratic" is that every point contributes to the construction of the mentioned best-fit function in equal measure - in a democracy everyone gets one vote, even assassins who are re-inserted into society, just like respectable, hard-working and law-abiding citizens. In other words, data points sitting in a cluster are treated exactly like dispersed points. All that matters is the vertical distance from the sought best-fit function. Finally, we have the icing on the cake: correlation. Look at the figure below, depicting two sets of points lying along a straight line.
The regression model is the same in each case. The correlations too! But how can that be? These two cases correspond to two totally different situations: the physics that distributes points evenly along the line is not the same physics that makes them cluster into two groups. And yet in both cases statistics yields a 100% correlation coefficient, without distinguishing between two evidently different situations. What's more, in the void between the two clusters one cannot simply use the regression model just like that: assuming continuity a priori can come at a heavy price. Clearly this is a very simple example. The point, however, is that not many individuals out there are curious enough to look a bit deeper into the data (yes, even visually!) and ask basic questions when using statistics or other methods. By the way, "regression" is defined (Merriam-Webster Dictionary) as a "trend or shift to a lower or less perfect state". Indeed, when you kill information - replacing the original data with a best-fit line - this is all you can expect.
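The point is easy to reproduce. In the Python sketch below (synthetic data, not taken from the article), an evenly spread set of points and two distant clusters lying on the same straight line yield exactly the same least-squares fit and the same 100% correlation coefficient:

    import numpy as np

    rng = np.random.default_rng(0)
    line = lambda x: 2.0 * x + 1.0

    # Case 1: points spread evenly along the line.
    x_even = np.linspace(0.0, 10.0, 100)

    # Case 2: the same line, but the points sit in two distant clusters with a void in between.
    x_clusters = np.concatenate([rng.uniform(0.0, 1.0, 50), rng.uniform(9.0, 10.0, 50)])

    for name, x in [("even spread", x_even), ("two clusters", x_clusters)]:
        y = line(x)
        slope, intercept = np.polyfit(x, y, 1)
        r = np.corrcoef(x, y)[0, 1]
        print(f"{name:12s}  fit: y = {slope:.2f}x + {intercept:.2f}   correlation = {r:.3f}")

The regression model and the correlation are blind to the void; only a look at the data itself reveals that two very different situations are being summarized by the same numbers.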
12. Toxic Financial Assets are Inevitable in a Complex Economy
When a socio-economic system reaches a certain minimum level of complexity, it spontaneously develops a number of inevitable mechanisms. One of them is globalization. Another is terrorism. This last one thrives especially when there is a lack of appropriate rules (laws) to govern the coexistence of humans in a society. Toxic assets, just like terrorism and globalization, are an inevitable consequence when the complexity of the global financial system reaches a certain level and the system in question lacks rules of a global, international character. The complexity of the global financial system today is such that toxic assets are simply bound to emerge in order to fill the empty gaps. The lack of rules does the rest, leading to an entropy-rich environment in which toxic assets not only emerge but in which fraudulent behaviour may also develop undisturbed. Just as terrorism is very difficult to deal with, and just as globalization cannot be "dismantled" - as many no-global groups would want - so toxic assets are extremely hard to comprehend and deal with. In order to prevent the emergence of toxic financial products, it is paramount to keep the complexity of our global economy under control. This is of course a monumental task, but the technology is there. A great place to start is corporations and financial products such as derivatives.
13. TBTF - Too Big To Fail
"The Too Big to Fail (TBTF) policy is the idea that in economic regulation the largest and most interconnected businesses are Too Big To (let) Fail." (Wikipedia). We don't want to discuss the ethical implications of such a policy but to analyze the issue from a complexity standpoint. Clearly, "interconnected" points to complexity, although entropy is explicitly missing. In a recent hearing, Bernanke and Geithner illustrate how AIG "shows broad failures of system". TBTF companies, such as AIG or GM, have a systemic importance to the economy of a country. Their sheer size inevitably leads to numerous consequences:
- They are hubs of the respective national economies. This means that good and bad things happening within such companies quickly spill into the respective ecosystems.
- They cannot (really?) be allowed to fail. When on the verge of insolvency they absorb precious taxpayers' money.
- They don't need to innovate, and they don't innovate.
- They tend to be very inefficient, dumping huge amounts of entropy into the economy and, as a consequence, contributing significantly to an overall increase in complexity.
- They "function" on the verge of critical complexity. Banks and car manufacturers are good examples of huge and (inevitably) very fragile companies.
- Fraudulent activity in huge organizations is easier than in a smaller, more efficient and lean business.
It is becoming evident that there is a need to put a cap on the size of companies - we think that the TBTF concept may give way to a limit expressed in terms of complexity and critical complexity. Highly complex corporations are naturally more difficult to manage, hence more exposed and more capable of delivering sudden and unexpected behaviour (this is the definition of fragility). A global economy must avoid corporations of systemic importance. However, conventional BI technology is unable to actively manage huge organizations - it is simply not equipped for the task. So how can you "measure the size" of a company? Revenue? Number of employees? The amount of taxes it pays? The number of Tier-1 suppliers it has? The size of the competition? All these numbers are useful but they do not give a global picture. Complexity does. In order to understand how large a company could get before it becomes dangerous to an economy, one could proceed as follows:
- Measure the complexity of a certain number of companies (e.g. all the Fortune 500 companies, including the ones that have failed and those approaching insolvency).
- Rank the companies according to complexity.
- Compare the complexity ranking with some of the above business parameters.
With similar information in hand one can establish reasonable limits in terms of complexity and then project that information onto other business parameters. More soon.
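A minimal Python sketch of this procedure follows. The complexity scores and revenues are entirely hypothetical placeholders, used only to show how a complexity ranking can be compared with a conventional size measure via a rank correlation:

    import numpy as np

    companies = ["A", "B", "C", "D", "E", "F"]
    complexity = np.array([61.0, 48.5, 73.2, 39.9, 55.1, 80.4])     # hypothetical complexity scores
    revenue_bn = np.array([120.0, 45.0, 310.0, 30.0, 150.0, 95.0])  # hypothetical revenues, $bn

    # Rank the companies according to complexity (highest first).
    ranking = [companies[i] for i in np.argsort(-complexity)]

    # Compare the complexity ranking with a business parameter (Spearman = Pearson on ranks).
    ranks = lambda v: np.argsort(np.argsort(v))
    rho = np.corrcoef(ranks(complexity), ranks(revenue_bn))[0, 1]

    print("complexity ranking:", ranking)
    print(f"rank correlation with revenue: {rho:.2f}")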
14. More on Correlations and Causality
Humans learn very little from Nature and generally dedicate very little time to observing and understanding her patterns. For example, there is an overwhelming amount of evidence that excludes the possibility of a deterministic existence in our world. However, the majority of the public - and this includes the educated public too - seem to insist on bending reality to suit some distorted and warped vision of life or a particular computer tool. Some examples:
- The world is stochastic, not deterministic. In such a context you cannot make predictions, only estimations. The future is always under construction. This means that the predictive power of statistics is zero.
- By virtue of the laws of physics, every single event is unique and unrepeatable. This is why we cannot learn from history - or learn very little, even though many patterns are repeatable.
- Nature is full of discontinuities, but the math universities teach starts with the statement: consider a continuous and differentiable function f(x).
- We know from life that very often the same conditions lead to different outcomes, and yet the good old f(x) has one and the same value for a given x.
- Some things can only be managed, never controlled. One of them is uncertainty.
The list could go on. However, in this short note we concentrate on two other related concepts: correlation and causality. Causality, potentially, is a great thing, provided it exists. The concept of causality is simple: if event A takes place, then event B takes place. This is equivalent to having control over B, and there is nothing more attractive to humans than to establish and exercise a certain control action in a repeatable fashion. The problem lies in a simple point: in a context of dynamic uncertainty it is impossible to isolate stable relationships of the type "if A then B". In reality the situation is more or less like this: if many A's take place, then many other B's will take place. Rules come in packets, or clusters, and not only are they linked and related to each other, in most cases they are also incomplete. In other words, from a conventional (and practical) standpoint, causality does not exist or, in the best of cases, there is very little of it out there. But if you get involved with a causal, control-oriented scheme of thinking, you inevitably bump into other "Newtonian" concepts such as correlation. Correlation is a by-product of causality and may be tremendously damaging. This is why. Imagine a set of points lying on a straight line. Suppose the points are distributed evenly. The correlation is 100%. Imagine another set of points, also lying on a straight line, but this time distributed in, say, two distant groups, with a void separating these clusters. The correlation is again 100%. In other words, in the language of statistics, both situations are the same. But from the standpoint of physics they are totally different. And there is more. In the void between the two clusters we are not allowed to use any regression model, because in the void no phenomenon takes place. The regression model (a line in this case) knows nothing of the physics that has aligned the points. The correlation knows even less. It is the poorest of quantities, with a fantastic capability of condensing information into a single number without any physical significance whatsoever. Think of how much information can be obfuscated by correlations operating on data in multiple dimensions. Just take a look at the global economy.
15. How Nature Works. Patterns, Not Details
How Nature Works. This is the title of a wonderful book by the late Per Bak, who is known for his work on self-organized criticality. Self-organized criticality is the bridge to order in a chaotic regime. However, in this short article we wish to draw attention to a more philosophical issue, regarding the basic mechanisms which Nature uses to "do things" (the term "design" is deliberately avoided):
- Attractors (of behaviour).
- Solutions that are fit for the function, not optimal.
- Constant improvement - nothing is static.
- Recognizable patterns, not repeatable details.
Most importantly: patterns, not details. But how do you describe a pattern mathematically (if it is not a fractal with a neat equation)? Well, conventional mathematics is not equipped to perform this task. Believe it or not, we cannot describe a leaf mathematically! A pattern is an organized set of inter-related rules. Fuzzy rules. When fuzzy rules are invoked, things don't necessarily happen in exactly the same identical fashion. This is why there are no two identical leaves, even though one can clearly recognize a common pattern, as in the case below.
The leaves don't need to be identical to belong to the same family of plants, or even to the same plant, but they have in common a huge number of features which the eye is quick to capture. So, how can you mathematically represent a pattern? An example is shown below:
Each node in the above is a feature or characteristic of a system (e.g. colour, mass, energy, shape, complexity, fragility, resilience, intelligence, fitness, age, etc.). At certain times in the life of a (living) system some features are dormant - this is why certain characteristics are excluded from the above map (empty circles). When a system evolves, or changes mode of behaviour, it switches from one set of fuzzy rules to another. Systems often tend to behave in the vicinity of so-called attractors, and often they jump from one attractor to another. Based on maps such as the one above, and on their dynamic properties, it is possible to conceive a new, more "universal" and elegant type of math, one that will no longer have trouble describing Nature's patterns and will help us to better understand her. The pictures of the leaves were taken in the Botanical Gardens in Sao Paulo, Brazil, on March 21st, 2009.
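Returning to the question of representing a pattern mathematically, the toy Python sketch below treats a pattern as a packet of fuzzy rules, i.e. ranges rather than fixed values. The feature names and ranges are invented for the example; the point is only that every instantiation of the rules satisfies the pattern, yet no two instantiations are identical - just like the leaves.

    import numpy as np

    # A "pattern" expressed as fuzzy rules: each feature is constrained to a range.
    pattern = {"length_cm": (4.0, 9.0), "width_to_length": (0.35, 0.55),
               "vein_angle_deg": (30.0, 55.0), "lobes": (3.0, 5.0)}

    rng = np.random.default_rng(3)

    def instantiate(rules):
        # Every instantiation picks some value inside each fuzzy range.
        return {k: rng.uniform(lo, hi) for k, (lo, hi) in rules.items()}

    def matches(instance, rules):
        return all(lo <= instance[k] <= hi for k, (lo, hi) in rules.items())

    leaves = [instantiate(pattern) for _ in range(5)]
    print(all(matches(leaf, pattern) for leaf in leaves))    # True: all share the pattern
    print(len({tuple(leaf.values()) for leaf in leaves}))    # 5: yet no two are identical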
16. How do You Measure the Impact of a Company on a Market?
How can the impact of a company on a market be measured? In terms of revenue? Or number of employees? How can you analyze and measure financial contagion (the propagation of shocks) within a system of inter-related companies/banks operating in a given market? Is a given company a TBTF (Too Big To Fail) company, with systemic importance in a nation's economy? Complexity provides great tools to help answer such questions. First of all it is necessary to define the system within which we wish to examine a particular corporation. Suppose that we want to analyze a system composed of five banks which compete/cooperate in a certain market. Using OntoSpace and the financial (P&L) statements which publicly traded companies publish every quarter on their respective websites, one can easily create the global Complexity & Risk Map of the system (market segment, group of companies, etc.). The example shown below relates to five banks from a certain EU country. Public data, available on the respective websites, has been used for the analysis. Each block of nodes - red or blue - corresponds to a bank. The only reason behind this colouring scheme is to make it easy to distinguish the banks. The order of the variables in the map below is the same as in the published P&L statements. The quantification of the footprint of a company in a given context hinges on two fundamental points:
- Analysis of the structure of the corresponding global Complexity & Risk Map.
- The contribution of the company to the system in terms of complexity. This can be measured as ß = 1 - C/CT, where C is the complexity of the system without the company in question and CT is the total complexity of the system.
As for the map, we may state the following. First of all, the Complexity & Risk Map illustrates the global architecture of all the significant relationships (not correlations!) between the salient financial parameters of the various banks. This already gives a tremendous amount of information as to how the five banks co-exist and how information flows between them. The red and blue circles are the so-called hubs, and correspond to key variables. Hubs have the greatest number of relationships in the network. Anything that happens in the hubs (good or bad) propagates to many variables. However, one important thing to keep in mind is that the relationships in the map do not imply causality - they simply signify that two variables "move" in synch. For the purpose of this short note, let us concentrate on complexity. Let us look at the first bank and quantify its footprint on the system of five banks. With respect to the contribution in terms of complexity, we see that in this case ß = 1 - 36.5/44.3 = 0.18. In other words, the impact of the first bank is approximately 18%. A similar analysis may be performed for the remaining four banks, yielding a complete Complexity Profile of the system.
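As a minimal sketch of the footprint calculation: only the 36.5 and 44.3 figures below come from the analysis above, while the values attributed to the remaining four banks are hypothetical placeholders.

    # Complexity of the full five-bank system and of the system with each bank removed.
    C_total = 44.3
    C_without = {"Bank 1": 36.5,                       # from the analysis above
                 "Bank 2": 40.1, "Bank 3": 41.8,
                 "Bank 4": 39.0, "Bank 5": 42.6}       # hypothetical values

    for bank, c in C_without.items():
        beta = 1.0 - c / C_total                       # footprint: ß = 1 - C / CT
        print(f"{bank}: footprint = {beta:.0%}")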
46
Practical Complexity Management, Part II
17. On Critical Complexity
Very often the following question arises when discussing the evolution of complexity of, say, a corporation: why does the critical complexity change with time? In fact, in almost all systems we analyze we observe substantial changes of not only the current value of complexity but also of the corresponding lower and upper (critical) limits. See, for example, the evolution of the mentioned complexities for the US housing market over a period of four years shown below.
So, why is it that the upper complexity threshold changes? Why would the critical complexity change for a given system? Shouldn't it be a constant characteristic of the system in question? The answer is really simple. Most dynamical systems constantly go through changes, switching from one mode of functioning to another. As they do so, they exhibit a changing structure - not all features of a given system are active and visible at all times. For this reason, the system shows only a part of its full potential and, consequently, appears to be more or less complex. Systems like banks, corporations, societies, traffic systems or markets are highly dynamical and are characterized by a changing structure. It is this changing structure - which is reflected by the corresponding Complexity Map - that is ultimately responsible for the changing upper and lower complexity thresholds. Systems that are stationary, and in which structural changes do not take place, do indeed have a constant value of both the upper and lower complexity thresholds - see the figure at the top left of this article. In the same figure, the system on the right is a "dying" system which progressively loses complexity until it reaches a state of equilibrium. No dynamics, no complexity.
18. Do We Really Understand Nature?
According to the Millennium Project, the biggest global challenges facing humanity are those illustrated in the image above. The image conveys a holistic message which some of us already appreciate: everything is connected with everything else. The economy isn't indicated explicitly in the image but, evidently, it's there, just like industry, commerce, finance, religion, etc. It is indeed a very complex scenario. The point is not to list everything but merely to point out that we live in a highly interconnected and dynamic world. We of course agree with the above picture. As we have repeatedly pointed out in previous articles, under such circumstances:
- it is impossible to make predictions - in fact, even the current economic crisis (of planetary proportions) was not forecast
- only very rough estimates can be attempted
- there is no such thing as precision
- it is impossible to isolate "cause-effect" statements, as everything is linked
- optimization is unjustified - one should seek acceptable solutions, not pursue perfection
The well-known Principle of Incompatibility states in fact that "high precision is incompatible with high complexity". However, this fundamental principle, which applies to all facets of human existence, as well as to Nature, goes unnoticed. Neglecting the Principle of Incompatibility constitutes a tacit and embarrassing admission of ignorance. One such example is that of ratings. While the concept of rating lies at the very heart of our economy and, from a point of view of principle, is a necessary concept and tool, something is terribly wrong. A rating, as we know, measures the Probability of Default (PoD). Ratings are stratified according to classes. One example of such classes is shown below:
Class
PoD
1
=