For a long time, scientists and science-fiction writers alike have pursued the question of whether the future can be accurately predicted from the past, given sufficiently large groups, enough historical information and enough computational power.
For example, in the Foundation series, one of science fiction's all-time classics, Isaac Asimov introduced the fictional scientific concept of psychohistory. The essential idea of psychohistory, says the Wikipedia entry, is that “while one cannot foresee the actions of a particular individual, the laws of statistics as applied to large groups of people could predict the general flow of future events. Asimov used the analogy of a gas: an observer has great difficulty in predicting the motion of a single molecule in a gas, but can predict the mass action of the gas to a high level of accuracy.”
Psychohistory relied on extremely advanced models that enabled a purely mathematical approach to predicting future events. However, it could not cope with totally unpredictable events. Thus, when a mutant known as the Mule was born, his totally unanticipated psychic powers invalidated the assumptions underlying the models, and history then proceeded on a very different course than the one the psychohistory computers had predicted.
Asimov and his wonderful Foundation story have been on my mind in the last few years as I have been thinking about complex organizational systems, where people and the services they perform for each other are the primary components. In particular, I have wondered about the limits of our ability to accurately model and predict the behavior of such people-based systems, especially as I continue to reflect on the causes of the global financial crisis – our financial systems being an example of such complex, people-based organizational systems. Will the mutations, highly irrational behaviors and other unpredictable events inherent in such systems always befuddle mathematical models and computers, no matter how sophisticated and powerful we make them?
I recently came across a very good article on the subject: Risk Mismanagement by business columnist Joe Nocera in the January 2 issue of the New York Times Magazine. In it, Nocera explores whether the financial meltdown was primarily caused by errors in the mathematical models used to evaluate trades and risk, or by bankers and regulators who misread or ignored the models. Is the crisis mostly a failure of risk management or a failure of management?
The article prominently features Nassim Taleb, author of The Black Swan: The Impact of the Highly Improbable. Taleb uses the highly rare black swan as a metaphor for high-impact, hard-to-predict events beyond the realm of normal expectations. Black swans are events that can totally change the course of history, rather like the unexpected appearance of a mutant with psychic powers in Asimov’s Foundation, except that they occur far, far more often. As examples of such black swan events, he cites the rise of the Internet, the personal computer, World War I, the 9/11 attacks and our ongoing financial meltdown.
Explaining Taleb’s views, Nocera writes that “the greatest risks are never the ones you can see and measure, but the ones you can’t see and therefore can never measure. The ones that seem so far outside the boundary of normal probability that you can’t imagine they could happen in your lifetime – even though, of course, they do happen, more often than you care to realize. Devastating hurricanes happen. Earthquakes happen. And once in a great while, huge financial catastrophes happen. Catastrophes that risk models somehow always manage to miss.”
Models based on analyzing historical data are very good at measuring the risk in a portfolio under normal market conditions – the kinds of markets that account for 99 percent of events and follow the familiar bell curve of a Gaussian or normal distribution. But every so often, say one percent of the time, improbable events happen that fall far outside a normal distribution. Such market events are essentially unpredictable: the future could not have been predicted from past behavior, because the improbable event is something that has rarely, if ever, happened before.
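To get a feel for how big the gap between those two regimes can be, here is a minimal sketch in Python (my own illustration, not anything from Nocera's article) comparing the chance of an extreme five-standard-deviation daily move under a Gaussian model with the chance under a fat-tailed Student-t distribution standing in for real markets; all the numbers are illustrative assumptions:

    # Compare the probability of a "5-sigma" daily move under a Gaussian
    # model versus a fat-tailed Student-t model (3 degrees of freedom).
    # Illustrative only: neither distribution is what any real bank used.
    from scipy import stats

    threshold = 5.0  # a move five standard deviations above the mean

    # Gaussian tail: probability of exceeding the threshold on a given day
    p_gauss = stats.norm.sf(threshold)

    # Fat-tailed alternative, rescaled to unit variance so the comparison
    # is like-for-like (the variance of t(df) is df / (df - 2))
    df = 3
    scale = ((df - 2) / df) ** 0.5
    p_fat = stats.t.sf(threshold, df, scale=scale)

    print(f"Gaussian:  {p_gauss:.2e} per day, ~one in {1 / (252 * p_gauss):,.0f} trading years")
    print(f"Student-t: {p_fat:.2e} per day, ~one in {1 / (252 * p_fat):,.1f} trading years")

The point of the toy comparison is not the particular numbers but the order of magnitude: the Gaussian model treats such a move as a once-in-many-millennia event, while the fat-tailed model expects one every few years.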
How do you deal with such an improbable event? That’s where human judgment comes in, especially the judgment of experienced managers who have been around for a while and can intuitively sense when something is not quite right. The best thing to do at such times is to get together with other experts whose opinions you trust, and see if they share your misgivings. It is good to assemble people with a diverse set of skills and opinions. Where one person sees trouble brewing, another may see a passing disturbance that will work itself out. If needed, you can always bring in outside experts and add their views to the mix.
Like meteorologists tracking a disturbance in the North Atlantic during hurricane season, you want to leverage advanced technologies and real-time information to help figure out what's going on. You should run a variety of models to see how the disturbance might evolve under different scenarios, and keep re-running them as new information comes in. It typically takes a while to figure out whether a disturbance is a false positive – just part of the expected ups and downs of a normal market – or a sign that something much more serious is taking place that calls for extraordinary action.
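In code, that re-run-the-models loop might look like the toy Monte Carlo sketch below (entirely my own illustration: the scenario names, parameters and the 10 percent loss threshold are invented, not taken from any real risk system):

    # Toy scenario analysis: re-run a simple Monte Carlo portfolio model
    # under several hypothetical scenarios for how a disturbance evolves.
    # All parameters are invented for illustration.
    import random

    def prob_big_loss(mu, sigma, days=10, trials=10_000):
        """Estimate the chance of losing more than 10% over `days` days,
        assuming i.i.d. Gaussian daily returns with mean mu and volatility sigma."""
        breaches = 0
        for _ in range(trials):
            value = 1.0
            for _ in range(days):
                value *= 1.0 + random.gauss(mu, sigma)
            if value < 0.90:
                breaches += 1
        return breaches / trials

    # Three hypothetical ways the disturbance might evolve; as new data
    # arrives, you would revise these parameters and run the loop again.
    scenarios = {
        "calm":     (0.0003, 0.01),  # the disturbance works itself out
        "stressed": (-0.001, 0.03),  # the disturbance worsens
        "crisis":   (-0.005, 0.08),  # something much more serious
    }

    for name, (mu, sigma) in scenarios.items():
        print(f"{name:>8}: P(>10% loss over 10 days) ~ {prob_big_loss(mu, sigma):.1%}")

What matters is how sharply the loss probability jumps between scenarios; deciding which scenario the incoming data actually supports is exactly the judgment call described above.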
Mathematical models do a very good job with normal events, that is, those that fall within the 99 percent of probability mass under the Gaussian curve. But those models are pretty useless when it comes to predicting what happens with the other one percent of events at the extreme edge of the curve. The models cannot tell you whether that one percent is something fairly mild that happens a few times a year, or whether a true black swan is lurking that will cause losses in the billions and threaten to bring down companies, nations and perhaps the overall global financial system.
The issue is not whether the best decisions are based on quantification and numbers, or whether, given an uncertain future, they should be based on the intuitive beliefs and experience of humans. You need both; it all depends. The real danger comes if we rely too much on the mathematical models that work well most of the time, and put too much trust in technically brilliant quants who lack the judgment that comes from experience, especially from having personally seen what a black swan looks like and what it can do to you and your firm.
“Because we don’t know what a black swan might look like or when it might appear and therefore don’t plan for it, it will always get us in the end,” writes Nocera. “Any system susceptible to a black swan will eventually blow up,” Taleb says. The modern system of world finance – complex, interrelated and opaque, where what happened yesterday can and does affect what happens tomorrow, and where one wrong tug of the thread can cause it all to unravel – is just such a system.
As our world becomes increasingly integrated, fast-changing and unpredictable, we should expect large improbable disturbances – black swans – to occur more frequently, not only in finance but across business, government and society in general. Mathematical models, information analysis and fast computers will continue to be extremely valuable tools, critical to the smooth functioning of our complex systems. But when the going gets really rough, no machine or model can ever make up for the wisdom that only comes from human judgment and experience.
