Classical mechanics works exceptionally well for describing the behavior of objects that are more or less observable to the naked eye. It accurately predicts the motion of planets as well as the flight of a baseball. It formed the scientific basis for the technology and engineering underlying the Industrial Revolution.
The elegant mathematical models used in classical mechanics depict a world in which objects exhibit deterministic behaviors. The same objects, subject to the same forces, will always yield the same results. These models make perfect predictions within the accuracy of their human-scale measurements.
But, this stable world that could be perfectly described given enough information and scientific knowledge began to fall apart in the early 20th century. Classical mechanics could not explain the counter-intuitive and seemingly absurd behavior of energy and matter at atomic and subatomic scales. Neither could it explain the behavior of bodies traveling near the speed of light or the vast scales of the universe.
To do so, physics had to wait for quantum mechanics and relativity. Compared to the pastoral worlds that classical physics had been traveling through, the new theories that quantum mechanics, special relativity and general relativity were now spinning often had the feel of some sort of 20th century magical mystery tour.
At the atomic and subatomic scales that quantum mechanics deals with, the notion of a predictable, deterministic world went out the window. You cannot specify the path of an electron, because the very notion of a path is a classical concept that makes no sense in quantum mechanics. The behavior of particles at this scale is explained by their wave functions, which mathematically describe the complete behavior of the particle, but can only tell you the probability that the particle will be at a specific position at a given time.
The laws of quantum mechanics, concisely embodied in Schrödinger’s equation, let you map out the behavior of electrons and other particles, but only as probability distributions over all the possible states they could occupy. In fact, the Heisenberg uncertainty principle tells you that it is impossible to simultaneously determine the exact position and velocity of an electron, no matter how good your measurement tools are. The more precisely you know its position, the less precisely you will know its velocity and where it is heading next. The more precisely you know its velocity, the less you will know about where it is at any particular point in time.
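The trade-off the uncertainty principle imposes can be made concrete with a back-of-the-envelope calculation. The sketch below computes the minimum velocity uncertainty an electron must have, Δv ≥ ħ/(2·m·Δx), once it is confined to a region of size Δx; the choice of Δx = 1 ångström (roughly one atomic diameter) is an illustrative assumption, not a figure from the text.

```python
HBAR = 1.054571817e-34      # reduced Planck constant, J*s
M_ELECTRON = 9.1093837e-31  # electron mass, kg

def min_velocity_uncertainty(delta_x_m: float) -> float:
    """Minimum velocity spread implied by Heisenberg's principle:
    delta_x * delta_p >= hbar / 2, with momentum p = m * v."""
    return HBAR / (2 * M_ELECTRON * delta_x_m)

# Confine an electron to roughly one atomic diameter (~1 angstrom).
dv = min_velocity_uncertainty(1e-10)
print(f"minimum velocity uncertainty: {dv:.2e} m/s")
```

The result, on the order of hundreds of kilometers per second, is why speaking of a definite trajectory for an electron inside an atom makes no sense.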
Moreover, while in classical mechanics something has either the properties of a particle (e.g., a planet, a baseball) or of a wave (e.g., light, sound), in quantum mechanics all objects exhibit both kinds of properties. The concept of wave-particle duality explains that it all depends on what question you are asking and what experiment you perform to answer the question. Sometimes a wave model will give you the better answer, sometimes a particle model. There is no such thing as absolute reality. It all depends on the mechanisms used to observe and measure that reality.
But, the worlds of the very small and the very large are not the only ones that exhibit counter-intuitive, somewhat bizarre behavior. So does the world of highly complex systems, especially those whose components and interrelationships are themselves quite complex, as is the case with systems and evolutionary biology, as well as with organizational and socio-technical systems whose main components are people. In such systems, the dynamic nature of the components, as well as their intricate interrelationships, renders them increasingly unpredictable and accounts for their emergent behavior.
We have long known about complex systems and emergent behavior. The butterfly effect has been used for decades as a metaphor for the idea that small variations in a complex, dynamical system may produce large variations in its longer-term behavior. “Does the flap of a butterfly’s wings in Brazil set off a tornado in Texas?” was the title of a 1972 talk by meteorologist Edward Lorenz. Similar phrases are often used around the world to illustrate the difficulties of large scale weather prediction.
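The sensitivity Lorenz described can be demonstrated in a few lines of code. The sketch below integrates his well-known 1963 convection equations, a standard toy model with the usual parameters σ=10, ρ=28, β=8/3, from two starting points that differ by one part in a million, and shows that the trajectories nevertheless end up far apart; the step size and time horizon are illustrative choices.

```python
def lorenz_step(state, dt=0.001, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """One explicit-Euler step of the Lorenz convection equations."""
    x, y, z = state
    return (x + dt * sigma * (y - x),
            y + dt * (x * (rho - z) - y),
            z + dt * (x * y - beta * z))

def simulate(state, steps):
    for _ in range(steps):
        state = lorenz_step(state)
    return state

a = (1.0, 1.0, 1.0)
b = (1.0 + 1e-6, 1.0, 1.0)    # the "flap of a butterfly's wings"
a_end = simulate(a, 25_000)   # integrate for 25 time units
b_end = simulate(b, 25_000)
gap = sum((p - q) ** 2 for p, q in zip(a_end, b_end)) ** 0.5
print(f"initial gap: 1e-06, final gap: {gap:.3f}")
```

The initial difference of 0.000001 gets amplified by many orders of magnitude, which is exactly why long-range weather forecasts are so hard: any error in measuring today's conditions eventually swamps the prediction.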
The study of complex systems is now receiving a lot more attention for a variety of reasons. As a result of the Internet, we are now dealing with fast changing, globally integrated systems in a number of areas, from finance to supply chains, from the propagation of news items to the marketing of products. Before the Internet brought the world much closer together, changes traveled relatively slowly and often became attenuated and lost intensity before reaching other parts of the system. Not so today. In our increasingly integrated world, changes propagate around the world in milliseconds, and will often lead to non-linear, highly unpredictable behaviors. The Internet has become our ultimate butterfly effect amplifier.
The ongoing financial crisis is a case in point. As we know, this was a gathering storm that we totally missed. While there are many reasons why we failed to predict the crisis, one of the major ones is arguably that for the last several decades economic thinking was dominated by neoclassical economics, which some describe as an “idealized vision of an economy in which rational individuals interact in perfect markets.”
The financial crisis may have done for this idealized view of people and markets what the atomic spectrum and speed of light experiments did for classical mechanics at the end of the 19th century. They both acted as turn-of-the-century indicators that the world out there is far more complex than previously imagined, and that new theories and models are needed.
New terms, like long tails, Freakonomics and black swans, every bit as fanciful as quarks, charm and strangeness, have begun to enter our lexicon. Black swan theory, for example, refers to the consequences of high impact, hard to predict events beyond the realm of normal expectations, yet capable of significantly altering the course of history, such as our financial meltdown, the 9/11 attacks and the rise of the Internet. These events, while catastrophic when they do happen, seem so far outside the boundaries of normal probability that most models miss them because no one expects them to happen within their lifetimes. But, they do happen, and given our increasingly integrated, fast changing systems, we should expect such large, improbable disturbances to occur more frequently in the years ahead.
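One way to see why conventional models miss black swans is to compare tail probabilities directly. The sketch below contrasts the chance of a six-sigma deviation under a Gaussian (bell curve) model with the chance of an equally large event under a heavy-tailed power law; the Pareto exponent α=2 is an illustrative assumption, chosen only to show the size of the gap.

```python
import math

def gaussian_tail(k: float) -> float:
    """P(Z > k) for a standard normal (bell curve) variable."""
    return 0.5 * math.erfc(k / math.sqrt(2))

def pareto_tail(x: float, alpha: float = 2.0, x_min: float = 1.0) -> float:
    """P(X > x) for a Pareto (power-law, heavy-tailed) variable."""
    return (x_min / x) ** alpha

normal_p = gaussian_tail(6.0)  # a "six-sigma" event under the bell curve
heavy_p = pareto_tail(6.0)     # an equally large event under a power law
print(f"Gaussian: {normal_p:.1e}, power law: {heavy_p:.1e}")
```

Under the bell curve the extreme event is essentially impossible (a probability around one in a billion), while the heavy-tailed model assigns it a probability tens of millions of times higher, which is the difference between "never in a lifetime" and "every so often."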
How do you then deal with these highly unpredictable complex systems in our midst? The good news is that advances in information technologies can be a huge help if properly applied. First of all, you need to know what is going on, which requires access to vast quantities of information. Fortunately, we now have the ability to gather huge amounts of information about the real-time behavior of organizations, markets, and other highly complex systems, which we can then quickly analyze with powerful supercomputers. This should help us make better informed and much smarter decisions, including real time corrections to the system to try to prevent or ameliorate whatever problems the analysis might have uncovered.
How about modeling these complex systems, so we can better understand their behavior and try to anticipate future catastrophic conditions? Since these systems are inherently unpredictable, there is no one model that can accurately tell us about the future. But, reminiscent of quantum mechanics, we can map out the future states of the system, and compute the probabilities of the different states at different points in time. We do something similar today with hurricane prediction, where we are not able to tell precisely where the incoming hurricane will hit a few days from now, but we can calculate cones of probabilities reflecting the possible paths the hurricane might take.
Doing so requires massive supercomputers, because we have to simultaneously run thousands of copies of the same complex applications, using many different combinations of parameters, so we can explore the solution space of these otherwise unpredictable problems over time. This will let us map out what the overall solution space for these systems looks like, e.g., their cones of probabilities, and in particular, it will enable us to calculate the probabilities of extreme events.
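This ensemble idea can be sketched in miniature. The toy model below runs many copies of a simple stochastic growth process, each with slightly perturbed parameters, then reads off a "cone of probabilities" (a 5th–95th percentile band) at each time step, along with the estimated probability of an extreme outcome. The process, parameter ranges, and crash threshold are all illustrative assumptions, not any production simulation code.

```python
import random

def run_once(steps: int, rng: random.Random) -> list[float]:
    """One trajectory of a toy multiplicative growth process whose
    drift and volatility parameters are themselves uncertain."""
    drift = rng.uniform(-0.01, 0.03)  # uncertain parameter
    vol = rng.uniform(0.05, 0.15)     # uncertain parameter
    x, path = 1.0, [1.0]
    for _ in range(steps):
        x *= 1.0 + drift + rng.gauss(0.0, vol)
        path.append(x)
    return path

def quantile(values, q):
    s = sorted(values)
    return s[int(q * (len(s) - 1))]

STEPS, RUNS = 50, 2000
rng = random.Random(7)  # seeded for a reproducible demo
paths = [run_once(STEPS, rng) for _ in range(RUNS)]

# The "cone": 5th-95th percentile band of the ensemble at each step.
cone = [(quantile([p[t] for p in paths], 0.05),
         quantile([p[t] for p in paths], 0.95)) for t in range(STEPS + 1)]

# Probability of an extreme event: final value below half the start.
p_extreme = sum(p[-1] < 0.5 for p in paths) / RUNS
print(f"cone width at t=0: {cone[0][1] - cone[0][0]:.2f}, "
      f"at t={STEPS}: {cone[-1][1] - cone[-1][0]:.2f}, "
      f"P(crash) = {p_extreme:.3f}")
```

No single run predicts the future, but the ensemble as a whole yields exactly the hurricane-style output described above: a cone that widens with time, plus a quantified probability for the extreme event.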
This new style of predictive modeling is one of the key objectives of the Extreme Scale Computing initiative that is being launched by the US Department of Energy (DOE) to address major energy and environmental problems, from climate studies to the design of safe nuclear reactors. But, it is clear that many disciplines will benefit from such powerful predictive capabilities, from economics and medicine to city planning and business management.
I truly find this fascinating. Once more, our classical understanding of the world needs to be expanded to include a set of more fanciful, somewhat counter-intuitive, seemingly absurd explanations for events we cannot otherwise grasp. Our 21st century magical mystery tour promises to be every bit as exciting and productive as the one we embarked on over a century ago.