I recently participated in MIT’s 2010 Systems Thinking Conference for Contemporary Challenges. This annual conference is sponsored by Systems Design and Management (SDM) - an interdisciplinary program between MIT’s School of Engineering and Sloan School of Management. The SDM program aims to provide mid-career professionals with a systems perspective that will help them address and solve large-scale, global, complex challenges. Most of the students in the classes I have taught at MIT are enrolled in SDM.
The Systems Thinking Conference is intended to showcase innovative ways of addressing complex challenges, in particular, how to integrate technology, management and social sciences to successfully solve very tough problems. The agenda included sessions on sustainability and urban systems, health care and services. The talks covered a wide range of subjects, from the global development of Boeing’s 787 to the home of the future. I was co-organizer of the session on service systems, gave a talk on Technology and Innovation in the Service Economy and moderated a panel with the other speakers in the session.
John Sterman - professor in MIT’s Sloan School of Management and director of the System Dynamics Group - gave a very interesting talk: A Banquet of Consequences: Systems Thinking and Modeling for Climate Policy. The title of his talk is based on a quote by Scottish writer Robert Louis Stevenson: Everybody, soon or late, sits down to a banquet of consequences.
The study of complex systems is truly fascinating, especially those systems whose components and interrelationships are themselves quite complex, as is the case with evolutionary biology and with organizational systems, whose main components or actors are people. In such systems, the dynamic nature of the components, as well as their intricate interrelationships, renders them increasingly unpredictable and accounts for their emergent behavior.
Sterman said that when he meets with business or government leaders they generally tell him that in spite of all the tools, methods, information and models we now have at our disposal, it seems that things around us are getting more difficult to understand and manage. Our own actions and decisions are not helping and perhaps they are making things worse.
This is consistent with the findings of IBM’s 2010 Global CEO Study, where complexity emerged as the overriding challenge facing the 1500 CEOs and senior public leaders interviewed. They said that they are now operating in a world that is substantially more volatile, uncertain and complex, and where incremental changes are no longer sufficient because the world is behaving in fundamentally different ways.
Why do people - even highly educated, experienced and accomplished leaders in business and government - find dealing with dynamic, complex systems so challenging? Why is it so difficult to make sound decisions about highly complex problems? Why are we so often surprised by the unanticipated, negative (sometimes disastrously so) consequences of our actions and decisions involving such systems?
Professor Sterman has been addressing these questions in his research and in his extensive publications. As he wrote in Learning from Evidence in a Complex World:
“Complexity hinders our ability to discover the delayed and distal impacts of interventions, generating unintended “side effects.” Yet learning often fails even when strong evidence is available: common mental models lead to erroneous but self-confirming inferences, allowing harmful beliefs and behaviors to persist and undermining implementation of beneficial policies.”
He believes that unanticipated events and side effects are not features of reality in complex systems, but a result of overly simplistic, incomplete models:
“We have been trained to view our situation as the result of forces outside ourselves, forces largely unpredictable and uncontrollable. Consider the “unanticipated events” and “side effects” so often invoked to explain policy failure. Political leaders blame recession on corporate fraud or terrorism. Managers blame bankruptcy on events outside their organizations and (they want us to believe) outside their control. But there are no side effects - just effects. Those we expected or that prove beneficial we call the main effects and claim credit. Those that undercut our policies and cause harm we claim to be side effects, hoping to excuse the failure of our intervention. “Side effects” are not a feature of reality, but a sign that the boundaries of our mental models are too narrow, our time horizons too short.”
We do best when dealing with simple, sequential mental models. We identify the problem, gather data, evaluate alternatives, select a solution and proceed to implement. These models work quite well in situations with short time horizons and narrow boundaries, such as trying to decide how to safely cross a busy street or (a long time ago) whether to confront or avoid a potential predator. Most of the decisions and actions in our everyday life are based on such simple models. The ability to quickly learn them is likely wired into our brains.
But, these simple models are of limited value when dealing with dynamic, complex systems, such as you find in areas from health care to financial markets to climate change. We make the wrong decisions and get in trouble when there is a large gap between the complexity of the real problems we are trying to address and our simple mental picture of the problem. “. . . Where the world is dynamic, evolving, and interconnected, we tend to make decisions using mental models that are static, narrow, and reductionist. Among the elements of dynamic complexity people find most problematic are feedback, time delays, and stocks and flows.”
With a simple, sequential model, once we make a decision and take an action, we are done. But, that does not work with complex, interconnected systems, where our actions have consequences and feed back on themselves. The world reacts to and is changed by our interventions.
“Our decisions alter the state of the world, causing changes in nature and triggering others to act, thus giving rise to a new situation, which then influences our next decisions. Like organisms, social systems contain intricate networks of feedback processes, both self-reinforcing (positive) and self-correcting (negative) loops. However, studies show that people recognize few feedbacks; rather, people usually think in short, causal chains, tend to assume each effect has a single cause, and often cease their search for explanations when the first sufficient cause is found.”
Feedback loops are easier to deal with when our actions or decisions cause a quick reaction. We can then take corrective actions to help us attain the desired state. Riding a bicycle is an obvious example. But dealing with the impact of our actions is much harder when there are time delays involved. “Most obviously, delays slow the accumulation of evidence. More problematic, the short- and long-run impacts of our policies are often different (smoking gives immediate pleasure, while lung cancer develops over decades).”
“Delays also create instability and fluctuations that confound our ability to learn. Driving a car, drinking alcohol, and building a new semiconductor plant all involve time delays between the initiation of a control action (accelerating/braking, deciding to “have another,” the decision to build) and its effects on the state of the system. As a result, decision makers often continue to intervene to correct apparent discrepancies between the desired and actual state of the system even after sufficient corrective actions have been taken to restore equilibrium. The result is overshoot and oscillation: stop-and-go traffic, drunkenness, and high-tech boom and bust cycles.”
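The overshoot-and-oscillation pattern Sterman describes is easy to reproduce in a toy simulation. The sketch below is purely illustrative - the target, gain and delay are made-up parameters, not drawn from his models. A controller pushes a quantity toward a target, but each correction is based on a delayed observation of the system's state:

```python
def simulate(target=100.0, gain=0.2, delay=5, steps=60):
    """Control a stock toward a target, but observe its state with a time delay.

    Each step, the correction is proportional to the gap between the target
    and the state as it appeared 'delay' steps ago - not as it is now.
    """
    x = [0.0]  # trajectory of the system's state
    for t in range(steps):
        # We can only see the state as it was 'delay' steps in the past.
        observed = x[t - delay] if t >= delay else x[0]
        correction = gain * (target - observed)
        x.append(x[-1] + correction)
    return x

trajectory = simulate()
```

Because corrections keep arriving during the delay, the state shoots well past the target, then swings back below it before settling - the same stop-and-go dynamic as the traffic and boom-and-bust examples above.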
Sterman’s seminar at the MIT Systems Thinking Conference focused on climate change policy and the widespread misunderstanding of the stocks and flows which underlie climate science.
Stocks are accumulations - like water filling a bathtub. The difference between the inflows and outflows of a stock governs its level at any point in time as well as its rate of accumulation. If water is pouring into (inflow) the bathtub faster than it’s draining (outflow), the tub will fill up over time and eventually spill over. If water is pouring in slower than it’s draining, the tub will eventually empty out. This is intuitively obvious to most people.
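The bathtub rule can be written down in a few lines. This is just an illustrative sketch of accumulation - at every step, the stock's level changes by inflow minus outflow:

```python
def bathtub(level, inflow, outflow, minutes, dt=1.0):
    """Track a stock over time: the level changes by (inflow - outflow) * dt each step."""
    history = [level]
    for _ in range(int(minutes / dt)):
        # A tub can't hold less than nothing, so clamp the level at zero.
        level = max(0.0, level + (inflow - outflow) * dt)
        history.append(level)
    return history

filling = bathtub(level=0, inflow=2.0, outflow=1.0, minutes=10)   # ends at 10.0
draining = bathtub(level=10, inflow=1.0, outflow=2.0, minutes=10)  # ends at 0.0
```

When the inflow exceeds the outflow the level rises; when the outflow exceeds the inflow it falls; when the two are equal the level simply holds steady.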
You can think of the atmosphere as a giant carbon bathtub. “It’s simple, really: As long as we pour CO2 into the atmosphere faster than nature drains it out, the planet warms. And that extra carbon takes a long time to drain out of the tub.” But in his research Sterman has found that a fundamental human cognitive limitation is impeding action on global warming. He has documented this problem in human reasoning by testing graduate students at his MIT classes. He found that “his students, though very bright and schooled in calculus, lack an intuitive grasp of a simple, crucial system: a bathtub.”
Research shows widespread misunderstanding of stocks and flows and of the process of accumulation. People often fail to grasp that any stock rises (falls) when the inflow exceeds (is less than) the outflow. “Most people assume that system inputs and outputs are correlated (e.g., the higher the federal budget deficit, the greater the national debt will be). However, stocks integrate (accumulate) their net inflows. A stock rises even as its net inflow falls, as long as the net inflow is positive: the national debt rises even as the deficit falls - debt falls only when the government runs a surplus; the number of people living with HIV continues to rise even as incidence falls - prevalence falls only when infection falls below mortality.”
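A tiny numerical example makes the debt-and-deficit point concrete. The figures below are invented purely for illustration:

```python
# Hypothetical figures: a deficit that shrinks every year but stays positive.
deficits = [400, 300, 200, 100, 50]  # annual net inflow to the debt stock

debt = 1000
history = []
for deficit in deficits:
    debt += deficit  # the stock integrates (accumulates) its net inflow
    history.append(debt)

# Debt each year: 1400, 1700, 1900, 2000, 2050 -
# rising every single year, even though the deficit is falling.
```

The deficit falls steadily, yet the debt never stops growing, because the net inflow remains positive throughout. Only a surplus - a negative net inflow - would make the stock decline.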
“Poor understanding of accumulation has significant consequences for public health and economic welfare. Surveys show most Americans believe climate change poses serious risks, but they also believe that reductions in GHG [greenhouse gases] emissions sufficient to stabilize atmospheric GHG concentrations can be deferred until there is greater evidence that climate change is harmful. Federal policy makers likewise argue that it is prudent to wait and see whether climate change will cause substantial economic harm before undertaking policies to reduce emissions. Such wait-and-see policies erroneously presume that climate change can be reversed quickly should harm become evident, underestimating immense delays in the climate’s response to GHG emissions. Emissions are now about twice the rate at which natural processes remove GHGs from the atmosphere. GHG concentrations will therefore continue to rise even if emissions fall, stabilizing only when emissions equal removal.”
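The same integration logic explains why wait-and-see fails. In the sketch below - where the units and numbers are invented for illustration, not actual climate data - emissions start at twice the removal rate and then decline every year, yet the atmospheric stock keeps rising until emissions finally equal removal:

```python
concentration = 850.0  # the stock, in made-up units
removal = 5.0          # nature drains GHGs at a roughly constant rate
emissions = 10.0       # start at twice the removal rate, as in the quote

trajectory = []
for year in range(20):
    concentration += emissions - removal       # the stock integrates net inflow
    trajectory.append(concentration)
    emissions = max(removal, emissions - 0.5)  # emissions fall every year

# The concentration rises every year that emissions exceed removal,
# and only stabilizes - it never falls - once emissions equal removal.
```

Falling emissions feel like progress, but as long as the tap runs faster than the drain, the tub keeps filling. That is the accumulation insight Sterman's bathtub experiments show most people miss.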
The MIT Systems Thinking Conference gave us an excellent overview of the many complex challenges we face in business, government and society. It also showed how a systems-based approach can be used to address the urgent and highly complex problems facing today’s world. We all have a lot to learn, but I came away with the feeling that given the proper collaboration between academia, industry and government, this is an area that is ripe for innovation and where we will make rapid progress in the future.