The November issue of the MIT Technology Review featured a provocative article, Why We Can’t Solve Big Problems, written by its editor in chief Jason Pontin. The article focuses on a feeling, commonplace in Silicon Valley, that since the Apollo program landed a man on the moon on July 20, 1969, something may have happened to humanity’s capacity to solve big problems. The article reminded me how much our society has changed in the last twenty years, in particular, the different way we solved big problems at the zenith of our 20th century industrial economy versus the way we now do so in the early stages of the 21st century information economy.
The Apollo program was launched at the height of the Cold War. In April of 1961, Soviet cosmonaut Yuri Gagarin became the first person to fly in space, reinforcing fears that the US was falling behind in its technical and military competition with the Soviet Union. A month later, in a speech before a joint session of Congress, President John F. Kennedy issued a challenge to the nation:
“I believe that this nation should commit itself to achieving the goal, before this decade is out, of landing a man on the Moon and returning him safely to the Earth. No single space project in this period will be more impressive to mankind, or more important in the long-range exploration of space; and none will be so difficult or expensive to accomplish.”
“This required the greatest peacetime mobilization in the nation’s history,” writes Pontin. “Although NASA was and remains a civilian agency, the Apollo program was possible only because it was a lavishly funded, semi-militarized project . . . In all, NASA spent $24 billion, or about $180 billion in today's dollars, on Apollo; at its peak in the mid-1960s, the agency enjoyed more than 4 percent of the federal budget. The program employed around 400,000 people and demanded the collaboration of about 20,000 companies, universities, and government agencies.”
President Kennedy further explained his challenge to the nation in a 1962 speech at Rice University:
“But why, some say, the moon? Why choose this as our goal? . . . Why climb the highest mountain? Why, 35 years ago, fly the Atlantic? . . . We choose to go to the moon in this decade and do the other things, not because they are easy, but because they are hard; because that goal will serve to organize and measure the best of our energies and skills . . .”
No human has been back to the moon since Apollo 17 in 1972. And, there are no plans to return to the moon any time soon, let alone to send people to Mars. According to Pontin, our loss of appetite for space travel may be an indicator of something deeper:
“Blithe optimism about technology’s powers has evaporated, too, as big problems that people had imagined technology would solve, such as hunger, poverty, malaria, climate change, cancer, and the diseases of old age, have come to seem intractably hard.”
“What happened?” he asks. “That something happened to humanity’s capacity to solve big problems is a commonplace. Recently, however, the complaint has developed a new stridency among Silicon Valley's investors and entrepreneurs, although it is usually expressed a little differently: people say there is a paucity of real innovations. Instead, they worry, technologists have diverted us and enriched themselves with trivial toys.”
“We wanted flying cars - instead we got 140 characters,” is how PayPal cofounder Peter Thiel has succinctly described his belief that we are no longer solving big problems. The Internet is “a net plus - but not a big one,” Thiel told the New Yorker last year.
The Apollo program represents the kind of highly complex and expensive engineering projects that the US undertook during WWII and the Cold War. It is hard to imagine doing so again absent similar existential threats to the nation. We have not seen such large government programs since the Strategic Defense Initiative, aka the Star Wars program, toward the end of the Cold War in the 1980s, although one could argue that the rebuilding of New Orleans’ levee system following Hurricane Katrina is such a large government program.
But something else, at least as complex and challenging in my opinion, has been taking place since then.
There is little question that our economy, and just about all institutions of society, are going through major structural changes, mostly driven by the digital technology revolution that’s all around us. In particular, since the explosive growth of the Internet and World Wide Web in the mid-1990s, we have been transitioning from the industrial society of the past couple of centuries to a new kind of information society and knowledge-based economy.
No matter how you look at it, such a transition is a very big deal. There have only been a few such major economic shifts in human civilization. Early humans were hunter-gatherers and lived in nomadic groups. Then, roughly 10,000 years ago, agriculture began to develop, including the cultivation of many kinds of crops and the domestication of animals as livestock. Agriculture was the key to the rise of sedentary human communities, and led to agrarian societies. More and more people settled in communities, which over time grew into villages, towns, cities and nations.
The transition to an industrial society started in the 18th century, largely driven by advances in science and technology. These advances in turn led to mass production and the division of labor; large, increasingly urban populations; and significantly better health, longer life spans, and higher standards of living in most countries in the world.
Then, following the Cold War and the advent of the Internet, we started our transition to an information society.
Are we solving big problems and coming up with major innovations as part of such a transition? I believe we are, although the nature of the problems and innovations is very different from that represented by the Apollo program and similar top-down, 20th century government programs. While it may feel like I’m comparing apples and oranges, the comparison helps to illustrate the different ways we framed and solved big complex problems in the industrial economy versus the way we are now doing so in our emerging information economy.
The most obvious difference is that the Apollo program and similar big programs - e.g., radar, atomic energy, satellites, computers, the Internet - were directly associated with WWII and the Cold War, and were mostly funded by the Department of Defense, the Department of Energy, NASA, and other Federal agencies. These were hierarchically organized, massive engineering projects, which can be viewed as the culmination of the kinds of big-problem projects of the industrial economy.
The organizational dynamics underlying the transition to an information economy are very different from those of the Apollo program. The projects are much broader in scope, involving large segments of the US economy as well as those around the world, more like a series of loosely related initiatives building on each other over decades. And, they are far more collaborative, involving large and small companies as well as many startups; universities and research labs all over the world; and local and state governments, in addition to national governments.
The Internet is the overriding catalyst for this historical transition to an information economy. And there is little question that without the many contributions of DARPA and other Federal government agencies, the Internet would likely not have happened. But once its commercialization started in the mid-1990s, market forces kicked in with a vengeance, providing most of the investments and innovation powering this transition.
Moreover, the very nature of innovation has changed. For most of the 20th century, new technologies were first deployed in large institutions - enterprises, universities and research labs, the military, and civilian government agencies - from which they eventually trickled down to the rest of the economy as their prices and complexity dropped. Forty years ago, for example, the computer industry consisted primarily of expensive mainframes and supercomputers that only larger institutions could afford to buy and operate. Minicomputers and personal computers then followed over the next couple of decades, making computers increasingly affordable.
But this trickle-down approach to technology diffusion dramatically reversed itself with the advent of inexpensive technologies, especially digital technologies. As a result of the incredible advances in digital components and personal computing over the past thirty years, many new innovations first emerge in consumer markets and among communities of experimental users, from which they eventually trickle up into the worlds of business and government. Besides personal computing, we have seen such consumer- and user-driven innovations in smartphones and other mobile devices, game consoles, Linux, social media, cloud computing and so on.
I believe that the kind of extensive collaboration among the private sector, academia and government represented by the Internet revolution will be the way we generally tackle big problems in the 21st century. Just as with the Internet, governments have a major role to play as the catalyst for many of the big projects that the private sector will then take forward and exploit. The need for high-bandwidth, robust national broadband infrastructures is but one such example.
The transition to an information society is extremely challenging at many levels, including redefining the proper balance between markets and government, that is, between what markets can accomplish on their own and where government initiatives are needed. I remain optimistic that we will work things out over time, and that we will continue to see the kind of big-problem collaboration that is now taking us into a 21st century information society.