As the US recovers from the financial crisis, economic policies that encourage capital investment and job creation must be our top near-term priorities. But to achieve long-term economic growth, as well as gains in our standard of living, the US must also focus on innovation and productivity growth.
US labor productivity - “the amount of goods and services that a laborer produces in a given amount of time” - grew at an average annual rate of only 1.5% between 1973 and 1995. This period of slow productivity growth coincided with the rapid growth in the use of computers in business, giving rise to the Solow productivity paradox, in reference to Robert Solow's 1987 quip: “You can see the computer age everywhere but in the productivity statistics.”
But starting in the mid-1990s, US labor productivity growth surged to over 2.5% per year. In an analysis of US labor productivity over the preceding fifty years, Harvard economist Dale Jorgenson and his collaborators concluded that “the pessimism of the computer productivity paradox gave way to near-universal belief in a productivity resurgence led by information technology in the late 1990s.”
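As a back-of-the-envelope illustration (my own, not from the article), the difference between roughly 1.5% and 2.5% annual productivity growth compounds into a large gap in output per worker over a span of decades. The rates and the 22-year horizon (matching the 1973-1995 period) are illustrative:

```python
# Rough sketch: the compound effect of annual productivity growth rates.
# The rates and the 22-year span are illustrative, not official statistics.

def cumulative_growth(annual_rate, years):
    """Output per worker relative to year zero, compounded annually."""
    return (1 + annual_rate) ** years

slow = cumulative_growth(0.015, 22)  # ~1.5% per year
fast = cumulative_growth(0.025, 22)  # ~2.5% per year

print(f"At 1.5%/yr, output per worker rises ~{slow - 1:.0%} over 22 years")
print(f"At 2.5%/yr, output per worker rises ~{fast - 1:.0%} over 22 years")
```

A single extra percentage point of annual growth nearly doubles the cumulative gain over two decades, which is why the mid-1990s resurgence matters so much for long-term living standards.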
This IT-based productivity resurgence has continued, except for a short period reflecting the decline in IT investment right after the bursting of the dot-com bubble and the maturing of the global economic expansion in 2005 and 2006. Even as economic growth has lagged in 2009 and 2010, productivity growth has remained robust.
This productivity growth was led by IT-producing companies during the dot-com bubble years of the 1990s, e.g., developers of semiconductors, hardware and software. But since 2001, the productivity growth has been led by IT-using industries, especially those that are intensive users of IT, like retail trade, scientific and technical services, and financial services. Given the rapidly declining prices of information technologies, these companies have been able to acquire and deploy significantly more IT across all their operations.
In a recently published book, Wired for Innovation: How Information Technology is Reshaping the Economy, Erik Brynjolfsson and Adam Saunders explore the links between the current wave of IT-based business innovation and the productivity resurgence in the U.S. economy. Erik Brynjolfsson is a professor in MIT’s Sloan School of Management and Director of the MIT Center for Digital Business. Adam Saunders is a lecturer at the Wharton School of the University of Pennsylvania, as well as a doctoral candidate at MIT’s Sloan School.
“Although some say that technology has matured and become commoditized in business, we see the technological “revolution” as just beginning. Our reading of the evidence suggests that the strategic value of technology to business is still increasing. For example, since the mid 1990s there has been a dramatic widening in the disparity in profits between the leading and lagging firms in industries that use technology intensively (as opposed to producing technology). Non-IT intensive industries have not seen a comparable widening of the performance gap - an indication that deployment of technology can be an important differentiator of firms’ strategies and their degree of success.”
But while information technology has been responsible for most of the resurgence in US productivity since 1995, it was not the sole reason for the increased growth. As Brynjolfsson and Saunders observe:
“The companies with the highest returns on their technology investments did more than just buy technology; they invested in organizational capital to become digital organizations. Productivity studies at both the firm level and the establishment (or plant) level during the period 1995-2008 reveal that the firms that saw high returns on their technology investments were the same firms that adopted certain productivity-enhancing business practices. The literature points to incentive systems, training and decentralized decision making as some of the practices most complementary to technology.”
What are the key elements of organizational capital? As part of their research, Erik and his colleagues at the MIT Center for Digital Business studied 1,167 large firms over ten years and analyzed the links between IT, productivity and organizational capital. Their analysis identified the key management practices of the companies that realized the most business value from their IT investments:
- move from analog to digital processes
- open information access
- empower the employees
- use performance-based incentives
- invest in corporate culture
- recruit the right people
- invest in human capital
I really like the concept of organizational capital as the critical ingredient needed to enable a company to take full advantage of disruptive advances in technology. It provides a simple explanation for the productivity paradox, that is, why it took several decades for the technological advances of the IT industry to be translated into measurable business productivity. First, IT had to advance sufficiently to become truly ubiquitous and standardized, which only happened with the advent of the Internet and World Wide Web in the mid-to-late 1990s. But, as Brynjolfsson and Saunders point out, technology advances alone were not enough.
In the first few decades of business computing, companies used IT primarily to automate existing processes, especially the more highly repetitive and standardized tasks that were amenable to computer applications. It wasn’t until the 1990s, with the pioneering work of Michael Hammer and others on business process reengineering, that people started to realize that using technology to automate existing processes wasn’t enough. Rather, it was time for organizations to fundamentally rethink their operations, redesign the flow of work in their companies and eliminate processes that did not add value to the fundamental objectives of the business.
It has taken us quite a while to learn how to apply technology to reengineer and redesign organizations and their processes, so that they are not only more efficient and productive, but also more flexible and able to adapt to the continuing changes in technologies and markets. We are in the early stages of significantly transforming the very nature of companies to better take advantage of the new possibilities offered by IT. As Wired for Innovation observes, the digital technological revolution is just beginning, as companies are just learning how to become effective digital organizations.
Such a decades-long time lag between the advent of a disruptive technology and the realization of its economic promise is not exclusive to IT. In Technological Revolutions and Financial Capital: The Dynamics of Bubbles and Golden Ages, Carlota Perez, who is affiliated with the universities of Cambridge and Sussex in the UK, wrote that over the past 200 years we have had a major technology revolution every 40 to 60 years, the present IT revolution being the fifth in that span.
Each such technology revolution is characterized by two different periods, each lasting 20 to 30 years. The installation period is the time when new technologies emerge from the lab into the marketplace, entrepreneurs start many new businesses based on these new technologies, and venture capitalists encourage experimentation with new business models and speculation in new money-making schemes. Inevitably, this all leads to the kind of financial bubble and crash we are all quite familiar with from our recent experiences.
After the crash comes the deployment period, which Perez views as a time of institutional recomposition. The now well-accepted technologies and economic paradigms become the norm; infrastructures and industries start getting better defined and more stable; and production capital drives long-term growth and expansion by spreading and multiplying the successful business models. Moreover, we now have the right combination for productivity growth: a ubiquitous, standardized, commoditized technology, and the organizational capital to deploy the technology throughout business and society.
A concrete example of these ideas can be found in The Cloudy Future of Corporate IT, a blog post by Andrew McAfee, a research associate at MIT’s Center for Digital Business. His post looks at US manufacturing as it switched from steam to electric power, a process that took 50 years to complete. “Anyone hoping to understand the digitization of business will learn a lot of lessons from the electrification of American manufacturing,” he writes.
Each major technology revolution, from steam power to electricity, has resulted in major productivity increases, as it rejuvenated and transformed the whole economy and ultimately re-shaped social behavior and just about all institutions of society. The lessons from history are that in order to achieve economic productivity and societal benefits, you need the advances in technology as well as the organizational capital to put the technology to work. As information technologies are now entering their own deployment phase, we can look forward to similar economic productivity and societal benefits. The evidence shows that despite the tough economic times we have been going through, this is beginning to happen, and we can expect it to continue for many years to come.
Interesting article. But what form does "organizational capital" take? How do we know it when we see it? It seems to me that most organizations have the same ways of working that they have had for years if not decades.
Some of the increases in productivity could have come from increased use of outsourcing, cheaper labor in overseas markets, etc.
It seems to me that we will see the true effects of IT in the form of large amounts of deflation as these scalable technologies demonstrate their power. But there are many factors that may have served to cover up those signs of deflation -- at least so far.
Posted by: Tom Foremski | July 21, 2010 at 08:37 PM