Last month I attended AI and the Future of Work, a conference hosted by MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL) and its Initiative on the Digital Economy (IDE). The two-day agenda included over 20 keynotes and panels on AI-related topics, and around 60 speakers and panelists from academia and business.
New technologies have been replacing workers and transforming economies over the past two centuries. But, over time, these same technologies led to the creation of whole new industries and new jobs. While the technologies of the industrial economy helped to make up for our physical limitations, the technologies of the digital economy are now enhancing our cognitive capabilities. They're being increasingly applied to activities that not long ago were viewed as the exclusive domain of humans. Will the AI revolution play out like past technology revolutions, with short-term disruptions followed by long-term benefits, or will this time be different?
Conference participants generally agreed that AI will have a major impact on jobs and the very nature of work. But, for the most part, they viewed AI as augmenting rather than replacing human capabilities, automating the more routine parts of a job and increasing workers' productivity and the quality of their work, so they can focus on those aspects of the job that most require human attention. Overall, a small percentage of jobs will be fully automated, while many more will be significantly transformed.
Conference participants also generally agreed that the more advanced AI-based transformations will not happen rapidly, but are likely decades away. Much progress has recently been made in the ability to extract features from all the data we now have access to, as well as in machine learning algorithms that give computers the ability to learn by ingesting large amounts of data instead of being explicitly programmed. While such statistical pattern recognition approaches can be applied to many tasks, they're no substitute for model formation, the main approach used by humans, from toddlers to physicists, to understand how the world works. We're a long way from the development of AIs that truly learn and reason like people.
Let me briefly discuss a few of the sessions at the MIT conference.
There was a very interesting panel on AI's Implications for Productivity, Wages and Employment, made up of some of the top economists doing research on the subject: MIT's Daron Acemoglu and Erik Brynjolfsson and Northwestern's Robert Gordon and Joel Mokyr. Their overall conclusion was that AI will both create and destroy jobs, as has been the case with previous technologies, but is unlikely to lead to a significant reduction of jobs.
Professor Gordon said that “no invention in the 250 years since the first industrial revolution has caused mass unemployment, and that though jobs are constantly being destroyed, they are also being created in even larger numbers,” wrote Michael Miller in one of his PCMag.com columns covering the event.
Gordon added that “there is enormous churn in the job market, and that at present there is actually a shortage of workers, not a shortage of jobs, which is true even in fields such as construction, skilled manufacturing, and long-distance truck driving… To sum up, he said that it’s very easy to predict the jobs that will be destroyed, but much more difficult to anticipate the new jobs that will be made possible. Looking ahead 20 years, Gordon said AI will displace some jobs, adding to labor market churn. But, in terms of its effect on jobs, AI is nothing new.”
His Northwestern colleague Joel Mokyr pretty much agreed with Gordon’s conclusions. “Mokyr, however, believes technology will not only continue to change, but that this change will accelerate, while Gordon's thesis has been that today’s technology isn’t as impactful as technology from previous periods, such as electrification.” Mokyr noted that instead of the technological unemployment predicted in 1930 by English economist John Maynard Keynes, we’re seeing the growth of services, the development of many new kinds of goods, as well as slow but relentless productivity growth. While it’s hard to anticipate which new jobs will exist in the future, “the demographics make it likely that there will be more jobs that involve caring for an aging population, and less that involve caring for children… In addition, he said, there may be more creative jobs, and we should never underestimate tacit knowledge - intuition, instinct, and imagination - which are not qualities we associate with machines. Still, he noted, the transition won’t be painless.”
Acemoglu observed that “we’ve had new tasks and new occupations throughout history. But while he said this typically ends well for society as a whole, there can be hardships for specific classes of workers, and sometimes for decades. He said there was effectively no increase in wages during the industrial revolution, but said that institutional structure and education can affect this.”
Brynjolfsson explained that AI is a General-Purpose Technology (GPT), noting “that such technologies may actually lower stated productivity up front as companies invest in these without seeing a return, which comes later… In general, he said GPTs require time-consuming complementary innovation and investment, and that to keep up with accelerating technology in order to realize the benefits of AI, we will probably need to reinvent our organizations, institutions, and metrics.”
He later added that “while every moment is different, history suggests that eventually things work out, as both Gordon and Mokyr had suggested. But he also noted that there have been long periods during which people didn't do so well, because of technological changes in employment. Read history or Dickens, he said.”
The most intriguing session for me was a fireside chat with Bit Source co-founder Rusty Justice. Justice has worked in the coal industry all his life, and calls himself “an unapologetic hillbilly [who] loves Jesus, his family, baseball, and all things Appalachia [and] appreciates the opportunity to play a small part in establishing a rural tech company in the mountains.”
Bit Source is an application development and web design company dedicated to training former coal miners in Eastern Kentucky to become software developers. To Justice, "coding is a trade like welding, not computer science," and after 22 weeks of training, individuals were producing professional-quality code. "Instead of coal, we export code."
In a 2016 NPR segment, From Coal to Code: A New Path for Laid-Off Miners in Kentucky, Justice said: “The realization I had was that the coal miner, although we think of him as a person who gets dirty and works with his hands, really coal mines today are very sophisticated, and they use a lot of technology, a lot of robotics.”
In his opening remarks, MIT President Rafael Reif said that while it’s clear to almost everyone that deep change is happening, for most people it’s not clear how to respond. He later wrote in a Boston Globe op-ed that “many fear that this time the change may be so fast and so vast, and its impact so uneven and disruptive, that it may threaten not only individual livelihoods, but the stability of society itself… If we want the advance of technology to benefit everyone, however, we need to take action right away: We must proactively and thoughtfully reinvent the future of work.”