I recently read The Next Wave: A Conversation with John Markoff in Edge.org. Markoff has been a science and technology writer at the NY Times since 1988, as well as author and co-author of several books, including the just published Machines of Loving Grace: The Quest for Common Ground Between Humans and Robots. I can personally attest to his deep understanding of technology and the IT industry, given the many discussions we’ve had over the years.
In The Next Wave, Markoff talks about a wide variety of topics, but I’d like to focus on what I see as the main thread running through the conversation: the state of Moore’s Law, the observation that the number of transistors in an integrated circuit doubles approximately every two years.
He believes that Silicon Valley, where he grew up and has long lived, has been fundamentally about Moore’s Law. And Moore’s Law has played a central role in his career since he became a technology reporter in 1977. But then, “I suddenly discovered it was over.”
As a result of Moore’s Law, the cost of computing has been falling exponentially for almost five decades. But for the past two years we seem to have hit a plateau, as the price per transistor has stopped falling. “I see evidence of that slowdown everywhere. The belief system of Silicon Valley doesn’t take that into account.”
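As a rough back-of-the-envelope illustration (the parameters here are assumptions for the sketch, not figures from the article), doubling every two years compounds dramatically over five decades:

```python
# Sketch of Moore's Law scaling: transistor counts (or, equivalently,
# compute per dollar) doubling every two years.

def moore_factor(years, doubling_period=2):
    """Growth factor after `years`, assuming one doubling per `doubling_period` years."""
    return 2 ** (years / doubling_period)

# Fifty years is 25 doublings: a factor of about 33.5 million.
print(round(moore_factor(50)))  # 33554432
```

That compounding, not any single generation of chips, is what made each wave of cheaper, smaller computing possible; which is why a plateau in price per transistor matters so much.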
In his 2005 book The Singularity is Near: When Humans Transcend Biology, author and inventor Ray Kurzweil argued that exponential advances in technology follow what he calls the Law of Accelerating Returns. As a result, around 2045 we will reach the Singularity, at which time “machine intelligence will be infinitely more powerful than all human intelligence combined.”
“I simply don’t see it,” says Markoff. AI has indeed made remarkable progress in some areas. “Machines, for the first time are learning how to recognize objects; they’re learning how to understand scenes, how to recognize the human voice, how to understand human language.”
But, the progress is uneven. “What hasn’t happened is the other part of the AI problem, which is called cognition. We haven’t made any breakthroughs in planning and thinking, so it’s not clear that you’ll be able to turn these machines loose in the environment to be waiters or flip hamburgers or do all the things that human beings do as quickly as we think.”
He covered the finals of the DARPA Robotics Challenge, which took place this past June in the Los Angeles area. Twenty-three teams competed for the $2 million prize. Human operators guided their specially designed robots via wireless networks over a difficult course, where they had to perform eight tasks relevant to responding to a major disaster.
“It was quite an event. It was a spectacle. They built these by and large Terminator-style machines, and the idea was that they would be able to work in a Fukushima-like environment. Only three of the machines, after these teams worked on them for eighteen months, were able to even complete the tasks. The winning team completed the tasks in about forty-five minutes. They had an hour to do eight tasks that you and I could do in about five minutes. They had to drive the vehicle, they had to go through a door, they had to turn a crank, they had to throw a switch, they had to walk over a rubble pile, and then they had to climb stairs…”
“Most of the robots failed at the second task, which was opening the door. Rod Brooks, who’s this pioneering roboticist, came down to watch and commented on it afterwards because he’d seen all these robots struggling to get the door open and said, ‘If you’re worried about the Terminator, just keep your door closed.’ We're at that stage, where our expectations have outrun the reality of the technology.”
So, if Moore’s Law is at a plateau, and as a result Silicon Valley might be at the end of the phase that’s been driving its progress since the 1970s, what’s next? And what then should be next for a Silicon Valley-based technology journalist like Markoff? “I've started to look around for other things that are interesting, that are on the edge, if you will,” he says.
“Once upon a time, the center of Silicon Valley was in Santa Clara. Now it’s moved fifty miles north, and the current center of Silicon Valley by current investment is at the foot of Potrero Hill in San Francisco. Living in San Francisco, you see that. Manufacturing, which is what Silicon Valley once was, has largely moved to Asia. Now it’s this marketing and design center. It’s a very different beast than it was.”
“What worries me about the future of Silicon Valley, is [its] one-dimensionality, that it’s not a Renaissance culture… It’s an engineering culture that believes that it’s revolutionary, but it’s actually not that revolutionary. The Valley has, for a long time, mined a couple of big ideas.”
The first big idea was personal computing, set in motion by Doug Engelbart and later refined by Alan Kay. Then a decade later Mark Weiser came up with ubiquitous computing, the “profound idea that computing would disappear into everyday objects, and everyday objects would become magic… The first guy to understand that and take advantage of it was Steve Jobs. Steve Jobs first turned the record player into an iPod, and then he turned the telephone into a computer.”
“I’m fascinated to see what the next platform is going to be. It’s totally up in the air, and I think that some form of augmented reality is possible and real. Is it going to be a science-fiction utopia or a science-fiction nightmare? It’s going to be a little bit of both.”
Technologists have long anticipated the slowdown of Moore’s Law, if not its very end. “Predictions of the death of Moore’s law are nearly as old as the forecast itself,” notes a recent Economist article. “Still, the law has a habit of defying the sceptics, to the great good fortune of those of us enjoying tiny, powerful consumer electronics. Signs are at last accumulating, however, which suggest the law is running out of steam. It is not so much that physical limits are getting in the way… it is mainly because of economics.”
“As originally stated by Mr Moore, the law was not just about reductions in the size of transistors, but also cuts in their price… New “fabs” (semiconductor fabrication plants) now cost more than $6 billion. In other words: transistors can be shrunk further, but they are now getting more expensive. And with the rise of cloud computing, the emphasis on the speed of the processor in desktop and laptop computers is no longer so relevant… Moore’s law will come to an end; but it may first make itself irrelevant.”
What happens next? I started thinking about this question several years ago, and the metaphor that came to mind was that the IT industry is entering its Cambrian phase.
The Cambrian geological period marked a profound change in life on Earth. Before it, most organisms were very simple, composed of individual cells and simple multi-cell organisms sometimes organized into colonies, such as sponges. After a couple of billion years, evolution had deemed the cell good enough; that is, its continued refinement no longer translated into an evolutionary advantage.
Then, around 550 million years ago, a dramatic change took place: the Cambrian Explosion. Evolution essentially took off in a different direction, leading to the development of all kinds of complex life forms. “Over the following 70 to 80 million years, the rate of diversification accelerated by an order of magnitude and the diversity of life began to resemble that of today.”
The IT industry is now going through something similar. Over the past several decades we’ve been perfecting our digital components (microprocessors, memory chips, disks, networking and the like) and using them to develop families of computers: mainframes, minicomputers, servers, PCs, laptops and so on.
But around ten years ago, these digital components became powerful, reliable, inexpensive, ubiquitous, and good enough for the industry to move to a new phase. The rise of the Internet introduced a whole new set of technologies and standards for interconnecting all these components. Today, digital components are being embedded into just about everything: smartphones, IoT devices, robots, consumer electronics, medical equipment, airplanes, cars, buildings, clothes and on and on. Even for data centers and large supercomputers, “The question is not how many transistors can be squeezed onto a chip, but how many can be fitted economically into a warehouse,” writes The Economist.
Technology continues to be very important to IT, but after 50 years it’s no longer the key driver of innovation, and computers themselves are no longer the industry’s dominant product families. The digital world has now entered its own Cambrian age. Innovation has shifted to the creation of all kinds of digital life forms, and to the data-driven analytic algorithms and cognitive designs that infuse intelligence into them.
“What could possibly go wrong?” asks John Markoff in the last paragraph of The Next Wave. “There is an argument that these machines are going to replace us, but I only think that’s relevant to you or me in the sense that it doesn’t matter if it doesn’t happen in our lifetime. The Kurzweil crowd argues this is happening faster and faster, and things are just running amok. In fact, things are slowing down. In 2045, it’s going to look more like it looks today than you think.”
I wholeheartedly agree.