A recent issue of The Economist featured a technology focus on “The breakthrough AI needs” with eight articles on the topic. “Two years after ChatGPT took the world by storm, generative artificial intelligence seems to have hit a roadblock,” said the issue’s lead article. “The energy costs of building and using bigger models are spiralling, and breakthroughs are getting harder. Fortunately, researchers and entrepreneurs are racing for ways around the constraints. Their ingenuity will not just transform AI. It will determine which firms prevail, whether investors win, and which country holds sway over the technology.”
This is frightening for investors who’ve bet big on AI, but there is no reason to panic, said the article. “Plenty of other technologies have faced limits and gone on to prosper thanks to human ingenuity,” it added. “Already, developments in AI are showing how constraints can stimulate creativity.” In particular, The Economist highlights two such major innovations: the development of chips with the special-purpose architectures needed to train and run AI models as quickly and as energy-efficiently as possible, and the development of smaller, more specialized, domain-specific models that consume far less energy than the very large models that rely on brute-force computational power.
Let me discuss each of these innovations in turn.