Ever since cloud computing first appeared in the IT world, people have struggled to define what it is and why it’s so important. “There is a clear consensus that there is no real consensus on what cloud computing is,” said the organizer of a 2008 cloud conference in his closing remarks. Just about everyone agreed that something big and profound was going on, even if no one was quite sure yet what it was.
Is cloud the 21st-century version of time sharing, enabling users to get virtual access to different kinds of IT resources and software capabilities without having to own computers? Is it a return to centralized computing, driven by the rising complexity, management costs, and energy inefficiencies of distributed systems? Is it the evolution to utility computing, where, as with electricity and water, you get your IT from large service providers and pay for it based on usage?
Or, as The Economist put it in an excellent 2008 article, is cloud the next step in the brief history of computers, as the machines “are evaporating altogether and becoming accessible from anywhere” . . . as computing becomes “more and more disembodied and will be consumed where and when it is needed”?
“The rise of the cloud is more than just another platform shift that gets geeks excited,” it presciently added. “It will undoubtedly transform the information technology (IT) industry, but it will also profoundly change the way people work and companies operate. It will allow digital technology to penetrate every nook and cranny of the economy and of society, creating some tricky political problems along the way.”