How can we best anticipate the future of a complex, fast-changing industry like IT? Which hot technology innovations - e.g., artificial intelligence, the blockchain, cloud computing - will end up having a big impact, and which are destined to fizzle? What can we learn from the IT industry's 60-year history that might help us better prepare for whatever lies ahead?
A major way of anticipating the future of any economic or social entity - be it a company, industry, university, government agency or city - is to explore and learn from its history. While there's no guarantee that historical patterns will continue to apply going forward, they might well be our most important guides as we peer into an otherwise unpredictable future.
I’ve been involved with computers since the early 1960s, first as a student at the University of Chicago, then in my long career at IBM, and subsequently through my relationship with a number of companies and universities. I’ve thus had a ringside seat from which to observe the journey the IT industry’s been on since those early days.
Let me share some of my personal impressions of this journey through the lens of three key areas, each of which has played a major role throughout IT’s history, and will continue to do so well into its future: data centers, transaction processing, and data analysis.
Data Centers → Cloud
Back in 1962, the year I entered college, I got a part-time programming job in the newly created university computation center. In those days, I keypunched my programs onto 80-column IBM punch cards, brought the card decks to the machine room and gave them to an operator, who then submitted them to the computer via a card reader. Jobs were run one at a time, generally resulting in a multi-hour wait. I nostalgically associate those early computer days with Chicago-style pizza, because many a night, while waiting for my job to run, I went with friends to Uno's or Due's - both of which still serve their famous deep-dish pizza on Chicago's Near North Side.
So what happened to those computers and machine rooms, which we so rarely see these days? In a special report on Corporate IT published by The Economist in October 2008, technology editor Ludwig Siegele offered an elegant answer to this question:
“In the beginning computers were human. Then they took the shape of metal boxes, filling entire rooms before becoming ever smaller and more widespread. Now they are evaporating altogether and becoming accessible from anywhere.”
“That is about as brief a history of computers as anyone can make it. The point is that they are much more than devices in a box or in a data centre. Computing has constantly changed shape and location - mainly as a result of new technology, but often also because of shifts in demand.”
“Now… computing is taking on yet another new shape. It is becoming more centralised again as some of the activity moves into data centres. But more importantly, it is turning into what has come to be called a cloud, or collections of clouds. Computing power will become more and more disembodied and will be consumed where and when it is needed.”
As Siegele predicted, computers and data centers have been disappearing into the cloud. It's what has always happened when consumer-oriented technologies become ubiquitous in society - electricity, telephones, television. While consumer devices are now everywhere, their back-ends are nowhere to be seen, quietly doing the highly disciplined work of generating and transmitting output to the many billions of users and devices out there. With the success of cloud computing, IT is now undergoing a similar transformation.
Cloud represents the industrialization of the data center. Many data centers evolved over the years with limited architectural discipline or company-wide governance. IT organizations often spent the bulk of their energies maintaining and integrating legacy applications, some developed by different departments within the business and some inherited through mergers and acquisitions. Not surprisingly, companies with such legacy environments have been challenged to respond to fast-changing market conditions, especially when it comes to new customer-facing applications, which generally require massive scalability, flexibility and agility.
IT has had to become much more disciplined in every aspect of its operations. Data centers have now become the production plants of cloud-based services, a transformation that’s been pioneered by born-to-the-cloud companies like Amazon, Google, and Salesforce. Cloud computing requires well-engineered infrastructures, applications and services.
The architectural standards and management disciplines of public cloud providers are being embraced by many older companies as they develop private clouds, so they too can efficiently deliver high-quality services to their own customers, business partners and employees. At the same time, companies are acquiring IT or business services from the growing number of providers offering them in the marketplace. Most companies are embracing a hybrid model - delivering some services from their own data centers, and acquiring others from service providers.
Transaction Processing → Distributed Ledgers (Blockchain), Cybersecurity
Transaction processing is one of the earliest applications of computers, used to automate highly structured business processes like financial payments, inventory management and airline reservations. Over time, sophisticated applications were developed to help manage more complex operations, including enterprise resource planning, customer relationship management and human resources.
Transaction volumes have been rapidly increasing over the past 20 years. First came the explosive growth of the Internet and World Wide Web, which attracted tens to hundreds of millions of users to the world of computing and gave rise to all kinds of online transactions.
Several years later the Internet transitioned to its mobile phase, based on billions of users around the world connected via smart personal devices and broadband wireless networks, giving rise to many innovative apps. Mobile digital payments, for example, are still in their early stages, but over time will likely generate huge volumes of transactions all around the world.
More recently, we've seen the rise of the Internet of Things, supporting tens of billions of smart devices on its way to hundreds of billions, and generating an ever-growing volume of transactions from our newly digitized physical world, including homes, transportation, cities and even our bodies.
In addition to keeping up with these explosive scalability requirements, IT infrastructures face very serious challenges in the areas of security, privacy and robustness. Many of these new applications - dealing with payments, health, cities and other areas - are mission critical and must be up just about all the time, even while withstanding constant cyber attacks. Moreover, IT is increasingly operating in an environment where a significant number of players cannot be trusted.
We need mission-critical IT infrastructures capable of near-unlimited scalability; near-100% reliability, privacy and security; and the ability to handle transactions where the parties involved don't necessarily trust each other. Blockchains and blockchain-inspired distributed ledger architectures are among the most exciting technologies with the potential to revolutionize transaction processing.
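To make the core idea concrete, here is a minimal sketch in Python - with made-up transaction fields, and not the implementation of any particular blockchain - of a hash-chained ledger. Because each block embeds the hash of the previous block, quietly rewriting an earlier transaction breaks every later link, which is the property that lets parties who don't fully trust one another share a common record:

```python
import hashlib
import json
import time

def hash_block(block):
    # Deterministically serialize the block and hash it with SHA-256.
    return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

def new_block(transactions, prev_hash):
    # Each block records its transactions plus the hash of the previous block,
    # chaining the ledger together so past entries cannot be quietly altered.
    return {"timestamp": time.time(), "transactions": transactions, "prev_hash": prev_hash}

# Build a tiny ledger of hypothetical payment transactions.
chain = [new_block([{"from": "alice", "to": "bob", "amount": 10}], prev_hash="0" * 64)]
chain.append(new_block([{"from": "bob", "to": "carol", "amount": 4}], hash_block(chain[-1])))

def verify(chain):
    # Tampering with an earlier block changes its hash and breaks every later link.
    return all(chain[i]["prev_hash"] == hash_block(chain[i - 1]) for i in range(1, len(chain)))

print(verify(chain))                           # True
chain[0]["transactions"][0]["amount"] = 1000   # attempt to rewrite history
print(verify(chain))                           # False - tampering detected
```

Real distributed ledgers add consensus protocols, digital signatures and replication across many parties, but this tamper-evidence is the foundation they build on.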
Data Analysis → Data Science, Cognitive, AI
The relationship between computing and data goes back to the early days of what we then referred to as the data processing industry. Beyond their use in operations, the data generated by transactional applications were also used to improve the efficiency, financial performance, and overall management of the organization. The information was generally collected in data warehouses, and a variety of business intelligence tools were developed to analyze the data and generate the appropriate management reports.
These early analytics applications dealt mostly with structured information. But at the same time, research communities were developing methodologies for dealing with high volumes of unstructured data, as well as analytical techniques, like data mining, for discovering patterns and extracting insights from all that data.
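As a small illustration of that kind of pattern discovery, here is a toy sketch of market-basket analysis - one classic flavor of data mining - using invented purchase data. It simply counts which pairs of items tend to appear together across transactions:

```python
from collections import Counter
from itertools import combinations

# Toy purchase "transactions" - hypothetical data for illustration only.
transactions = [
    {"bread", "milk", "eggs"},
    {"bread", "milk"},
    {"milk", "eggs", "coffee"},
    {"bread", "milk", "coffee"},
]

# Count how often each pair of items is bought together - frequent-itemset
# mining in its simplest form.
pair_counts = Counter()
for basket in transactions:
    pair_counts.update(combinations(sorted(basket), 2))

for pair, count in pair_counts.most_common(3):
    print(pair, count)   # e.g. ('bread', 'milk') appears in 3 of the 4 baskets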
The explosive growth of the Internet since the mid-1990s has taken data analysis to whole new levels. Data is now being generated by just about everything and everybody around us, including the growing volume of online and offline transactions, web searches, social media interactions, billions of smart mobile devices and tens of billions of IoT smart sensors.
As a result, we can now capture as data many aspects of our world that have never been quantified before. All this data is enabling us to better understand the world's physical, economic and social infrastructures, as well as to infuse information-based intelligence into every aspect of their operations. It's making it possible not just to better understand what's happening in the present, but also to make more accurate predictions about the future.
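As a deliberately simple illustration of that last point, the sketch below fits a straight-line trend to a year of hypothetical monthly transaction volumes and projects the next quarter. Real predictive models are of course far more sophisticated, but the principle - learn a pattern from historical data, then extrapolate it - is the same:

```python
import numpy as np

# Hypothetical monthly transaction volumes (in millions) - illustrative numbers only.
months = np.arange(1, 13)
volumes = np.array([4.1, 4.4, 4.8, 5.1, 5.6, 5.9, 6.3, 6.8, 7.1, 7.6, 8.0, 8.5])

# Fit a simple linear trend to the historical data...
slope, intercept = np.polyfit(months, volumes, deg=1)

# ...and use it to project the next three months.
future = np.arange(13, 16)
forecast = slope * future + intercept
print(forecast.round(2))
```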
This new data-centric era of computing goes by different names, depending on which aspect one wants to emphasize: big data, analytics, data science, cognitive computing, artificial intelligence. But regardless of what we call it, the industry is finally doing justice to its name: the Information Technology industry.
In the end, innovation is a journey into the future. It’s not possible to anticipate where we might be heading without understanding where we’ve come from. It’s been quite an incredible journey. Over the decades, we’ve seen digital technologies increasingly permeate just about every nook and cranny of the economy, society and our personal lives. Given the dramatic technology advances still to come, we can expect an equally exciting journey well into the future.