One of the major forces for innovation in computing has been the acceleration of application run-times from minutes or hours to seconds or less, thus enabling them to become interactive. The whole nature of an application changes qualitatively and its value improves significantly when it is able to quickly respond to our every action and request.
I have observed this transition from long-running to interactive applications several times in my career. The advent of time-sharing and other technologies in the '60s and '70s ushered in transaction processing applications like airline reservation systems and ATMs. Then personal computers introduced us to spreadsheets, word processing and many other personal productivity applications. More recently the Internet, coupled with broadband networks, has brought us access to the Web, as well as e-business, search, blogging and new applications of all sorts.
Why do applications feel so different when they respond instantaneously? This goes to the essence of how humans prefer to deal with the world and solve problems. We do something, get a response, and then adjust and build on that response. It feels natural to break a problem into a series of steps and keep adjusting to the feedback, whether it is driving a car, talking to a person or interacting with a computer application. In addition, since our brains are wired for sight and sound, as applications become increasingly interactive, the interfaces we use for dealing with them have had to become more visual and aural so they feel natural to us as well. That is what graphical user interfaces or GUIs have strived for from their beginning, and why highly visual interfaces, inspired by those in video games, hold so much promise for the future.
Supercomputing applications, which are compute-intensive and/or data-intensive and thus among the most time-consuming, are now poised to make the transition to interactivity. Continuous improvements in microprocessors, storage and other technologies are a major factor in this transition. Equally important are the advanced architectures that permit supercomputers to be built from the inexpensive components of the PC and consumer electronics worlds, so that the considerable computing capacity generally required to support interactive applications can be delivered at affordable prices. For example, Blue Gene uses inexpensive, low-power versions of the Power architecture and achieves its high performance by combining very large numbers of them. This parallel architecture has enabled Blue Gene to occupy five of the top fifteen positions in the TOP500 list of the most powerful supercomputers in the world, including the number-one system, at the Lawrence Livermore National Laboratory, which contains over 131,000 processors and delivers a peak performance of 367 teraflops.
Further supporting this trend toward interactive supercomputing applications, IBM announced earlier this month a new, high-performance generation of BladeCenter, a system architecture for integrating and packaging large numbers of standard computing components of all kinds. The new version of BladeCenter increased its internal bandwidth by nearly an order of magnitude, permitting its use in more powerful supercomputing applications. At the same time, we announced a new blade based on the Cell processor, the high-performance processor derived from the Power architecture and jointly designed by Sony, Toshiba and IBM for the PlayStation 3.
To best appreciate the potential implications of interactive supercomputing applications with human-like user interfaces, think of what we use supercomputers for today. First, we analyze very large amounts of information looking for features or patterns, e.g., the search for new elementary particles in high-energy physics. The petroleum industry analyzes large amounts of seismic data for indications of oil or natural gas. Information analysis is increasingly important in medicine as new kinds of bioinformatics applications, like those based on genomics, are used for breakthrough approaches to diagnosis and treatment.
Supercomputing applications also encompass modeling and simulation. Weather prediction is already having an impact on our everyday lives. Much research in physics, chemistry and biology involves extensive computations, such as the search for new materials that are both very light and very strong. In engineering, simulation is now integrated into the design of almost any complex object, including aerodynamics for more fuel-efficient airplanes, structural analysis for safer cars, and seismic analysis to help bridges and buildings withstand earthquakes. Economic modeling is another class of simulation widely used in business, government and academia to predict the potential impact of different strategies.
While many supercomputing applications will continue to be time-consuming, and new advances in supercomputing will be used to attack whole new problems such as modeling the human brain, many supercomputing applications, or at least subsets of them, can now become interactive in nature. Imagine, for example, a physician looking for the best treatment for a specific patient with a specific ailment using very fast search technologies that permit the exploration of different treatment options, much the way we now use search engines to find information on the Web. A petroleum engineer might accelerate the discovery process by quickly trying different kinds of analysis and visualization to pinpoint a potential oil field. Automotive engineers could continuously refine their designs to achieve a balance of aesthetics, safety and economics, much the way we keep formatting and re-formatting documents until satisfied with the result. Or a surgeon might prepare for a new kind of operation by training on a "surgical simulator" inspired by flight simulators, one that can explore not just what happens when everything goes as planned, but also what can go wrong. During the actual surgery, a similar system could analyze and display real-time information as the operation proceeds.
I think we would all agree that the impact of such applications would be nothing short of revolutionary, not only for science and engineering, but for business, health care, education and government as well. Even more important, such interactive supercomputing applications make the machines more human-like, and therefore easier for us to deal with on our terms. After all, constantly interacting with and reacting to the world by processing information and looking for patterns and features is what our brains have evolved to do so well. Ever since the advent of computers we have been frustrated with how poorly they perform in these most basic of human functions, even compared to a three-year-old. Perhaps our computers are finally growing up.
Irving-san,
Today the Internet brings us a flood of information.
And so we seek to discover a jewel within that flood.
I mean that the role of mathematics will be more and more important than ever before.
As more of the world's information is pooled into mathematics,
the realm of numbers becomes an ever larger meeting ground.
Mathematics might bring the new workload of "High-Performance Computing".
Posted by: Makio Yamazaki | February 22, 2006 at 04:49 PM
Dr. Wladawsky-Berger, I've been reading your blog with great interest - especially about your advocacy for next-generation interfaces that are more intuitive. We are working (with potential partnerships with IBM) in many of the fields you mentioned, from knowledge mapping to healthcare to financial services and simulations. As Craig Barrett, former CEO of Intel, said: "Unified Field is the leader in 4D Visualization". We are talking with your Cell group, but I thought it worthwhile to let you know of our existence. You are one of our heroes.
Best, Eli Kuslansky
Managing Partner
Unified Field
Posted by: Eli Kuslansky | April 07, 2006 at 11:09 AM