Last week, the Subcommittee on Technology, Innovation and Competitiveness of the Senate Committee on Commerce, Science and Transportation held hearings on high-performance computing.
I had the honor to be one of the witnesses who testified before the Subcommittee and its chairman, Senator John Ensign. I submitted my written testimony in advance, summarized it during the actual hearings, and then answered questions along with the other panelists.
I said that supercomputing was essential for innovation in the worlds of science and commerce, above all because of its applications in areas such as defense, energy, health care, science and engineering. It is thus very important that the Federal Government support basic research in supercomputing, especially pilot projects that develop working systems for applications key to national security, competitiveness and innovation.
I further said that, in working on supercomputing over the last twenty years, we had learned two key lessons. First, in order to push the envelope of performance, applications and discovery, it is vital that industry work very closely with lead partners in research labs and universities. Second, in addition to engaging in such advanced initiatives, which are usually sponsored by government, it is really important that industry pay attention to marketplace requirements, including competitive prices and sophisticated software, in order to build a viable supercomputing business with clients around the world.
The complete draft of my oral remarks follows.
"On behalf of IBM, I want to thank you, Mr. Chairman for the opportunity to testify. I am Vice President for Technical Strategy and Innovation at IBM, and have been involved with supercomputing initiatives for over twenty years. With your permission, I will simply summarize my written testimony.
Today, IBM leads the industry with the world’s top three supercomputers and almost half the world’s top 500 supercomputers. We were first to deliver over 100 teraflops – 100 trillion floating-point operations per second of peak performance – to the DOE’s Lawrence Livermore National Laboratory, where we also first demonstrated the practicality of using well over 100,000 microprocessors on a single problem. Likewise, we have been working with DARPA to help make very high-end systems more productive, and we are investing in advanced hardware and software that will culminate in a system capable of sustaining more than a petaflop.
In the process of all this, we have learned many lessons. Two are especially significant. First, it is vital to work closely with lead partners in research labs and universities to "push the envelope" of performance, applications and discovery. I cannot overemphasize the importance of these pilots in developing working systems for real research on important applications, or the Federal government's instrumental role in creating them.
Second, the marketplace is all-important. Many supercomputing companies have failed because they relied solely on government-funded projects and ignored the marketplace's requirement that they go beyond leading performance to deliver competitive prices, energy efficiency, and sophisticated software and applications. While we are very proud of IBM's leadership in supercomputers, we are equally proud that supercomputing is an actual, viable business for us, with clients around the world.
Why is a national supercomputing capability vital to the U.S.? Supercomputing systems and applications push the envelope in multiple dimensions: they analyze huge amounts of information; accurately simulate both the natural world and the world of man-made objects; and present the results in highly visual, realistic ways so we can interact with them.
Additionally, supercomputing architectures and applications foreshadow the future. If one is removed from the advanced research, new ideas and creative minds in supercomputing, one will inevitably misread the major trends in computing. Finally, supercomputers enable scientists to make discoveries that would be difficult – perhaps impossible – to accomplish experimentally.
The supercomputing market has changed radically in the last decade. It was once a niche market because the hardware was so expensive. But that changed with the introduction of workstation and PC-based technologies. Today we are even working to build supercomputers with technologies from the worlds of consumer electronics and video games. All these approaches use components from high-volume markets. Thus, the costs are significantly lower than in the early days and the potential markets much bigger.
My basic point is this: A commercial business model has reduced costs and enabled us to address a vast spectrum of public and private sector applications. One-off machines built for a single mission are usually expensive, impractical in the marketplace, and not viable in the long term.
Progress in supercomputing hardware has been astounding. The real challenge, however, lies in both application software and systems software, as a number of studies have widely recognized. Software is so consequential because supercomputing’s value lies not in the technology itself – important as it is – but in its applications.
My formal testimony reviews the progress and promise of some key applications. There is enormous promise, both in classic, more “mature” supercomputing applications like defense and national security, science, engineering, and weather and climate; and in the newer application areas like energy, health care and bioinformatics, learning and training, and business in general.
In civilian nuclear energy, for example, the GNEP and ITER initiatives are excellent examples of government-industry-academic collaboration on matters of national and international importance, market-relevance, and timeliness. They deserve support.
Supercomputing today is essential for innovation in both the world of science and the world of commerce. It is an indispensable tool if our country is to thrive in a global economy that grows more competitive by the day. It is therefore essential to pass an Innovation Authorization bill this year, such as S. 2802, as you, Senator Ensign, Senator Stevens and others on this Committee know. The Federal Government funds basic research and establishes priorities for research in the pursuit of innovation and competitiveness. That makes wise, sustainable policy choices critically important to a national supercomputing capability.
To realize that capability, Congress should clearly outline and invest in a long-term strategy. For example, the President’s Budget request for fiscal 2007 includes high-performance computing activities ranging from biomedical computing to earth and space science research, among many others. Clear direction and consistent funding will foster investment by industry and academia, so that together we can address the challenges facing our country and expand our knowledge and capabilities.
This is the kind of joint effort between government, universities and private industry for which there is no substitute.
Thank you."