The October 13 issue of The Economist included a special report on innovation. The lead article set the stage when it said, "Rapid and disruptive change is now happening across new and old businesses. Innovation, as this report will show, is becoming both more accessible and more global. This is good news because its democratization releases the untapped ingenuity of people everywhere and that could help solve some of the world's weightiest problems."
"What is innovation?," the article later asks and proceeds to provide its own answers. "Although the term is often used to refer to new technology, many innovations are neither new nor involve new technology." It further adds "One way to arrive at a useful definition is to rule out what innovation is not. It is not invention. New products might be an important part of the process, but they are not the essence of it. These days much innovation happens in processes and services. Novelty of some sort does matter, although it might involve an existing idea from another industry or country."
I very much agree with The Economist's broad view of innovation, expressed in the various excellent articles in this issue, as well as in similar articles in the past. I strongly believe that increasingly, the toughest problems, requiring the kinds of breakthrough thinking you get from the very best technologists and scientists, are out there in the real world - in the marketplace and in society at large. So, if that is where the problems that inspire breakthroughs are, then it makes sense that the top researchers and innovators should step out of their ivory towers - whether in academia, corporations or government - and personally learn about them. If they do, they can, one hopes, come up not only with elegant, innovative solutions to the problems, but also with new ideas that might lead to fundamental advances in science and technology.
By now, I believe that this view is well accepted in industry. But how about universities, especially the top research universities, the source of most of the fundamental knowledge on which the world relies? I was recently reminded that this broad, more applied view of research and innovation is causing some consternation in academia when in the same week the subject came up during meetings at two very different universities.
I spent the first part of that week at the University of Chicago, where I was attending a meeting of the Physical Sciences visiting committee. When I first came from Cuba to the US in October of 1960, I had two years of high school to go, which I finished at the U of C Laboratory Schools. I then went on to college and later graduate school at the U of C, and finally left for IBM Research in 1970 after completing my studies for a PhD in physics, which I formally received two years later.
The University of Chicago is one of the world's foremost liberal arts universities. As I have gotten older, my appreciation for the education I got there has only grown, especially for the open, eclectic, multi-disciplinary culture I absorbed during my ten years there. I honestly think that the best preparation for general management in a complex institution these days is the kind of liberal arts education Chicago offers - one that prepares your mind to think about just about any kind of problem - as opposed to the more specialized skills people learn in business schools.
As we were listening to a very good presentation about science at the interface of physics, chemistry and biology, the presenter alluded to one of the articles in The Economist's special report, wondering if the pendulum had swung too far toward the applied, versus the fundamental, aspects of innovation. He was referring to statements like, "As Thomas Edison, one of America's greatest inventors, put it, genius is 1% inspiration and 99% perspiration," and "Creativity is maybe 2% of the innovation process. It's a vanishingly small component, and it's the part you can acquire from outside the firm."
I don't think that these statements and The Economist's articles in general are meant to belittle the need for highly creative new ideas - mostly coming from the world of fundamental research - as the catalysts for innovation. They are merely pointing out how much effort and creativity it takes to make an idea successful in the marketplace, regardless of how brilliant the idea is in the first place. This is often frustrating to the inventors of the new ideas, who see others garner not only the financial benefits but also the glory that comes from successfully commercializing and popularizing those inventions. Which is why, often, researchers and inventors step out of their labs and get involved in the more downstream, applied work that is necessary to successfully commercialize their ideas.
Even at MIT, which has a different culture from the U of C, you find tension between fundamental and applied research and innovation, as I discovered when I was there later that same week to teach the course I am giving this semester. MIT is without doubt one of the top engineering schools in the world. As you would expect in such a great engineering culture, people at MIT are very good at solving real-world problems, no matter where they come from or how complex they are. In fact, the tougher the challenge, the better. One of the characteristics I admire about the people I have met at MIT in the past two years is how curious and open-minded everyone is.
But, during one of the meetings I had that week, a well-respected MIT professor expressed his view that a university should focus pretty much exclusively on inventing new technologies, and leave it to industry and government to worry about how to apply them to problems in business and society at large. I tried to argue, to no avail, that we badly need universities to become involved in tackling highly complex, Grand Challenge problems and applications. These include, for example, those involving healthcare systems, energy and the environment, as well as bringing technology and science to bear on services, including the creation of a new Services Sciences discipline. These Grand Challenge problems are so incredibly complicated, and require such top talent and new knowledge, that, in my opinion, the distinction between fundamental and applied innovation is practically non-existent.
In the end, great universities attract very good people whose research often leads to major innovations. Some of their work is more theoretical, some more experimental; some is more abstract, some more concrete; some is more lab-based, some is more market-based; some is more fundamental, and some is more applied. It is best not to be too concerned about the boundaries between academic research and industrial R&D across this multi-dimensional spectrum of innovations.
Perhaps one should go back to Supreme Court Justice Potter Stewart's famous 1964 statement about the definition of obscenity: "I shall not today attempt further to define the kinds of material I understand to be embraced within that shorthand description; and perhaps I could never succeed in intelligibly doing so. But I know it when I see it, . . ."
When it comes to really good innovation, whether conducted by top people in academia, industry or government, regardless of how fundamental or applied it is, I may not know how to define it, but I am confident that I will know it when I see it.
You say "Which is why, often, researchers and inventors step out of their labs and get involved in the more downstream, applied work that is necessary to successfully commercialize their ideas."
Is this really done that often? Here in the UK I rather doubt it. Maybe a different culture exists in the US, one where it's OK to spend time away from research trying to make money. It certainly shouldn't come as a surprise that creating a commercial success requires effort in that direction, not just serendipity.
Posted by: Michael Saunby | November 26, 2007 at 05:15 AM
Irving,
I think your view about researchers stepping out of their "ivory towers" to engage in more applied research and innovation is spot on, particularly when one considers the tremendous talent the U.S. has in such institutions. Given the increasing debate about America's ability to compete in a much more hyper-competitive global marketplace, it would seem to me that one of our major competitive advantages is the knowledge power we have residing in such universities. To the extent we can leverage that knowledge in the applied sphere we can better secure our position in providing value to the global economy.
Best regards
Henry Engler
Posted by: Henry Engler | November 26, 2007 at 11:36 AM
Irving, very interesting comments. I completely agree with your point on our current state of "knowing it when we see it." That said, we are at a stage where more effort should be devoted to actually measuring indices of all the intangible value created by innovation at both universities and industrial R&D. By no means am I saying that this is an easy task, but if we stop for a moment to consider that the NIH alone spends $28+ billion a year on research - and that is just one institution - we should probably have better indices indicating where our investments are going and what our ROI looks like.
The situation in research today reminds me of federal economic decisions made prior to the 1940s, when the lack of quantitative macroeconomic indices made policy making a guessing exercise at best. Again, by no means am I claiming that our current financial indices are perfect, but at least they give us a general picture of what is going on. We certainly cannot say that we have the same degree of trust in how we manage our current research portfolio.
Posted by: Ricardo Pietrobon | December 17, 2007 at 02:00 AM