In the first week of October I participated in a Cognitive Systems Colloquium hosted by IBM at its Thomas J. Watson Research Center. IBM defines cognitive systems as “a category of technologies that uses natural language processing and machine learning to enable people and machines to interact more naturally to extend and magnify human expertise and cognition. These systems will learn and interact to provide expert assistance to scientists, engineers, lawyers, and other professionals in a fraction of the time it now takes.”
The need for such systems is a result of the explosive growth of data all around us. Not only are we now able to collect huge amounts of real-time data about people, places and things, but far greater amounts can be derived from the original data through feature extraction and contextual analysis. One of the key lessons from Watson, IBM’s question-answering system that in 2011 won the Jeopardy! Challenge against the two best human Jeopardy! players, was that the very process of analyzing data increases the amount of data by orders of magnitude.
This growth is challenging our ability to store and analyze all that data. The new generation of cognitive systems will require innovation breakthroughs at every layer of our IT systems, including technology components, system architectures, software platforms, programming environments, and the interfaces between machines and humans.