Over the past few decades, Moore’s Law, processor speed and hardware scalability have been the driving factors enabling IT innovation and improved systems performance. But the von Neumann architecture—which established the basic structure for the way components of a computing system interact—has remained largely unchanged since the 1940s. Furthermore, to derive value, people still must engage with computing systems on the machines’ terms, rather than computers adapting to interact with people the way they work.
With the continuous rise of big data, that’s no longer good enough.
We are now entering the Cognitive Systems Era, in which a new generation of computing systems is emerging with embedded data analytics, automated management and data-centric architectures in which storage, memory, switching and processing move ever closer to the data.
Whereas in today’s programmable era, computers essentially execute a series of predetermined “if-then” rules, cognitive systems learn, adapt, and ultimately hypothesize and suggest answers. Delivering these capabilities will require a fundamental shift in the way computing progress has been achieved for decades.
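The contrast can be sketched in a few lines of code. This is a hypothetical, deliberately simplified illustration (the example domain, names, and data are all invented, not drawn from the article): a programmed system has its decision rule fixed in advance by a developer, while a learning system derives its rule from labeled examples.

```python
# Programmable era: the decision threshold is hard-coded by a developer.
def programmed_rule(temp_c):
    if temp_c > 30:
        return "hot"
    return "not hot"

# Cognitive-style sketch: derive the decision boundary from labeled data
# (here, simply the midpoint between the two class means) instead of
# hard-coding it. Real learning systems are far more sophisticated.
def learn_threshold(examples):
    hot = [t for t, label in examples if label == "hot"]
    cold = [t for t, label in examples if label == "not hot"]
    return (sum(hot) / len(hot) + sum(cold) / len(cold)) / 2

examples = [(35, "hot"), (40, "hot"), (10, "not hot"), (15, "not hot")]
threshold = learn_threshold(examples)  # midpoint of class means: 25.0

def learned_rule(temp_c):
    return "hot" if temp_c > threshold else "not hot"
```

The programmed rule never changes unless a human rewrites it; the learned rule shifts automatically as the example data changes, which is the essence of the adaptation the cognitive era promises.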