
Communications of the ACM

ACM News

The Need For Cognitive Computing

Artist's representation of cognitive computing.

The need for cognitive computing to help find the pearls of wisdom among the huge amounts of big data being harvested is acute.

Credit: Shutterstock

Thanks to its star turn on TV’s Jeopardy!, IBM’s Watson has become synonymous with cognitive computing, a technology gaining increasing importance and momentum in the enterprise. With more organizations recognizing the need to make sense of unstructured data, IBM in February launched three application programming interfaces (APIs) for developers to extend the capabilities of the Watson platform.

The Tone Analyzer, Emotion Analysis, and Visual Recognition APIs are available in beta and will enable developers to embed the technologies to create apps that can think, perceive, and empathize.
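As a rough illustration of how a developer might embed one of these services, the sketch below assembles a request for a Watson-style tone analysis endpoint and extracts the top-scoring tone from a response. The URL, field names, and response shape here are illustrative assumptions, not IBM's documented interface.

```python
import json

# Hypothetical endpoint -- illustrative only, not IBM's actual API URL.
TONE_URL = "https://example.com/tone-analyzer/api/v3/tone"

def build_tone_request(text, api_key):
    """Assemble the pieces of an HTTP request for a Watson-style
    tone analysis call (assumed request format)."""
    return {
        "url": TONE_URL,
        "headers": {
            "Content-Type": "application/json",
            "Authorization": "Bearer " + api_key,
        },
        "body": json.dumps({"text": text}),
    }

def top_tone(response):
    """Pick the highest-scoring tone from a response assumed to look
    like {"tones": [{"tone_id": "joy", "score": 0.8}, ...]}."""
    tones = response.get("tones", [])
    if not tones:
        return "neutral"
    return max(tones, key=lambda t: t["score"])["tone_id"]
```

In practice a developer would send the built request over HTTPS and feed the parsed JSON reply to a helper like `top_tone`; the point is that the cognitive capability is consumed as an ordinary web API.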

Cognitive computing technology finds and merges information on a topic without regard to origin or format, according to Sue Feldman, CEO of Synthexis, a consulting firm specializing in cognitive computing, search, and text analytics technologies. While it may answer questions, cognitive computing also explores, looking for patterns without needing a precise query. "Most importantly, it uses every trick in our technology book in order to give people the tools they need to see inside vast quantities of information," says Feldman, who is also a member of the Cognitive Computing Consortium, a cross-industry group of organizations and individuals from the IT, academic, and analyst communities.

The need for cognitive computing, Feldman says, is acute. "We are buried today in a digital mudslide; one in which information nuggets and dross are jumbled together. Worse, one person’s nugget is another’s garbage. The value of information depends on who needs it, when they need it, and for what purpose."

Cognitive systems differ from current computing applications in that they move beyond tabulating and calculating based on preconfigured rules and programs, according to Feldman. Although they are capable of basic computing, they can also infer, and even reason.

One such system, CustomerMatrix, came about as a result of a combination of expertise in big data analytics and "clear customer frustration with overcoming the two biggest derailing influences of traditional BI (Business Intelligence) projects: lack of data access and poor user adoption," says Guy Mounier, the company's co-founder and CEO. He believes those challenges have always limited the ability of innovation teams to deliver successful business outcomes for business stakeholders.

The CustomerMatrix system combines machine learning, semantic technology, and natural language processing (NLP), he says. The platform does substantial upfront work with data from a multitude of sources, both structured and unstructured, internal and external. The goal, he says, is to find the complex connections among the data, in a process known as Knowledge Engineering.
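The core of the knowledge-engineering step described above is linking records about the same entity across structured and unstructured sources. A minimal sketch of that idea, with invented field names that are not CustomerMatrix's actual data model, might normalize company names into a shared key and join CRM rows against mentions extracted from text:

```python
def normalize(name):
    """Crude key normalization: lowercase, strip trailing punctuation
    and common corporate suffixes so "Acme Inc." matches "acme"."""
    key = name.lower().strip().rstrip(".")
    for suffix in (" inc", " ltd", " corp"):
        if key.endswith(suffix):
            key = key[: -len(suffix)].strip()
    return key

def link_sources(crm_records, text_mentions):
    """Join structured CRM rows with company mentions found in
    unstructured text, using the normalized name as the link key."""
    index = {normalize(r["company"]): r for r in crm_records}
    links = []
    for mention in text_mentions:
        record = index.get(normalize(mention["company"]))
        if record:
            links.append({"account_id": record["id"],
                          "context": mention["context"]})
    return links
```

Real systems use far richer entity resolution than string normalization, but the sketch shows the basic move: once disparate sources share a key, machine learning can operate over the connected data.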

"By making sure we get the data right first, we create an environment where today’s machine learning techniques, from all sources in the ecosystem, can then be applied to the data,’’ says Mounier. "This results in highly predictive recommendations and superior business outcomes, which drive new revenue."

CustomerMatrix’s customers include Fujitsu, KPMG, and Schneider Electric.

Another customer is using the cognitive computing platform to find deals an investment banking relationship manager did not know were potentially available, Mounier says. The technology identified colleagues the manager did not know personally "who can vector them into the right officers at the company needing the deal." It also provided specific recommendations on what action to request of those colleagues, such as sending them an email about an upcoming visit where the opportunity could be discussed, and having the bank included in the list of bidding financing organizations, he says. The result was an increase in deal flow "north of $100 million," Mounier says.

In today’s business environment, he maintains, if you are not driving new value/revenue, "you are not going to prosper, and maybe not even survive."

Given the enormous amounts of data coming into organizations, CEOs and CXOs are paying a lot of attention to cognitive computing because "there are a lot of trapped insights in that data" that cannot be extracted without the help of some very advanced technology, agrees Guruduth Banavar, vice president of Cognitive Computing at IBM Research. The amount of data continues to grow exponentially, he says, but thanks to breakthroughs in machine learning, reasoning techniques, and NLP, the data is no longer impenetrable.

Watson is a family of technologies implemented as a set of APIs on a platform called the Watson Developer Cloud, on which other vendors can build their own apps, Banavar says. IBM has also built several hundred apps that operate on top of the platform for virtually every vertical industry, he adds.

IBM recently announced a collaboration with the New York Genome Center to create a "comprehensive and open repository of genetic data to accelerate cancer research and scale access to precision medicine using cognitive insights from IBM Watson." The implications are enormous; analyzing this data could give doctors the ability to deliver personalized treatment to patients.

Designing systems that can handle the information onslaught and deliver the information that is pertinent to someone at a particular place and time is a tall order, Feldman notes, adding, "This is the dream at the heart of cognitive computing: if we can mimic the way that people gather, organize, and synthesize information in order to solve an entire range of information problems, then we will have some help in extracting the nuggets from the mud."

Esther Shein is a freelance technology and business writer based in the Boston area.
