
AI and Emotions

Digital assistants and many apps seek to become ever more intimate with your emotional state.

Researchers have been developing artificial intelligence (AI) that analyzes and responds to your emotions, and the fruits of their labors are beginning to be commercialized.

San Diego, CA-based retail technology firm Cloverleaf, for example, is outfitting retail shelves with tiny cameras that sense your mood and product interest, then serve up ads on the sides of shelves to pique that interest further.

Los Angeles, CA-based video deposition company MediaRebel is using AI technology to offer lawyers greater insight into recorded witness testimony. Specifically, MediaRebel's AI software analyzes the video depositions, reading facial expressions and emotions to determine whether what witnesses say matches what their emotions convey.

Researchers in the Personal Robots Group of the Massachusetts Institute of Technology (MIT) Media Lab have developed a playful robot equipped with AI that teaches children new words by reading their emotional state through a camera, then adjusting its teaching approach accordingly.
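As a rough illustration of that kind of adaptation (the MIT group's actual system is not shown here, and every name and threshold below is invented), a Python tutoring loop might back off to easier words when the child appears frustrated or disengaged:

    # Hypothetical sketch: adapt word difficulty to a camera-derived emotion estimate.
    from dataclasses import dataclass

    @dataclass
    class EmotionEstimate:
        engagement: float   # 0.0 (bored) .. 1.0 (fully engaged)
        frustration: float  # 0.0 (calm)  .. 1.0 (frustrated)

    def choose_next_word(estimate: EmotionEstimate, easy: list, hard: list) -> str:
        """Pick an easier word when the child seems frustrated or disengaged,
        a harder one when they appear engaged and calm (thresholds are assumptions)."""
        if estimate.frustration > 0.6 or estimate.engagement < 0.3:
            return easy[0]   # back off: reinforce a familiar, simpler word
        return hard[0]       # push forward: introduce a more challenging word

    # Example: a frustrated reading leads the robot to fall back to an easy word.
    print(choose_next_word(EmotionEstimate(engagement=0.5, frustration=0.8),
                           easy=["cat"], hard=["caterpillar"]))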

Implementations of Emotion AI vary from company to company, but the core approach remains the same: enable a device to observe and analyze users' emotions so it can respond more intelligently in any given scenario.

"Emotion is a central aspect of the human condition," says Richard Yonck, a futurist and author of "Heart of the Machine: Our Future in a World of Artificial Emotional Intelligence." Yonck says emotion "informs us, it generates values in our interpretation of our world; it plays an enormous role in our social communications and relationships."

Cloverleaf's shelfPoint digital display is a typical implementation. Cloverleaf CEO Gordon Davidson explains that the product pairs an optical sensor with the Affectiva Emotion AI software development kit to detect emotion in real time, on the device. The result is that shelfPoint "is able to look at consumers' faces to identify key landmarks on a face. From these landmarks, we can quickly see anger, sadness, disgust, happiness/joy, surprise, fear, contempt, in addition to standard census information such as male/female, age range, ethnicity."

The shelfPoint product analyzes emotional and related inputs, then reaches into Cloverleaf's database to find and display the most appropriate advertising along the sides of store shelves, Davidson says.
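As a purely hypothetical sketch of that flow (the ad catalog, field names, and scoring bonus below are invented and do not represent Cloverleaf's or Affectiva's actual software), per-frame emotion scores and inferred demographics could drive ad selection like this:

    # Hypothetical sketch: pick the catalog ad whose targeting profile best
    # matches the shopper's detected emotion and inferred age range.
    ads = [
        {"id": "energy_drink_promo", "target_emotion": "surprise", "target_age": "18-34"},
        {"id": "comfort_food_promo", "target_emotion": "sadness",  "target_age": "35-54"},
        {"id": "family_snack_promo", "target_emotion": "joy",      "target_age": "35-54"},
    ]

    def select_ad(emotions, age_range):
        """Score each ad by how strongly the shopper shows its target emotion,
        with a small bonus when the inferred age range matches."""
        def score(ad):
            s = emotions.get(ad["target_emotion"], 0.0)
            if ad["target_age"] == age_range:
                s += 0.2
            return s
        return max(ads, key=score)["id"]

    # Example frame: a mildly sad shopper in the 35-54 range sees the comfort-food ad.
    print(select_ad({"joy": 0.1, "sadness": 0.6, "surprise": 0.05}, age_range="35-54"))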

A great deal of Emotion AI research and development has focused on using cameras to observe facial gestures and body movement, in the quest to identify people's emotions. Emotion AI firm Affectiva is working to enhance those inputs by adding voice analysis to the mix, according to CEO Rana el Kaliouby. She explains voice analysis will give Affectiva's software another way to read the emotions of the people the software is analyzing.

Another Emotion AI firm, TAWNY, is pushing machine perception even further by adding sensing of heart rate and skin changes, according to CEO Michael Bartl. "The idea of TAWNY is to match this data with human conditions such as emotion, stress, mood, level of productivity, or happiness," Bartl says. As with Affectiva's expansion into voice analysis, readings of heart rate and skin changes give TAWNY's AI additional ways to detect what is really going on emotionally with any given subject.
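One common way to combine such signals is late fusion: each modality yields its own estimate, and the estimates are merged with weights. The sketch below is illustrative only; the scales, bounds, and weights are assumptions, not TAWNY's or Affectiva's actual models.

    # Hypothetical sketch of late fusion across face, voice, heart rate, and
    # skin-conductance signals, producing a single 0..1 stress score.
    def fuse_stress(face, voice, heart_rate_bpm, skin_conductance):
        # Normalize raw physiology into rough 0..1 ranges (assumed bounds).
        hr_score  = min(max((heart_rate_bpm - 60) / 60, 0.0), 1.0)   # 60-120 bpm
        eda_score = min(max(skin_conductance / 20.0, 0.0), 1.0)      # 0-20 microsiemens

        weights = {"face": 0.35, "voice": 0.25, "hr": 0.2, "eda": 0.2}
        return (weights["face"] * face + weights["voice"] * voice +
                weights["hr"] * hr_score + weights["eda"] * eda_score)

    # Example: a tense voice and elevated heart rate push the fused score upward.
    print(round(fuse_stress(face=0.4, voice=0.7, heart_rate_bpm=105, skin_conductance=12), 2))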

In the long term, drawing on multiple perspectives on a subject's emotions is expected to let the AI analyzing them grow continually more perceptive about what is truly going on emotionally with any subject, according to Danny Lange, vice president of Artificial Intelligence & Machine Learning at Unity Technologies, which offers a platform for creating two-dimensional, three-dimensional, virtual reality, and augmented reality games and apps. "It will never stop learning and sharing with other instances of the AI," Lange says.

Lange adds, "Of course, these AIs will exchange information about us between themselves to make them even better-performing."

Yonck agrees: "These systems will be able to deliver what we need, often before we're aware of the need ourselves."

In the near term, a number of Emotion AI experts see these breakthroughs making major inroads in the auto industry, where manufacturers are eager to add emotional heft to what has always been a visceral purchase for many. Porsche, Daimler, and BMW are all exploring Affectiva's AI software, along with automotive safety supplier Autoliv, as well as Renovo, which markets the Renovo Aware operating system for automated mobility, according to Yonck.

"Even for autonomous vehicles, it is becoming relevant to know how the passenger feels about the ride, as they (passengers) increasingly value the in-cabin experience over the vehicle's performance," Lange says.

Moreover, adding Emotion AI technology to cars could make the road safer for everyone, especially if the technology could sense that you are "stressed or drowsy," and recommend you drive more cautiously, according to Bartl.

Meanwhile, look for digital assistants like Amazon.com's Alexa to grow ever more intimate with your emotions, Yonck says, and get ready for retailers to build on the initial success of emotion-sensing shopping tools like Cloverleaf's shelfPoint.

"Many startups and major companies are already well underway developing this," Yonck says.

In addition to the aforementioned auto companies, early players in this field include Palo Alto, CA-based Lily AI, which offers AI software that personalizes fashion retailers' Web sites for each visitor, and Element Data of Bellevue, WA, which offers AI-driven decision-making software called Decision Cloud.

Joe Dysart is an Internet speaker and business consultant based in Manhattan, NY, USA.
