
Perceptual User Interfaces: Affective Perception

Imagine you have just logged into your new computer, and it is displaying some of its fancy features. It then begins asking you a series of questions. You are in a hurry to get to your email, but it pops up with yet another start-up window to set some option that is not necessary to configure now. You exhale, frown, mutter something under your breath, and proceed to type with a little more speed and intensity.

This scenario is one of many where a computer has caused an affective or emotional response. In this case, it was irritating its most important customer—the user. Despite the mantra of human-computer interaction—to design computers so as not to frustrate the user—computers still irritate, confuse, and annoy a great many people. We can all think of ways these interactions might be redesigned so that they would not frustrate us. Providing a button to delay start-up options, for example, may be ideal for one person; however, that solution may confuse another. There is rarely a one-size-fits-all solution for the growing variety of computer users and interactions.

People have skills for detecting when someone is annoyed or frustrated and for adapting to such affective cues. For example, if a human mentor is helping you with a task, then he or she can generally see when all is going well as opposed to when it might be good to interrupt. Three factors are especially important: perceiving the situation, perceiving affective expression, and knowing how interrupting at such a time was received previously. If, for example, a student is repeatedly doing something wrong (situation) but acting very curious and interested (affect), then the mentor might leave them alone. If, however, their frustration is growing to the point of quitting (same situation, different affect), then it might be good to interrupt. The ultimate strategy involves more than affect perception, but affect perception is critical.
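To make that reasoning concrete, here is a minimal sketch in Python of how a software assistant might weigh those three factors; the function name, thresholds, and inputs are illustrative assumptions, not a prescribed design.

```python
def should_interrupt(task_going_wrong: bool,
                     frustration: float,
                     interest: float,
                     past_interruptions_welcomed: bool) -> bool:
    """Toy heuristic combining the three cues named above: the situation,
    the expressed affect, and how earlier interruptions were received.
    frustration and interest are assumed to be estimates in [0, 1]."""
    if not task_going_wrong:
        return False                      # all is going well; leave the person alone
    if interest > frustration:
        return False                      # struggling but curious and engaged: hold off
    if frustration > 0.8:
        return True                       # near the point of quitting: step in
    return past_interruptions_welcomed    # otherwise, fall back on what worked before
```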

One of the goals of affective computing research is to give computers the ability to help communicate emotion—receiving and sending emotional cues [4]. If the interaction is primarily between you and the computer, then the goal of computer emotion perception is to recognize whether the computer's actions pleased or displeased you, so it can adjust its response more helpfully. This research involves comfortably sensing a user's affective information, reasoning about the situation, and synthesizing a sensitive and respectful response.

A number of labs have built tools that enable affective cues to be communicated directly or indirectly. Emotional valence (liking or disliking) can be directly communicated by clicking on a thumbs-up or thumbs-down icon, or by whacking physical icons of similar appearance, which may appear on the side of the computer or computing appliance. Intensity can be expressed via pressure applied to the mouse or to a physical icon. Valence, intensity, and other aspects of affective state can also be sensed indirectly from visual, auditory, or physiological cues.
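As a rough illustration of the direct channels just described, the sketch below records a thumbs-up or thumbs-down click together with an optional pressure reading; the AffectiveFeedback structure and on_thumb_click handler are hypothetical names, not part of any particular lab's toolkit.

```python
from dataclasses import dataclass
from time import time
from typing import Optional

@dataclass
class AffectiveFeedback:
    """One directly communicated affective cue, tied to the interaction context."""
    valence: int       # +1 for thumbs-up (liking), -1 for thumbs-down (disliking)
    intensity: float   # 0.0-1.0, e.g. normalized pressure on the mouse or physical icon
    context: str       # what the system was doing when the cue arrived
    timestamp: float

def on_thumb_click(direction: str, pressure: Optional[float], context: str) -> AffectiveFeedback:
    """Record a thumbs click; pressure is optional, since not all devices sense it."""
    valence = 1 if direction == "up" else -1
    intensity = pressure if pressure is not None else 0.5  # assume moderate intensity if unknown
    return AffectiveFeedback(valence, intensity, context, time())

# Example: the user whacks the thumbs-down icon hard while a start-up wizard is open.
event = on_thumb_click("down", pressure=0.9, context="startup-options-wizard")
```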

Although facial expression and tone of voice may seem most natural for human affect recognition, it is important to respect the privacy wishes of users, and not to impose such technology. Several users have expressed a preference for giving affective feedback via direct methods such as clicking on an icon or squeezing/hitting something. It would be ironic and irresponsible if affect-sensing technology, built to incorporate user feelings in the interaction, did not respect a user’s feelings about how sensing was conducted.

One of the problems with so many smart features these days is that they sense what you're doing but not how you're doing it. Popular word-processing software can sense that you have misspelled a word, but not how you have tensed your muscles and grumbled as it keeps auto-correcting what you had, in fact, typed correctly. Even a dog senses how its master is responding and associates this feedback with its behavior. An infant senses how something is said long before he or she can understand what was said. To adapt behavior intelligently, living systems first perceive affective feedback.

Emotion plays a role in human perception. If subjects are asked to quickly jot words they hear, then they are more inclined to spell "presents" than "presence" if they are happy, and to spell "banned" than "band" if they are sad [2]. Similar results occur when subjects look at ambiguous facial expressions [1]. A variety of influences of emotion on perception have been described in [3].

Computers might potentially reason about the influence of mood on perception, to help them better predict what a person is likely to perceive. A computer that sees you are in a bad mood may predict that neutral language will be perceived as negative, since a negative mood can bias interpretation of an ambiguous, neutral stimulus toward the negative. The computer might thereby adjust its word choice in a way that would hardly be noticed, except that the communication would seem to have proceeded smoothly.
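One way such an adjustment might look in practice is sketched below; the phrasing table, the mood scale, and the choose_phrasing function are illustrative assumptions rather than a specified mechanism.

```python
# Hypothetical affect-aware phrasing table: a reader in a negative mood tends to hear
# neutral wording as negative, so the system shifts toward explicitly positive phrasing.
PHRASINGS = {
    "file_not_found": {
        "neutral":  "File not found.",
        "positive": "I couldn't find that file yet. Want me to search other folders?",
    },
}

def choose_phrasing(message_key: str, user_mood: float) -> str:
    """Pick wording for a message given an estimated mood in [-1, 1]
    (negative values = bad mood). In a bad mood, avoid wording that could read as curt."""
    variants = PHRASINGS[message_key]
    return variants["positive"] if user_mood < 0 else variants["neutral"]
```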

The ways in which affect is perceived, and in which it influences perception, are manifold and subtle. When they are missing, human-human interaction is severely impaired. To the extent that human-machine interaction is natural and social, machines will likely need affective skills. When the machine is being used as a hammer, there is no need to clutter it with such features; however, when it is functioning as an assistant, helping you handle information overload and other complex tasks that require discerning and adapting to your individual goals, standards, and preferences, affect perception will be a sign of intelligence.

References
    1. Bouhuys, A., Bloem, G.M., and Groothuis, T.G.G. Induction of depressed and elated mood by music influences the perception of facial emotional expression in healthy subjects. J. Affective Disorders 33 (1995), 215–226.

    2. Halberstadt, J.B., Niedenthal, P.M., and Kushner, J. Resolution of lexical ambiguity by emotional state. Psychological Science 6, 5 (Sept. 1995), 278–282.

    3. Mayer, J.D., and Salovey, P. The intelligence of emotional intelligence. Intelligence 17 (1993), 433–442.

    4. Picard, R.W. Affective Computing. The MIT Press, Cambridge, MA, 1997.
