Researchers at Universidad Carlos III de Madrid (UC3M) and Universidad de Granada (UGR) have developed a computer system that automatically recognizes the emotional state of a person who is speaking to it.
"Thanks to this new development, the machine will be able to determine how the user feels and how [he or she] intends to continue the dialogue," says UC3M professor David Grill.
The system focuses on the emotions of anger, boredom, and doubt by analyzing 60 acoustic parameters, including tone of voice, speed of speech, duration of pauses, and the energy of the voice signal. In addition, information about how the dialogue has developed is used to adjust the probability that the user is in one emotional state or another.
"We have developed a statistical method that uses earlier dialogues to learn what actions the user is most likely to take at any given moment," the researchers say.
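The article does not describe the statistical model in detail, but the idea of combining acoustic evidence with a prior learned from earlier dialogues can be sketched as a simple naive-Bayes-style classifier. Everything below is illustrative: the three features, their per-state Gaussian parameters, and the prior values are hypothetical stand-ins for the 60 acoustic parameters and learned dialogue model the researchers actually use.

```python
import math

# Hypothetical per-state Gaussian models for a few acoustic features.
# The real system uses 60 acoustic parameters learned from data;
# these values are invented for illustration only.
STATE_MODELS = {
    # state: {feature: (mean, std)}
    "anger":   {"pitch_hz": (220.0, 30.0), "speech_rate": (5.5, 0.8), "pause_s": (0.2, 0.1)},
    "boredom": {"pitch_hz": (140.0, 20.0), "speech_rate": (3.0, 0.6), "pause_s": (0.8, 0.3)},
    "doubt":   {"pitch_hz": (170.0, 25.0), "speech_rate": (3.8, 0.7), "pause_s": (0.6, 0.25)},
}

def log_gauss(x, mean, std):
    """Log-density of a Gaussian, used as a per-feature score."""
    return -0.5 * math.log(2 * math.pi * std * std) - (x - mean) ** 2 / (2 * std * std)

def classify(features, dialogue_prior):
    """Pick the most likely emotional state by combining acoustic
    likelihoods with a prior derived from the dialogue history.
    A sketch of the general technique, not the authors' exact model."""
    scores = {}
    for state, model in STATE_MODELS.items():
        score = math.log(dialogue_prior[state])
        for name, value in features.items():
            mean, std = model[name]
            score += log_gauss(value, mean, std)
        scores[state] = score
    return max(scores, key=scores.get)

# Example: fast, high-pitched speech with short pauses, after a dialogue
# turn that raised the prior probability of anger.
features = {"pitch_hz": 210.0, "speech_rate": 5.2, "pause_s": 0.25}
prior = {"anger": 0.5, "boredom": 0.25, "doubt": 0.25}
print(classify(features, prior))  # -> anger
```

The dialogue prior is what the quoted statistical method would supply: earlier dialogues shift the prior toward states the user is likely to be in at that point in the conversation, and the acoustic evidence then refines that estimate.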
From Universidad Carlos III de Madrid
Abstracts Copyright © 2011 Information Inc., Bethesda, Maryland, USA