
Communications of the ACM

ACM News

Humans and Algorithms Struggle to Read Emotions When Faces Are Obscured

[Image: A woman in a face mask overlaid with emotion-based facial recognition points.]

The performance of artificial systems degrades sharply when categorizing emotion in natural images; even the sun's angle or shading can change the outcome.

A recent study shows that both algorithms and humans struggle to accurately identify emotions when people's faces are partially obscured by face masks or sunglasses, but artificial systems are more likely to misinterpret emotions in unusual ways.

The study presented images of people displaying various emotional facial expressions while wearing different coverings: sunglasses and two types of masks, the full mask used by frontline workers and a recently introduced mask with a transparent window that allows lip reading. Accuracy for both people and artificial systems varied with the type of covering. For instance, sunglasses made fear difficult for people to recognize, while partial masks helped both people and artificial systems correctly identify happiness. Interestingly, artificial systems recognized emotions significantly better than people when the face was uncovered, scoring 98.48% versus 82.72% across seven emotion categories.
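The comparison above boils down to computing recognition accuracy separately for each covering condition. The sketch below is purely illustrative (it is not the study's code, and the trial data, condition names, and emotion labels are hypothetical); it shows how per-condition accuracy figures like those reported could be tallied from individual trials.

```python
# Hypothetical sketch, not the study's actual code: tallying emotion-
# recognition accuracy per covering condition from individual trials.
from collections import defaultdict

def accuracy_by_condition(trials):
    """trials: iterable of (condition, true_emotion, predicted_emotion)."""
    correct = defaultdict(int)
    total = defaultdict(int)
    for condition, true_label, predicted in trials:
        total[condition] += 1
        if predicted == true_label:
            correct[condition] += 1
    # Fraction of correct predictions within each condition.
    return {c: correct[c] / total[c] for c in total}

# Toy data only; emotions and counts are illustrative.
trials = [
    ("uncovered", "happiness", "happiness"),
    ("uncovered", "fear", "fear"),
    ("sunglasses", "fear", "surprise"),  # fear misread with eyes hidden
    ("full_mask", "happiness", "happiness"),
]
print(accuracy_by_condition(trials))
# → {'uncovered': 1.0, 'sunglasses': 0.0, 'full_mask': 1.0}
```

Grouping by condition this way makes it easy to see, for example, that a covering hiding the eyes hurts fear recognition while leaving happiness largely intact.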

From The Conversation