
Humans and Algorithms Struggle to Read Emotions When Faces Are Obscured

By The Conversation

August 12, 2021



A recent study shows that both algorithms and humans struggle to accurately identify emotions when people's faces are partially obscured by face masks or sunglasses, but artificial systems are more likely to misinterpret emotions in unusual ways.

The study presented images of people displaying various emotional facial expressions while wearing sunglasses or one of two types of masks: the full mask used by frontline workers, and a recently introduced mask with a transparent window that allows lip reading. Accuracy for both people and artificial systems varied with the type of covering. For instance, sunglasses made fear harder for people to recognize, while partial masks helped both people and artificial systems correctly identify happiness. Interestingly, when faces were uncovered, artificial systems significantly outperformed people at recognizing emotions, achieving 98.48% accuracy versus 82.72% across seven different emotion types.

