Many AI teams have bet big on models that can determine people’s emotional state by analyzing their faces. But recent work shows that facial expressions actually provide scant information about a person’s feelings.
What’s new: An in-depth review of the scientific literature shows there are few consistent rules governing how people use their faces to show emotion. For instance, the researchers found that a scowl indicates anger only 30 percent of the time.
What they found: Psychologists from universities in the US and Europe spent two years poring over 1,000 studies, including their own, examining facial expressions of emotion. They identified three categories of shortcomings:
- Researchers don’t adequately account for variability in the emotions people intend to convey and in how others read them. The person making a face might be feeling something completely different from what an observer perceives, and vice versa.
- Many studies assume there’s a consistent, species-wide map connecting certain sets of facial movements to emotional categories. If such a map exists, it contains many more side roads, dead ends, and detours than the literature describes.
- Much research ignores cultural differences. A grin can be pleasant in one society but hostile in another. Also, context matters. A person scowling at a book might be angry at its message or just trying to decipher tricky wordplay.
Behind the news: Most emotion recognition systems are based on 1960s-era research by psychologist Paul Ekman, who argued that humans share a universal vocabulary of facial configurations tied to emotion.
Why it matters: The emotion-detection industry, which spans marketing, security, and road safety, is worth $20 billion, according to The Verge. But there’s more than money at stake. Systems designed for security and safety may not be doing their job because they’re based on faulty research.
We’re thinking: Emotional expressions common to every human may exist. But if so, they’re more nuanced than current systems are designed to handle. Further research will help develop and validate AI's ability to recognize subtle muscle movements and contextual cues.