In the current issue of the Proceedings of the National Academy of Sciences, researchers at The Ohio State University report that they were able to more than triple the number of documented facial expressions available for cognitive analysis.
“We’ve gone beyond facial expressions for simple emotions like ‘happy’ or ‘sad.’ We found a strong consistency in how people move their facial muscles to express 21 categories of emotions,” said Aleix Martinez, a cognitive scientist and associate professor of electrical and computer engineering at Ohio State. “That is simply stunning. That tells us that these 21 emotions are expressed in the same way by nearly everyone, at least in our culture.”
The resulting computational model will help map emotion in the brain with greater precision than ever before, and perhaps even aid the diagnosis and treatment of mental conditions such as autism and post-traumatic stress disorder (PTSD).
Since at least the time of Aristotle, scholars have tried to understand how and why our faces betray our feelings—from happy to sad, and the whole range of emotions beyond. Today, the question has been taken up by cognitive scientists who want to link facial expressions to emotions in order to track the genes, chemicals, and neural pathways that govern emotion in the brain.
Until now, cognitive scientists have confined their studies to six basic emotions—happy, sad, fearful, angry, surprised and disgusted—mostly because the facial expressions for them were thought to be self-evident, Martinez explained.
But deciphering a person’s brain functioning with only six categories is like painting a portrait with only primary colors, Martinez said: it can provide an abstracted image of the person, but not a true-to-life one.
What Martinez and his team have done is more than triple the color palette—with a suite of emotional categories that can be measured by the proposed computational model and applied in rigorous scientific study.
“In cognitive science, we have this basic assumption that the brain is a computer. So we want to find the algorithm implemented in our brain that allows us to recognize emotion in facial expressions,” he said. “In the past, when we were trying to decode that algorithm using only those six basic emotion categories, we were having tremendous difficulty. Hopefully with the addition of more categories, we’ll now have a better way of decoding and analyzing the algorithm in the brain.”
They photographed 230 volunteers—130 female, 100 male, and mostly college students—making faces in response to verbal cues such as “you just got some great unexpected news” (“happily surprised”), or “you smell a bad odor” (“disgusted”). In the resulting 5,000 images, they painstakingly tagged prominent landmarks for facial muscles, such as the corners of the mouth or the outer edge of the eyebrow, following the same method used by psychologist Paul Ekman, the scientific consultant for the television show “Lie to Me.” Ekman’s Facial Action Coding System, or FACS, is a standard tool in body language analysis.
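To make the annotation concrete, the sketch below shows one way a single tagged photograph could be represented for this kind of analysis. It is a minimal illustration only: the FACS action-unit (AU) codes follow Ekman’s standard numbering, but the class, field names, and values are hypothetical and are not drawn from the study’s data or code.

```python
# Hypothetical sketch of one FACS-annotated photograph: the active action units
# (AUs) plus a few tagged landmark coordinates. AU numbering follows Ekman's
# FACS (e.g., AU6 = cheek raiser, AU12 = lip corner puller); all values here
# are invented for illustration.
from dataclasses import dataclass, field

@dataclass
class AnnotatedFace:
    participant_id: int
    verbal_cue: str        # e.g., "you just got some great unexpected news"
    emotion_label: str     # e.g., "happily surprised"
    action_units: frozenset = frozenset()          # set of active AU numbers
    landmarks: dict = field(default_factory=dict)  # landmark name -> (x, y) pixel coords

example = AnnotatedFace(
    participant_id=17,
    verbal_cue="you just got some great unexpected news",
    emotion_label="happily surprised",
    action_units=frozenset({1, 2, 5, 6, 12, 25}),  # raised brows and lids, raised cheeks, smile, parted lips
    landmarks={"mouth_corner_left": (212.0, 340.5),
               "mouth_corner_right": (298.4, 338.9),
               "eyebrow_outer_left": (180.2, 201.7)},
)
```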
They searched the FACS data for similarities and differences in the expressions, and found 21 emotions: the six basic emotions, plus combinations of them such as “happily surprised” or “sadly angry.”
The researchers referred to these combinations as “compound emotions.” While “happily surprised” can be thought of as an expression for receiving unexpected good news, “sadly angry” could be the face we make when someone we care about makes us angry.
The model was able to measure how consistently each basic and compound emotion was characterized by a particular expression across participants.
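As a rough illustration of what that degree of consistency could mean in practice, the sketch below treats each emotion’s prototypical expression as the most common action-unit configuration among participants and reports the share of participants who match it. This is an assumed simplification for illustration, not the actual model described in the paper.

```python
# Sketch: per-emotion consistency, assuming the "prototype" is simply the most
# frequent action-unit configuration and consistency is the share of
# participants who produced it. The paper's model is more sophisticated.
from collections import Counter

def consistency_by_emotion(faces):
    """faces: iterable of AnnotatedFace-like objects (see the sketch above)."""
    by_emotion = {}
    for face in faces:
        by_emotion.setdefault(face.emotion_label, []).append(face.action_units)

    results = {}
    for emotion, configs in by_emotion.items():
        counts = Counter(configs)                        # frozensets are hashable
        prototype, n_matching = counts.most_common(1)[0]
        results[emotion] = {
            "prototype_aus": sorted(prototype),
            "consistency_pct": 100.0 * n_matching / len(configs),
        }
    return results

# A result such as {"happy": {"prototype_aus": [6, 12], "consistency_pct": 99.0}}
# would correspond to the kind of figures reported below.
```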
For example, the expression for happy is nearly universal: 99 percent of the time, study participants expressed happiness by drawing up the cheeks and stretching the mouth in a smile. Surprise was also easily detected: 92 percent of the time, surprised participants opened their eyes wide and dropped their mouth open.
“Happily surprised” turned out to be a compound of the expressions for “happy” and “surprised.” About 93 percent of the time, the participants expressed it the same way: with the wide-open eyes of surprise and the raised cheeks of happiness—and a mouth that was a hybrid of the two—both open and stretched into a smile.
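One way to picture that hybrid finding is to predict a compound expression as the combination of its components’ action units, as in the sketch below. The prototype sets are illustrative stand-ins under that assumption, not the study’s measurements.

```python
# Sketch: a first-order guess at a compound expression as the union of its
# components' action units. The AU sets below are illustrative, not measured.
BASIC_PROTOTYPES = {
    "happy":     frozenset({6, 12}),        # raised cheeks + smiling mouth
    "surprised": frozenset({1, 2, 5, 26}),  # raised brows, wide-open eyes, dropped jaw
}

def predict_compound(*components):
    prediction = frozenset()
    for name in components:
        prediction |= BASIC_PROTOTYPES[name]
    return prediction

print(sorted(predict_compound("happy", "surprised")))
# -> [1, 2, 5, 6, 12, 26]: the wide-open eyes of surprise, the raised cheeks
#    and smile of happiness, and a mouth that is both open and stretched.
```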
The computer model also gives researchers a tool to understand seemingly contradictory emotions. “Happily disgusted,” for instance, creates an expression that combines the scrunched-up eyes and nose of “disgusted” with the smile of “happy.”
Martinez explained this emotion as “how you feel when you watch one of those funny ‘gross-out’ movies and something happens that’s really disgusting, but you just have to laugh because it’s so incredibly funny.”
While the model is meant to be a tool for basic research in cognition, Martinez can foresee potential applications in the treatment of disorders that involve emotional triggers, such as PTSD, or impaired recognition of other people’s emotions, such as autism.
“For example, if in PTSD people are more attuned to anger and fear, can we speculate that they will be tuned to all the compound emotions that involve anger or fear, and perhaps be super-tuned to something like ‘angrily fearful’? What are the pathways, the chemicals in the brain that activate those emotions? We can make more hypotheses now, and test them,” he said. “Then eventually we can begin to understand these disorders much better, and develop therapies or medicine to alleviate them.”
Coauthors on the study included doctoral students Shichuan Du and Yong Tao. The work was funded in part by the National Institutes of Health.
CITATIONS
Proceedings of the National Academy of Sciences