New Computer Model Can Recognize 21 Distinct Emotional Expressions
redOrbit Staff & Wire Reports – Your Universe Online
While research projects tend to focus primarily on the six basic emotions of happiness, surprise, sadness, anger, fear, and disgust, researchers from Ohio State University have developed a method that more than triples the number of documented emotion-related facial expressions available for cognitive analysis.
Writing in the latest edition of the journal Proceedings of the National Academy of Sciences, the OSU research team members explained how they discovered a way for computers to recognize 21 distinct facial expressions, including those that are complex or apparently contradictory in nature (such as “happily disgusted” or “sadly angry”).
“We’ve gone beyond facial expressions for simple emotions like ‘happy’ or ‘sad.’ We found a strong consistency in how people move their facial muscles to express 21 categories of emotions,” said cognitive scientist Aleix Martinez, an associate professor of electrical and computer engineering at the Columbus, Ohio-based university. “That is simply stunning. That tells us that these 21 emotions are expressed in the same way by nearly everyone, at least in our culture.”
The model created by Martinez and his colleagues will reportedly help map emotions in the brain more precisely than previously possible, and could also help diagnose and treat autism, post-traumatic stress disorder (PTSD) and similar mental disorders. Until now, cognitive scientists had limited their research to the six basic emotions because the facial expressions associated with them were believed to be readily apparent and easy to interpret.
However, the OSU investigative team compared studying how the brain works, including how and why our faces reveal our feelings, with only six emotions to painting with only the primary colors: neither approach can produce a lifelike image of its subject. So they set out to expand the metaphorical color palette by developing a wider range of emotional categories that researchers can model using the proposed computer simulation.
“In cognitive science, we have this basic assumption that the brain is a computer. So we want to find the algorithm implemented in our brain that allows us to recognize emotion in facial expressions,” Martinez said. “In the past, when we were trying to decode that algorithm using only those six basic emotion categories, we were having tremendous difficulty. Hopefully with the addition of more categories, we’ll now have a better way of decoding and analyzing the algorithm in the brain.”
He and his colleagues recruited 130 female and 100 male volunteers and photographed them making different faces in response to a variety of verbal cues, such as receiving unexpectedly good news (“happily surprised”) or smelling an unpleasant odor (“disgusted”). In the 5,000 total images, they tagged prominent landmarks for facial muscles – the corners of the mouth or the outer edge of the eyebrow, for example.
They used the Facial Action Coding System method, a body language analysis tool developed by psychologist Paul Ekman, and searched its data for ways in which the expressions were similar and/or different. Ultimately, they determined that there were several different “compound emotions” – a total of 21 combinations of the six basic emotions that are expressed the same way by just about all people.
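The idea of a compound emotion borrowing muscle movements from two basic emotions can be sketched in code. The following is a minimal illustration, not the researchers' actual model: it represents each expression as a set of FACS Action Unit (AU) numbers (AU names follow Ekman's coding system, e.g. AU6 is the cheek raiser and AU12 the lip corner puller), and the prototype sets chosen here are assumptions for demonstration only.

```python
# Illustrative sketch: expressions as sets of FACS Action Units (AUs).
# The prototype AU sets below are assumptions, not taken from the study.
BASIC = {
    "happy":     {6, 12},        # raised cheeks, mouth stretched into a smile
    "surprised": {1, 2, 5, 26},  # raised brows, widened eyes, dropped jaw
    "disgusted": {9, 10},        # wrinkled nose, raised upper lip
}

def is_compound(observed_aus, emotion_a, emotion_b):
    """Crude test: does the observed AU set borrow at least one
    distinctive AU from each constituent basic emotion?"""
    a, b = BASIC[emotion_a], BASIC[emotion_b]
    return bool(observed_aus & (a - b)) and bool(observed_aus & (b - a))

# "Happily surprised": wide-open eyes of surprise plus the raised
# cheeks and smile of happiness.
print(is_compound({1, 2, 5, 6, 12, 25}, "happy", "surprised"))  # True
```

A set-based test like this captures only the article's qualitative point – that compound expressions mix distinctive movements from both constituents – whereas the actual model works from photographed landmark positions.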
According to the university, the model could determine the degree to which particular expressions characterized basic and compound emotions. For example, 99 percent of the time, happiness was expressed by “drawing up the cheeks and stretching the mouth in a smile,” while 92 percent of the time, surprised individuals “opened their eyes wide and dropped their mouth open.”
They also determined the compound emotion identified as “happily surprised” was expressed roughly 93 percent of the time with both “the wide-open eyes of surprise and the raised cheeks of happiness” and “a mouth that was a hybrid of the two – both open and stretched into a smile.”
Likewise, “happily disgusted,” described by the authors as the feeling experienced by someone watching a gross-out comedy in which something happens that is both humorous and somewhat unsettling, combines the “scrunched-up eyes and nose” of disgust with the smile of happiness.
The computer simulation itself is designed to be a tool for basic research in cognition. Martinez believes it could potentially be used to study and treat disorders that involve emotional triggers (PTSD, for instance) or an inability to recognize the emotions of others (such as autism).
“For example, if in PTSD people are more attuned to anger and fear, can we speculate that they will be tuned to all the compound emotions that involve anger or fear, and perhaps be super-tuned to something like ‘angrily fearful’? What are the pathways, the chemicals in the brain that activate those emotions?” he said.
“We can make more hypotheses now, and test them. Then eventually we can begin to understand these disorders much better, and develop therapies or medicine to alleviate them,” added Martinez, who was assisted on the National Institutes of Health-sponsored study by OSU doctoral students Shichuan Du and Yong Tao.
Image 2 (below): Researchers at the Ohio State University have found a way for computers to recognize 21 distinct facial expressions — even expressions for complex or seemingly contradictory emotions such as “happily disgusted” or “sadly angry.” Here, a study participant makes three faces: happy (left), disgusted (center), and happily disgusted (right). Researcher Aleix Martinez described “happily disgusted” as “how you feel when you watch one of those funny ‘gross-out’ movies and something happens that’s really disgusting, but you just have to laugh because it’s so incredibly funny.” The study gives cognitive scientists more tools to study the origins of emotion in the brain. Credit: Courtesy of The Ohio State University