
Researchers Don’t Ask How You Feel, They Use Brain Activity Instead

June 20, 2013
Image Credit: Thinkstock.com


Brett Smith for redOrbit.com – Your Universe Online

Instead of asking someone how they feel, a group of researchers at Carnegie Mellon has found a way to identify a person’s emotion based on brain activity, according to a new report in the journal PLoS ONE.

Previous efforts to view emotions through brain imaging have been hampered both by study participants’ reluctance to honestly report how they feel and by emotional responses that never register consciously.

“This research introduces a new method with potential to identify emotions without relying on people’s ability to self-report,” said lead author Karim Kassam, an assistant professor of social and decision sciences at CMU. “It could be used to assess an individual’s emotional response to almost any kind of stimulus, for example, a flag, a brand name or a political candidate.”

To repetitively and reliably evoke different emotions from participants, the researchers first recruited actors from CMU’s School of Drama.

“Our big breakthrough was my colleague Karim Kassam’s idea of testing actors, who are experienced at cycling through emotional states,” said George Loewenstein, a professor of economics and psychology at the university. “We were fortunate, in that respect, that CMU has a superb drama school.”

While being shown words for nine different emotions (anger, disgust, envy, fear, happiness, lust, pride, sadness and shame), ten actors tried to enter each particular emotional state multiple times while inside an fMRI scanner.

To discount any potential effect of actors deliberately trying to emote, a second phase of the study presented participants with neutral and disgusting photos they had not seen before, in order to elicit a genuine reaction. Computer models trained on the results of the first phase were able to correctly predict, from a participant’s brain activity alone, the emotional content of the photos being viewed.
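The two-phase setup the article describes (train a model on labeled brain scans, then classify scans evoked by unseen stimuli) can be sketched as follows. This is an illustrative toy, not the authors’ code: synthetic voxel patterns stand in for real fMRI data, and scikit-learn’s `LogisticRegression` stands in for whatever model the study actually used.

```python
# Toy sketch of train-on-labeled-trials, predict-on-unseen-stimuli
# emotion classification. All data here are synthetic stand-ins for fMRI.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
emotions = ["anger", "disgust", "envy", "fear", "happiness",
            "lust", "pride", "sadness", "shame"]
n_voxels, trials_per_emotion = 200, 30

# Give each emotion a characteristic (random) activation pattern;
# individual trials are noisy copies of that pattern.
prototypes = rng.normal(size=(len(emotions), n_voxels))
X = np.vstack([proto + rng.normal(size=(trials_per_emotion, n_voxels))
               for proto in prototypes])
y = np.repeat(emotions, trials_per_emotion)

# Phase 1: fit the classifier on the labeled "actor" trials.
clf = LogisticRegression(max_iter=1000).fit(X, y)

# Phase 2: classify a new, unseen trial -- the analogue of predicting
# the emotional content of a never-before-seen photo.
new_trial = prototypes[emotions.index("disgust")] + rng.normal(size=n_voxels)
predicted = clf.predict([new_trial])[0]
print(predicted)
```

The key design point mirrored here is that the model never sees the test stimulus during training; it succeeds only if the emotion leaves a reproducible multi-voxel signature.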

The researchers also found that the computer models achieved comparable accuracy even when restricted to activity from any one of several different regions of the brain.

“This suggests that emotion signatures aren’t limited to specific brain regions, such as the amygdala, but produce characteristic patterns throughout a number of brain regions,” said co-author Vladimir Cherkassky, from CMU’s psychology department.

The research team found that their system was most accurate at identifying happiness and least accurate at identifying envy. It rarely confused positive and negative feelings, and it almost never confused lust with any other emotion, thanks to lust’s especially distinct pattern of neural activity.

“We found that three main organizing factors underpinned the emotion neural signatures, namely the valence of the emotion (positive or negative), its intensity (mild or strong), and its sociality (involvement or non-involvement of another person),” said co-author Marcel Just, a CMU professor of psychology. “This is how emotions are organized in the brain.”

The research team said they plan to use their model on a number of emotion research projects, including identifying the suppression of emotions and experiencing multiple emotions at once, such as feelings of joy and envy felt upon hearing a friend’s good news.

The CMU research is part of the university’s recently launched Brain, Mind and Learning initiative, which is dedicated to innovation in the laboratory and to tackling real-world problems.




