Computer Software Accurately Predicts Student Test Performance

April 15, 2014

Study Shows Automatic Recognition of Facial Expressions Can Track Student Engagement in Real Time

SAN DIEGO, April 15, 2014 /PRNewswire/ — Emotient, the leading provider of facial expression recognition data and analysis, and the University of California, San Diego announced publication of a joint study by two Emotient co-founders affiliated with UC San Diego, together with researchers from Virginia Commonwealth University and Virginia State University. The study demonstrates that a real-time engagement detection technology that processes facial expressions can perform with accuracy comparable to that of human observers. The study also revealed that engagement levels were a better predictor of students’ post-test performance than the students’ pre-test scores.

The early online version of the paper, “The Faces of Engagement: Automatic Recognition of Student Engagement,” appeared today in the journal IEEE Transactions on Affective Computing.

“Automatic recognition of student engagement could revolutionize education by increasing understanding of when and why students get disengaged,” said Dr. Jacob Whitehill, Machine Perception Lab researcher in UC San Diego’s Qualcomm Institute and Emotient co-founder. “Automatic engagement detection provides an opportunity for educators to adjust their curriculum for higher impact, either in real time or in subsequent lessons. Automatic engagement detection could be a valuable asset for developing adaptive educational games, improving intelligent tutoring systems and tailoring massive open online courses, or MOOCs.” Whitehill (Ph.D., ’12) recently received his doctorate from the Computer Science and Engineering department of UC San Diego’s Jacobs School of Engineering.

In the study, the researchers trained an automatic detector that measures how engaged a student appears in a webcam video while the student undergoes cognitive skills training on an iPad®. The detector uses automatic expression recognition technology to analyze students’ facial expressions frame by frame and estimate their engagement level.
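To make the frame-by-frame idea concrete, here is a minimal illustrative sketch of scoring engagement from per-frame expression features and smoothing the result over a short window. The feature names, weights, and window size are invented for illustration; they are not Emotient’s actual model or the features used in the study.

```python
# Hypothetical sketch: per-frame engagement scoring from facial
# expression features, smoothed over a sliding window. Feature names
# and weights are illustrative assumptions only.

from collections import deque
from typing import Deque, Dict, List

# Illustrative weights: features assumed to correlate with engagement
# (positive) or disengagement (negative), each intensity in [0, 1].
FEATURE_WEIGHTS = {
    "brow_raise": 0.4,
    "eye_openness": 0.5,
    "gaze_on_screen": 0.8,
    "yawn": -0.9,
    "head_turned_away": -0.7,
}

def frame_score(features: Dict[str, float]) -> float:
    """Weighted sum of expression feature intensities for one frame."""
    return sum(FEATURE_WEIGHTS[name] * value
               for name, value in features.items()
               if name in FEATURE_WEIGHTS)

def smoothed_engagement(frames: List[Dict[str, float]],
                        window: int = 3) -> List[float]:
    """Moving average of per-frame scores to reduce frame-level noise."""
    recent: Deque[float] = deque(maxlen=window)
    smoothed = []
    for features in frames:
        recent.append(frame_score(features))
        smoothed.append(sum(recent) / len(recent))
    return smoothed

# Example: a student looks attentive, then yawns and looks away.
video = [
    {"eye_openness": 0.9, "gaze_on_screen": 1.0},
    {"eye_openness": 0.8, "gaze_on_screen": 1.0, "yawn": 0.6},
    {"eye_openness": 0.4, "head_turned_away": 0.9},
]
scores = smoothed_engagement(video)
print(scores)  # engagement trend declines across the three frames
```

A real system would replace the hand-set weights with a classifier trained on human engagement annotations, as the study describes, but the pipeline shape (per-frame features → per-frame score → temporal smoothing) is the same.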

“This study is one of the most thorough to date in the application of computer vision and machine learning technologies for automatic student engagement detection,” said Dr. Javier Movellan, co-director of the Machine Perception Lab at UC San Diego and Emotient co-founder and lead researcher. “The possibilities for its application in education and beyond are tremendous. By understanding what parts of a lecture, conversation, game, advertisement or promotion produced different levels of engagement, an individual or business can obtain valuable feedback to fine-tune the material to something more impactful.”

In addition to Movellan and Whitehill, the study’s authors include Dr. Zewelanji Serpell, professor of developmental psychology at Virginia Commonwealth University, as well as Yi-Ching Lin and Dr. Aysha Foster from the department of psychology at Virginia State University.

About Emotient – Emotion Recognition Technology (www.emotient.com)
Emotient, Inc., is the leading authority in facial expression analysis. Emotient’s software translates facial expressions into actionable information, thereby enabling companies to develop emotion-aware technologies and to create new levels of customer engagement, research, and analysis. Emotient’s facial expression technology is currently available as an API for Fortune 500 companies within consumer packaged goods, retail, healthcare, education and other industries.

Emotient was founded by a team of six Ph.D.s from the University of California, San Diego, who are the foremost experts in applying machine learning, computer vision and cognitive science to facial behavioral analysis. Its proprietary technology sets the industry standard for accuracy and real-time delivery of facial expression data and analysis. For more information on Emotient, please visit www.emotient.com.

Photo – http://photos.prnewswire.com/prnh/20140415/73302

SOURCE Emotient


Source: PR Newswire
