Affectiva Wins NSF Award to Enable Facial Expression Recognition Over the Web
Affectiva, an MIT Media Lab spin-off that makes technologies to measure emotion, today announced that it has won a National Science Foundation (NSF) grant to develop an online version of its technology that enables computers to recognize human expressions and deduce emotional and cognitive states.
Waltham, MA (PRWEB) January 31, 2011
Affectiva, which makes technologies to measure emotion, today announced that it has won a National Science Foundation (NSF) grant to develop an online version of its technology that enables computers to recognize human expressions and deduce emotional and cognitive states.
Affectiva is a Massachusetts Institute of Technology (MIT) Media Lab spin-off founded by two scientists who came together originally to create technology to help people on the autism spectrum better understand emotion. The company is commercializing new emotion measurement technologies to make them broadly accessible and affordable for commercial, clinical and academic use.
The $150,000 NSF grant will fund a six-month project to move Affectiva’s facial expression recognition technology (Affdex) to an Internet cloud platform. Affdex commercializes the MIT FaceSense technology, an NSF-funded, vision-based computational system that reads states such as liking and confusion from facial video using any webcam. FaceSense has received recognition from Wired, New Scientist, The New York Times and more.
The cloud version of Affdex addresses a costly business problem: understanding how people really feel in order to create products or experiences that are engaging and that people want or like. For example, people could opt in to Affdex and turn on a webcam to share their reaction to a video, a game or a website experience with the content’s creators.
Affdex not only allows more accurate understanding of an important aspect of human communication — emotion — it helps democratize emotion research by making it accessible, user-friendly and affordable for large and small organizations. The goal is a technology service that truly transforms the way customers and businesses communicate about product experiences.
“The NSF grant is an important step toward helping us open up the science of emotion measurement and make it massively available,” said Affectiva co-founder Dr. Rana el Kaliouby, who led the invention of the facial expression technology as a researcher at the University of Cambridge and at the MIT Media Lab.
Affectiva is led by chief executive officer Dave Berman, who prior to Affectiva was president of worldwide sales and services at Cisco WebEx. Dr. el Kaliouby co-founded Affectiva in 2009 with MIT Media Lab’s Dr. Rosalind Picard to address the demand from researchers and companies in need of the technology they created through their NSF-funded research.
About the National Science Foundation http://www.nsf.gov
The National Science Foundation (NSF) is an independent federal agency that supports fundamental research and education across all fields of science and engineering. In fiscal year (FY) 2010, its budget was about $6.9 billion. NSF funds reach all 50 states through grants to nearly 2,000 universities and institutions. Each year, NSF receives over 45,000 competitive requests for funding and makes over 11,500 new funding awards. NSF also awards over $400 million in professional and service contracts yearly.
About Affectiva http://www.affectiva.com
Founded in 2009, Affectiva grew out of collaborative research at the MIT Media Lab to help people on the autism spectrum. It applies innovations in affective computing to help understand how people feel in order to improve products and experiences. Affectiva’s customers include leading companies and universities conducting market and clinical research. Its products include the Q™ Sensor wearable biosensor and Affdex facial expression recognition technology. Affectiva is privately held with funding from individuals and the Peder Sager Wallenberg Charitable Trust, represented by Lingfield AB.
# # #
For the original version on PRWeb visit: http://www.prweb.com/releases/prweb2011/01/prweb5019164.htm