Read My Lips – Genetic Algorithm ‘Teaches’ Computers To Read Lips
redOrbit Staff & Wire Reports – Your Universe Online
Researchers in Malaysia are teaching a computer to interpret human emotions based on lip patterns in order to improve the way people interact with computers.
The research, published in the International Journal of Artificial Intelligence and Soft Computing, could also allow disabled people to use computer-based communication devices, such as voice synthesizers, more effectively.
“In recent years, there has been a growing interest in improving all aspects of interaction between humans and computers, especially in the area of human emotion recognition by observing facial expression,” said the researchers, led by Karthigayan Muthukaruppan of Manipal International University in Selangor, Malaysia.
The scientists developed their system using a genetic algorithm that, over successive iterations, matches irregular ellipse fitting equations to the shape of a human mouth displaying different emotions. The team used photos of individuals from South-East Asia and Japan to train the computer to recognize the six commonly accepted human emotions (happiness, sadness, fear, anger, disgust and surprise) as well as a neutral expression. The algorithm models the upper and lower lip as two separate ellipses.
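The paper's exact fitness function and encoding are not given in the article, but the core idea of evolving ellipse parameters to fit a lip contour can be sketched with a simple genetic algorithm. Everything below is a hypothetical illustration: the parameterization (center, semi-axes), the truncation selection, the averaging crossover, and the Gaussian mutation are assumptions, not the researchers' actual method.

```python
import math
import random

def ellipse_error(params, points):
    """Mean squared deviation of contour points from the ellipse
    (x-cx)^2/a^2 + (y-cy)^2/b^2 = 1 (lower is a better fit)."""
    cx, cy, a, b = params
    err = 0.0
    for x, y in points:
        v = ((x - cx) / a) ** 2 + ((y - cy) / b) ** 2
        err += (v - 1.0) ** 2
    return err / len(points)

def fit_ellipse_ga(points, pop_size=60, generations=200, seed=0):
    """Evolve ellipse parameters (cx, cy, a, b) to fit lip-contour points.
    A toy GA: truncation selection, averaging crossover, Gaussian mutation."""
    rng = random.Random(seed)
    # Random initial population of candidate ellipses.
    pop = [[rng.uniform(-1, 1), rng.uniform(-1, 1),
            rng.uniform(0.5, 3.0), rng.uniform(0.1, 2.0)]
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=lambda p: ellipse_error(p, points))
        survivors = pop[:pop_size // 2]          # keep the fitter half
        children = []
        while len(survivors) + len(children) < pop_size:
            p1, p2 = rng.sample(survivors, 2)
            child = [(u + v) / 2 for u, v in zip(p1, p2)]  # crossover
            child[rng.randrange(4)] += rng.gauss(0, 0.1)   # mutation
            child[2] = max(child[2], 1e-3)  # keep semi-axes positive
            child[3] = max(child[3], 1e-3)
            children.append(child)
        pop = survivors + children
    return min(pop, key=lambda p: ellipse_error(p, points))

# Synthetic "lower lip" contour: a wide, flat ellipse (a=2, b=0.5).
contour = [(2.0 * math.cos(t), 0.5 * math.sin(t))
           for t in [i * 2 * math.pi / 40 for i in range(40)]]
best = fit_ellipse_ga(contour)
```

In the study's setting, one such fit would be run per lip (upper and lower), and the resulting ellipse parameters across the training photos would feed the emotion classifier.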
The work was based on previous studies that provided an understanding of how emotion can be recreated by manipulating a representation of the human face on a computer screen. The research is currently informing the development of more realistic animated actors, and even the behavior of robots. However, the inverse process in which a computer recognizes the emotion behind a real human face remains a challenge.
It is widely known that many deeper emotions involve more than simple movements of the mouth. For instance, a genuine smile involves the flexing of muscles around the eyes, and eyebrow movements are almost universally essential to the subconscious interpretation of a person’s feelings. However, the lips remain a critical part of the outward expression of emotion.
In the current study, the researchers’ algorithm successfully classified all seven expressions: the six emotions plus the neutral face. The researchers suggested that an initial application of such an emotion detector could be helping patients who lack speech to interact more effectively with computer-based communication devices.