Language Development For Infants Includes Lip Reading

January 17, 2012

A new study published in the journal Proceedings of the National Academy of Sciences (PNAS) may give researchers new understanding in language development among infants and may even assist in diagnosing autism spectrum disorders in the future.

The study indicates that infants may be learning language not only through sound, as previously assumed, but also through lip reading, Mikaela Conley reports for ABC News.

Four-month-old infants, like adults, spend more time looking at a speaker's eyes when being spoken to. At age 6 months, however, babies are known to begin shifting from the eye gaze to studying mouths when people talk to them.

It is during that stage when a baby's babbling shifts from seemingly random noises into syllables and eventually into that first “mama” or “dada.”

David Lewkowicz, an expert on infant perceptual development at Florida Atlantic University and lead author of the study, explains, “By this time at 12 months, babies are already producing their first words and have mastered the first sounds and structures of the language.”

“They no longer have to lip-read as they ramp up their first speech patterns and they are free to shift back to the eyes, where you find a great deal of social information. The eyes are the window to the brain, and by looking at the eyes, we are able to know what the other person is thinking and what they want, their desires,” he told Mikaela Conley of ABC News.

The new research offers more evidence that quality face time with your tot is very important for speech development, writes Associated Press (AP) reporter Lauran Neergaard.

“It's a pretty intriguing finding,” University of Iowa psychology professor Bob McMurray, who also studies speech development, explained to Neergaard. The babies “know what they need to know about, and they're able to deploy their attention to what's important at that point in development.”

Lewkowicz wondered whether babies look to the lips for cues, similar to how adults lip-read to decipher what someone's saying at a noisy party. He and doctoral student Amy Hansen-Tift tested nearly 180 babies, groups of them at ages 4, 6, 8, 10 and 12 months.

The researchers showed videos of a woman speaking in English or Spanish to babies of English speakers. A camera mounted on a soft headband tracked where, and for how long, each baby focused its gaze.

When the speaker used English, the 4-month-olds gazed mostly into her eyes. The 6-month-olds spent equal amounts of time looking at the eyes and the mouth. The 8- and 10-month-olds studied mostly the mouth. At 12 months, attention started shifting back toward the speaker's eyes.

Researchers are also excited to apply these findings to the study of autism. A two-year-old child with autism pays more attention to a speaker's mouth, according to past literature on the developmental disorder.

Attention to the mouth is a normal developmental phase during the first year, and the comparison could aid in autism diagnosis at an earlier age.

“Right now, the earliest one can diagnose a child with autism is 18 months, so this could possibly be a way in the future to diagnose infants as early as 12, 13 or 14 months if we find babies are not making a shift back to the eyes around this age,” Lewkowicz told Conley.

“If that is the case, this would be a huge step forward in the development of diagnostic tools for autism because it would be six months earlier than what we can do now. Because the brain is so elastic and there is an enormous proliferation of neuro structure during infancy, if we could pick up these difficulties as early as 12 months, we could begin to intervene at an earlier time and get far better outcomes in children,” he concluded.

Source: RedOrbit Staff & Wire Reports