Hand Gestures Often Help Determine What Is Being Said

June 23, 2014
Image Credit: Ralphot/Thinkstock.com

Alan McStravick for redorbit.com – Your Universe online

The basics of communication rely on a number of individual components. Not only do you need the participants in the communication – the sender and the receiver – but each party has to be able to encode and decode the message. Communication can consist of both verbal and non-verbal cues. Non-verbal cues, such as hand gestures, have long been thought to be aids in the encode/decode process, helping the receiver to infer the intended meaning of the message from the sender. It turns out that actions like a simple gesture may be far more important to communication than we once thought.

From a simple flick of the wrist to raising one’s hand in mock or legitimate frustration, we have all employed the use of gesticulation in our day-to-day communications. However, have you ever caught yourself gesturing in conversation only to realize the person on the other side of the phone line is incapable of seeing the non-verbal cue and incorporating it into their decode process? As it turns out, it is a common human trait that we simply can’t remain still in conversation, even when the person with whom we are communicating is in a completely separate location.

This is “…because gestures and words very probably form a single ‘communication system’, which ultimately serves to enhance expression intended as the ability to make oneself understood,” explained Marina Nespor, a neuroscientist at the International School for Advanced Studies (SISSA) of Trieste.

Nespor, working with fellow researchers Alan Langus, also of SISSA, and Bahia Guellai of the Université Paris Ouest Nanterre La Défense, published the results of their study of gesticulation in communication in the journal Frontiers in Psychology. Their results demonstrate the role of gestures in speech “prosody”.

Prosody, defined by linguists as both the intonation and rhythm of spoken language, would seem to exclude the use of gestures in conversation. Intonation and rhythm are how the simple declarative statement “This is an apple” can swiftly and easily be changed to the surprise question “This is an apple?”

The interesting research out of SISSA challenges the notion of exclusivity of prosody to spoken language, however.

According to Langus, “The prosody that accompanies speech is not modality specific. Prosodic information, for the person receiving the message, is a combination of auditory and visual cues.” He continued, “The superior aspects (at the cognitive processing level) of spoken language are mapped to the motor-programs responsible for the production of both speech sounds and accompanying hand gestures.”

In conducting their study, Nespor, Langus and Guellai employed a cohort of 20 Italian-speaking participants. Each of the 20 subjects was asked to listen to a series of utterances that, depending on the prosody, could be understood in two separate and distinct ways. One of the examples, translated into English, could mean either ‘As you for sure have seen, the old lady blocks the door,’ or, with different intonation and rhythm, ‘As you for sure have seen, the old bar carries it.’ The change in meaning arises from the relationship between the Italian word ‘vecchia’ and ‘sbarra’: depending on the prosody, ‘vecchia’ serves either as the subject of the verb ‘sbarra’ (blocks) or as an adjective modifying the noun ‘sbarra’ (bar).

The study participants were exposed to the individual utterances in two formats, audio-only and a video presentation. The video presentation allowed the participants to also see accompanying gestures along with fluctuations in the intonation and rhythm of the speaker. The researchers, however, presented both matched and mismatched versions in the video. A matched example would present gestures that correspond with the prosody of the spoken sentence. A mismatched example would utilize gestures that actually match the other meaning of the spoken sentence.

“In the matched conditions there was no improvement ascribable to gestures: the participants’ performance was very good both in the video and in the ‘audio only’ sessions. It’s in the mismatched condition that the effect of hand gestures became apparent,” explains Langus. “With these stimuli the subjects were much more likely to make the wrong choice (that is, they’d choose the meaning indicated in the gestures rather than in the speech) compared to matched or audio-only conditions. This means that gestures affect how meaning is interpreted, and we believe this points to the existence of a common cognitive system for gestures, intonation and rhythm of spoken language.”

“In human communication, voice is not sufficient: even the torso and in particular hand movements are involved, as are facial expressions,” concludes Nespor.

Our hand gestures are but another tool in our communication toolbox, one that helps us convey the meaning behind our words more fully and accurately.




