Study Reveals Language Influences What We See

August 27, 2013
Image Credit: Thinkstock.com

April Flowers for redOrbit.com – Your Universe Online

People naturally assume the sense of sight takes in the world as it is, simply passing on what the eyes collect from light reflected by the objects around us. The truth is more complicated, however, as the eyes do not work alone. Our vision is a function not only of incoming visual information but also of how that information is interpreted in light of prior visual experience. A new study from the University of Wisconsin-Madison and Yale University suggests that language may influence our sight as well.

University of Wisconsin–Madison cognitive scientist and psychology professor Gary Lupyan, and Emily Ward, a Yale University graduate student, demonstrate that words can play a powerful role in what we see. Their findings were published in a recent issue of Proceedings of the National Academy of Sciences (PNAS).

“Perceptual systems do the best they can with inherently ambiguous inputs by putting them in context of what we know, what we expect,” Lupyan says. “Studies like this are helping us show that language is a powerful tool for shaping perceptual systems, acting as a top-down signal to perceptual processes. In the case of vision, what we consciously perceive seems to be deeply shaped by our knowledge and expectations.”

Lupyan says these expectations can be altered with a single word.

The researchers used a technique called continuous flash suppression to render a series of objects invisible for a group of volunteers. This allowed them to show how deeply words influence perception.

Each participant was shown a picture of a familiar object in one eye. The objects included items such as a chair, a pumpkin or a kangaroo. In their other eye, the participants were shown a series of flashing, “squiggly” lines.

“Essentially, it’s visual noise,” Lupyan says. “Because the noise patterns are high-contrast and constantly moving, they dominate, and the input from the other eye is suppressed.”

Each study participant heard one of three things immediately before seeing the images: the word for the suppressed object (“pumpkin,” when the object was a pumpkin), the word for a different object (“kangaroo,” when the object was actually a pumpkin), or just static.

The participants were asked to indicate whether or not they saw something. The researchers found that when the word participants heard matched the object being suppressed by the visual noise, they were more likely to report seeing something than when the wrong word, or no word at all, was paired with the image.

“Hearing the word for the object that was being suppressed boosted that object into their vision,” Lupyan says.

Hearing a word that did not match the suppressed image hurt participants' chances of seeing an object.

“With the label, you’re expecting pumpkin-shaped things,” Lupyan says. “When you get a visual input consistent with that expectation, it boosts it into perception. When you get an incorrect label, it further suppresses that.”

Continuous flash suppression has been shown to interrupt sight so thoroughly that the suppressed objects leave no trace of having been perceived, even implicitly.

“Unless they can tell us they saw it, there’s nothing to suggest the brain was taking it in at all,” Lupyan says. “If language affects performance on a test like this, it indicates that language is influencing vision at a pretty early stage. It’s getting really deep into the visual system.”

The new study reveals a deeper connection between language and simple sensory perception than previously thought. This connection made the researchers wonder about the extent of language’s power. They suggest the influence of language may extend to other senses as well.

“A lot of previous work has focused on vision, and we have neglected to examine the role of knowledge and expectations on other modalities, especially smell and taste,” Lupyan says.

“What I want to see is whether we can really alter threshold abilities,” he says. “Does expecting a particular taste, for example, allow you to detect a substance at a lower concentration?”

For example, Lupyan says that if you are drinking a glass of milk while thinking about orange juice, your thoughts might change the way you experience the milk.

“There’s no point in figuring out what some objective taste is,” Lupyan says. “What’s important is whether the milk is spoiled or not. If you expect it to be orange juice, and it tastes like orange juice, it’s fine. But if you expected it to be milk, you’d think something was wrong.”
