*** This page is archived and provided for reference purposes only ***


NIH Research Matters

November 16, 2009

Words and Gestures Are Translated by Same Brain Regions

Your ability to make sense of Groucho's words and Harpo's pantomimes in an old Marx Brothers movie takes place in the same regions of your brain, according to new research. A better understanding of these brain areas may help in developing treatments for certain language and communication disorders.

[Photo: a mime holding his hand to his ear.]

Sign language is largely processed in the same brain regions as spoken language, including the inferior frontal gyrus in the front left side of the brain and the posterior temporal region toward the back left side of the brain. That isn't surprising, because sign language operates in the same way as spoken language does, with its own vocabulary and rules of grammar. But researchers haven't known if non-language-related gestures—the hand and body movements we use that convey meaning on their own—are also processed in the same brain regions.

Researchers at NIH's National Institute on Deafness and Other Communication Disorders (NIDCD), in collaboration with colleagues from Hofstra University School of Medicine and San Diego State University, explored 2 types of gestures: pantomimes, which mimic objects or actions, such as juggling balls, and emblems, which signify abstract concepts, such as a hand sweeping across the forehead to indicate "it's hot in here!" or a finger to the lips to signify "be quiet."

While inside an MRI scanner, 20 healthy volunteers watched videos of a person either acting out the gesture types or voicing the phrases that the gestures represent. As controls, volunteers also watched clips of the person using meaningless gestures or speaking words that had been chopped and rearranged so the brain wouldn't interpret them as language.

In the online early edition of the Proceedings of the National Academy of Sciences on November 18, 2009, the researchers reported seeing areas uniquely activated for symbolic gesture and spoken language. However, they also saw regions that were highly activated for both gesture and spoken language in the inferior frontal and posterior temporal areas—the long-recognized language regions of the brain.

"If gesture and language were not processed by the same system, you'd have spoken language activating the inferior frontal and posterior temporal areas, and gestures activating other parts of the brain," says Dr. Allen Braun, senior author. "But in fact we found virtual overlap."

This finding, he says, suggests that these brain regions could be the evolutionary starting point from which language originated. "Our results fit a longstanding theory which says that the common ancestor of humans and apes communicated through meaningful gestures and, over time, the brain regions that processed gestures became adapted for using words," Braun says.

"In babies, the ability to communicate through gestures precedes spoken language, and you can predict a child's language skills based on the repertoire of his or her gestures during those early months," says Dr. James F. Battey, Jr., director of NIDCD. "These findings not only provide compelling evidence regarding where language may have come from, they help explain the interplay that exists between language and gesture as children develop their language skills."


Contact Us

E-mail: nihresearchmatters@od.nih.gov

Mailing Address:
NIH Research Matters
Bldg. 31, Rm. 5B64A, MSC 2094
Bethesda, MD 20892-2094

About NIH Research Matters

Editor: Harrison Wein, Ph.D.
Assistant Editors: Vicki Contie, Carol Torgan, Ph.D.

NIH Research Matters is a weekly update of NIH research highlights from the Office of Communications and Public Liaison, Office of the Director, National Institutes of Health.

ISSN 2375-9593

This page last reviewed on December 4, 2012
