What sign language has taught us about our brains

By Sana Suri

American Sign Language is more syntactically similar to spoken Japanese than it is to spoken English. Is there a difference in the way that our brains process signed and spoken languages?


I have recently completed Level 1 of British Sign Language (BSL) at Deaf Direct, Oxford, and over the course of the year I found that mnemonics greatly simplified my learning. For example, to sign the colour blue you use the fingers of your right hand to rub the back of your left hand; my simple mnemonic is that the veins on the back of the hand appear blue. I was therefore forming an association between the word blue (English), the sign for blue (BSL), and the visual aid that links the two. However, the two languages differ markedly in their modalities, with English being a spoken, auditory-driven language and BSL a visual one, and I was curious whether our brains process spoken and signed languages in the same way.

It seems that for the most part, they do.

Traditional theories of language processing point to two regions in the left hemisphere of the brain that were once thought to be chiefly responsible for producing and understanding spoken language – Broca’s area and Wernicke’s area (depicted in the title figure). However, we now know that other parts of the brain also contribute to our ability to communicate, and that all these structures work together, rather than individually, to do so. Assigning complex functions to specific brain regions is an oversimplification that paints only half the picture, and studying sign language has given us a more complete understanding of the intricate way in which our brains perceive language.

The evidence for this comes from two kinds of studies – lesion analyses, which examine the functional consequences of damage to brain regions involved in language, and neuroimaging, which explores how these regions are engaged in processing language.

Lesion studies

Lesion studies in hearing people have shown that damage to Broca’s area, which is located near the part of the motor cortex that controls the mouth and lips, usually gives rise to difficulties in the production of speech, but doesn’t adversely affect one’s ability to communicate or understand conversation. So, a hearing person with a lesion in Broca’s area may not be able to form fluid sentences, but they could still use single words and short phrases, and possibly nod or shake their head to gesture their responses.

The comprehension of speech, on the other hand, is largely believed to be processed within Wernicke’s area, which is located near the auditory cortex – the part of the brain that receives signals from the ears. Hearing people with Wernicke’s aphasia are usually fluent in producing speech, but may make up words (for example, “cataloop” for “caterpillar”) and speak in long sentences that have no meaning.

However, if Broca’s area is involved solely in the production of speech, and Wernicke’s area in understanding speech sounds, then we might expect that visual languages like sign language remain unaffected when these areas are damaged.

Surprisingly, they do not.

One of the seminal studies in this field was by award-winning husband and wife team Edward Klima and Ursula Bellugi at the Salk Institute. Together with Gregory Hickok, they found that Deaf signers who had lesions in left hemisphere “speech centres” like Broca’s and Wernicke’s areas produced significantly more sign errors on naming, repetition and sentence-comprehension tasks than signers with damage to the right hemisphere. Given that the right hemisphere is more involved in visuo-spatial functions than the left, these findings may have seemed surprising at first. However, the study has since been replicated, and taken together, the literature suggests that despite the differences in modality, signed and spoken languages show similar patterns of left lateralization in the brain.

This is not to say that the right hemisphere plays no part in producing and comprehending sign language, but these pioneering findings show that damage to the left hemisphere disrupts signed and spoken languages alike.

Neuroimaging studies

Functional neuroimaging studies have for the most part corroborated the early findings of lesion studies. Despite the fundamental differences in input/output modes for signed and spoken languages, there are common patterns of brain activation when Deaf and hearing people process language.

For instance, mirroring their respective roles in the production and perception of speech, Broca’s area is also activated when producing signs, and Wernicke’s area during the perception of sign language.


This picture, taken from a study by MacSweeney and colleagues, shows the regions activated by BSL perception in Deaf and hearing signers (first and second column) and those activated by English speech perception in hearing non-signers (third column). Of course, the differences in modality between the two languages are reflected in these patterns of activation: there is greater activation in visual areas (at the back of the brain) for Deaf and hearing people watching BSL, and greater auditory activation in hearing people listening to English. But for the most part, the recruitment of Broca’s and Wernicke’s areas, as well as the strong pattern of left lateralization, seems to be similar for the perception of the two languages.

Because the left hemisphere controls actions of the right hand, there were initial concerns that left hemisphere involvement in sign language production was simply a reflection of hand preference, and not really an indication of language processing at all. These concerns were soon allayed when a study published in the Journal of Cognitive Neuroscience showed that Broca’s area was activated in the same way whether Deaf signers used their dominant right hand or their non-dominant left hand to generate verb signs.

Busting some myths

Most importantly, these lesion and neuroimaging studies have helped clarify two facts. First, language is not limited to hearing and speech: sign languages are complex linguistic systems processed much like spoken languages. Second, they cemented our growing reservations about oversimplified theories of language perception. The involvement of Broca’s and Wernicke’s areas in processing sign language meant that we could no longer think of them exclusively as centres for producing speech and hearing sound, but rather as higher-order language areas in the brain.

Contrary to the common misconception, there is no universal sign language. According to a recent estimate, there are 138 distinct sign languages in the world today, each with structured syntax, grammar, and even regional accents. It is unfortunate, then, that a significant proportion of the global Deaf community is still battling for legal recognition of these languages.

Sign language is sometimes misguidedly looked upon as a “disability” language and simply a visual means of communicating spoken language, when in fact its linguistic construction is almost entirely independent of spoken language. For instance, American and British Sign Language are mutually incomprehensible, even though the hearing people of Britain and America predominantly share the same spoken language.

Knowledge of how sign languages are processed in the brain has not only furthered our understanding of the brain itself, but has also played a part in quashing the once widely believed notion that these signs were simply a loose collection of gestures strung together to communicate spoken language.

Following convention, “Deaf” has been capitalized to refer to culturally Deaf people.

An edited version of this article has been republished on The Conversation UK.

Sana Suri
Sana is a Neuroscience PhD student in the Department of Psychiatry, University of Oxford. She studies genetic risk factors for Alzheimer’s disease. Tweets from @sanasuri
