Similar Brain Glitch Found in Slips of Signing, Speaking
The discovery of a common neural mechanism behind speech and sign errors, one that occurs in just 40 milliseconds, could improve recovery in deaf signers after a stroke.
When we speak, we give little thought to how the words form in our brain before we say them. It's similar for deaf people using sign language. Speaking and signing come naturally, except when we stumble over words or swap one word for another because we speak or sign too quickly, are tired or preoccupied.
Both fluency and the occasional disfluency arise from how we choose what to say or sign: a neural mechanism in our brains operates as we make decisions and monitor how we communicate.
It's this mechanism that fascinates San Diego State University researchers Stephanie Ries and Karen Emmorey in the School of Speech, Language and Hearing Sciences. Their analysis could help inform rehabilitation therapy for those relearning how to speak or sign after a stroke.
Using electroencephalogram (EEG) recordings, they studied how hearing and deaf signers process the act of signing and found the same monitoring mechanism at work in the brains of both groups. Among deaf signers, it was more pronounced in those for whom American Sign Language (ASL) is their first language.
"When we are doing an action, whether it's speaking, signing, pressing buttons or typing, we see the same mechanism," Ries said. "Any time we are making a decision to do something, this neural mechanism comes into play."
Their study, published by MIT Press in the Journal of Cognitive Neuroscience on April 30, may advance our understanding of how deaf individuals recover their ability to sign after a traumatic brain injury or stroke that causes aphasia: the inability to understand others or express themselves due to brain damage.
"When stroke victims are more aware of their speech errors and have a better functioning speech monitoring mechanism, they have a better chance of recovering than those who don't have that awareness," Ries said. "This study helped us extend that understanding to signing ability for deaf people."
Melding speech with sign language expertise
The work also represents a long-held dream to combine the skills and training of two researchers with niche expertise in complementary fields: speech monitoring and sign monitoring.
Ries is an assistant professor specializing in the neuroscience of speech and language disorders. She first met Emmorey at a workshop on language production in 2007, when Ries was a Ph.D. student in Marseille. Emmorey, a distinguished professor, sign language expert and director of the Laboratory for Language and Cognitive Neuroscience at San Diego State University, presented a study about sign monitoring that sparked an abiding interest in Ries, who wanted to work with her. When they crossed paths at another conference five years ago, Emmorey urged her to apply for the assistant professorship at SDSU, and they eventually began working together.
"I've always been interested in what inner signing would be like, and if it's similar to inner speech," said Emmorey, the study's senior author. "It's an internal process. When you speak, you can hear yourself. But if you're signing, are you seeing yourself like in a mirror, or is it a mental image of you signing, or a motor representation so you can feel how you sign?" These were the underlying aspects of signing no one quite understood, and it has long been Emmorey's goal to tease them apart so we truly understand what sign language processing is like. Knowing this will help sign language educators figure out the best learning strategy for signers, much like the techniques used to teach hearing people foreign languages.
Ries was already working on speech monitoring with hearing people in France, so when she joined SDSU, the two researchers combined their expertise to study sign monitoring in hearing and deaf people.
Monitoring for self-editing
They used EEG data recorded from 21 hearing signers and 26 deaf signers in the Neurocognition Lab of Philip Holcomb and Katherine Midgley, colleagues in the psychology department. The participants were shown pictures to identify by signing while wearing a 32-channel EEG cap with tin electrodes to monitor the mechanism behind signing.
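In studies of this kind, the multi-channel EEG is typically cut into epochs time-locked to each naming response and averaged across trials into an event-related potential (ERP). The following is a minimal, illustrative sketch of that averaging step using synthetic data; the sampling rate, trial counts, and baseline window are assumptions for the example, not values from the study:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic EEG: 100 trials x 32 channels x 500 samples (1 s at 500 Hz),
# time-locked so sample 250 corresponds to response onset (t = 0 ms).
fs = 500  # assumed sampling rate in Hz
n_trials, n_channels, n_samples = 100, 32, 500
eeg = rng.normal(0.0, 5.0, size=(n_trials, n_channels, n_samples))

# Baseline-correct each trial and channel using the 200 ms before onset
# (samples 150-249), so slow drifts do not contaminate the average.
baseline = eeg[:, :, 150:250].mean(axis=2, keepdims=True)
eeg = eeg - baseline

# Average across trials: the response-locked ERP, one waveform per channel.
erp = eeg.mean(axis=0)
print(erp.shape)  # (32, 500)
```

Averaging works because activity consistently time-locked to the response survives, while activity that varies from trial to trial tends toward zero.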
"We wanted to study sign monitoring in-depth to understand the underlying mechanism and whether it's universal," Ries said. "Before people start to sign, you see this component rising, and we observed it happen with hearing signers as well, except it wasn't as clear."
This difference was possibly because deaf signers were more proficient in ASL than hearing signers. It's important to note that both deaf and hearing signers are bilingual in English and ASL, though ASL is the dominant language for deaf signers.
"When we're speaking, we catch ourselves when we are about to make an error. That's thanks to this monitoring process, which is located in the medial frontal cortex of the brain," Ries said. "It peaks 40 milliseconds after you begin speaking, so it's extremely fast. We make an error because we may not have selected the right word when semantically related words are competing in your brain."
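A latency like the 40 milliseconds mentioned above is measured by locating the extremum of the averaged ERP waveform within a short window after response onset. A minimal sketch with synthetic data; the channel, waveform shape, and all numbers here are illustrative assumptions, not values from the study:

```python
import numpy as np

fs = 500  # assumed sampling rate in Hz
t = (np.arange(500) - 250) / fs * 1000  # time axis in ms; 0 = response onset

# Illustrative ERP at one medial-frontal channel: a negative deflection
# built to peak near 40 ms after onset, plus low-amplitude noise.
rng = np.random.default_rng(1)
erp = -3.0 * np.exp(-((t - 40.0) ** 2) / (2 * 15.0 ** 2))
erp += rng.normal(0.0, 0.1, size=t.shape)

# Find the latency of the most negative point in a 0-100 ms search window.
window = (t >= 0) & (t <= 100)
peak_latency = t[window][np.argmin(erp[window])]
print(peak_latency)  # close to 40 ms by construction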
Words that share similar meanings such as 'oven' and 'fridge' or names may be switched in the brain (e.g., swapping your children's names). Other times, syllables get transposed.
Such errors can happen in signing too: signs for different words get mixed up, or an incorrect handshape is swapped for the intended one. This indicates that signers are actually assembling phonological units during language production, much as speakers assemble the phonemes in a spoken word.
"Learning how sign production is represented in the brain will help us understand sign language disorders, and if a signer needs epilepsy surgery, we will know which part of the brain processes sign," Emmorey said.
The study's co-authors include Linda Nadalet and Soren Mickelson, who were master's students in speech-language pathology, and Megan Mott, who was a master's student in psychology.
"It was a really great way to get students from different labs to work together, and it was a very good collaborative effort among students and principal investigators," Ries said. She is now studying speech monitoring for patients with epilepsy.
Funding came from a grant from the SDSU Center for Cognitive and Clinical Neuroscience, a center of excellence designed to encourage interdisciplinary collaborations across campus. Emmorey and Ries are also funded by grants from the National Institute on Deafness and Other Communication Disorders within the National Institutes of Health.