How our brain processes sign language

Sign language mainly involves the hands and fingers, but also the upper body, head, gaze direction, eyebrows and mouth. © FluxFactory/iStock

Sign language is significantly slower than spoken language and relies on visual rather than auditory signals. Yet the brain processes both kinds of input in surprisingly similar ways, a new study shows: neuronal activity in the brain areas involved in language processing synchronizes with the temporal sequence of speech sounds or signing movements.

Deaf people communicate in various sign languages. Information is transmitted not through vocalizations but through the simultaneous movement of several body parts – mainly the hands and fingers, but also the upper body, head, gaze direction, eyebrows and mouth. On average, signers produce about two movement signals per second, making sign language significantly slower than spoken language. But how exactly does our brain translate this visual information into words? How does language processing change with practice? And are there parallels to the processing of spoken language?

The temporal synchronization involved delta brain waves in the frequency range between 0.5 and 2.5 hertz. © Andreus/iStock

Spanish and Russian sign languages compared

Researchers led by Chiara Rivolta from the Basque Center for Cognition, Brain and Language in Donostia-San Sebastián have now examined more closely how the human brain processes sign language. To do this, they recorded videos of short stories in Spanish and Russian sign language and had 28 hearing test subjects watch them. Half of the participants knew Spanish Sign Language; the other half knew neither language. While the subjects watched, the researchers recorded their brain activity using magnetoencephalography. They also analyzed the signers' subtle movements in the videos.
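To get a feel for what such a movement analysis involves, here is a minimal Python sketch that derives a coarse global motion signal from a video by frame differencing. This is purely illustrative: the article does not describe the team's actual motion-tracking pipeline, and the function name and approach here are assumptions.

# Hedged sketch: a coarse "movement signal" from a sign-language video.
# The study's real motion analysis is not detailed in this article;
# simple frame-to-frame differencing stands in as an illustration.
import cv2
import numpy as np

def movement_signal(video_path):
    """Return per-frame global motion energy and the video's frame rate."""
    cap = cv2.VideoCapture(video_path)
    fps = cap.get(cv2.CAP_PROP_FPS)
    prev = None
    energy = []
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY).astype(np.float32)
        if prev is not None:
            # Mean absolute pixel change: a crude proxy for hand/body motion.
            energy.append(float(np.mean(np.abs(gray - prev))))
        prev = gray
    cap.release()
    return np.asarray(energy), fps

A time series like this, sampled at the video frame rate, is the kind of signal that can then be compared against the simultaneously recorded brain activity.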

The analysis showed that brain areas and neural networks responsible for processing visual signals, including the right temporal lobe, were active in all test subjects. In all participants, this brain activity also synchronized with the visible movements of the signers, especially the hand movements. Matching the movement rate of around two hertz, delta brain waves in the frequency range between 0.5 and 2.5 hertz were primarily involved in this temporal synchronization. "Higher frequency bands such as theta, however, do not seem to play a relevant role in the production and processing of signs," says Rivolta's team. Also noteworthy: the synchronization in the delta band was stronger in the subjects who understood the sign language being used than in those without knowledge of it, especially in the right temporal lobe.
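The reported synchronization can be pictured as spectral coherence between the movement signal and the brain signal within the delta band. The following Python sketch shows how such a delta-band coherence value could be computed with scipy; the synthetic signals, sampling rate, and parameters are assumptions for illustration, not the study's actual MEG analysis.

# Hedged sketch: delta-band (0.5-2.5 Hz) coherence between a movement
# signal and a brain signal, using synthetic data in place of real MEG.
import numpy as np
from scipy.signal import coherence

fs = 100.0                      # assumed common sampling rate in hertz
t = np.arange(0, 60, 1 / fs)    # 60 seconds of synthetic data
rng = np.random.default_rng(0)

# Stand-ins: sign movements at ~2 Hz, and a noisy, phase-lagged
# "brain" trace that partially follows them.
movement = np.sin(2 * np.pi * 2.0 * t)
brain = 0.5 * np.sin(2 * np.pi * 2.0 * t + 0.8) + rng.standard_normal(t.size)

f, cxy = coherence(movement, brain, fs=fs, nperseg=1024)
delta = (f >= 0.5) & (f <= 2.5)
print(f"mean delta-band coherence: {cxy[delta].mean():.2f}")

In this toy setup, stronger coupling between the two signals yields higher coherence in the delta band, mirroring the study's finding that subjects who understood the sign language showed stronger synchronization.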

Better language skills, better neural processing

The findings show that our brain synchronizes not only with the rhythmic patterns of spoken language – a phenomenon already known from previous studies – but also with the movement patterns of sign language. Rivolta and her colleagues suspect that this temporal synchronization of neuronal signals facilitates language comprehension in both cases. The more familiar a person is with the spoken or signed language in question, the more finely the signals are processed in the brain's language centers and the stronger the synchronization with what is heard or seen, the team concludes.

Source: Chiara Rivolta (Basque Center for Cognition, Brain and Language) et al.; Proceedings of the National Academy of Sciences, doi: 10.1073/pnas.2512665122
