Language is a system of communication that goes beyond spoken words and can incorporate nonverbal cues such as gestures produced during speech. Co-speech hand gestures are spontaneous, natural hand movements that accompany what a speaker is saying. These gestures are not random; they are closely tied to the verbal message the speaker is trying to convey, and they can enhance comprehension by providing a visual aid for speech. Co-speech hand gestures are not the same as sign language, though: sign language is a linguistic system in its own right, whereas co-speech gesture is a complementary visual aid that supports comprehension. Research has shown that co-speech hand gestures play an important role in memory and learning, particularly during early development in children.
Dr. Wakefield and her team are investigating how bilingual children's language proficiency shapes their ability to process information conveyed through co-speech hand gestures. The study "Language Proficiency Impacts the Benefits of Co-Speech Gesture for Narrative Understanding Through a Visual Attention Mechanism" asks whether bilingual children show a greater memory benefit from these hand gestures, and whether gestures matter more when children process speech in their less proficient language. The researchers used eye-tracking technology to examine how bilingual children attended to gestures while listening to narratives. Dr. Wakefield hypothesized that children would remember more from the speech itself in their stronger language, because comprehension is easier there, but would remember and process more with the visual aid of hand gestures in their weaker language, where comprehension is harder. She also predicted that children would pay more attention to visual cues when processing speech in their weaker language, as a compensation mechanism for understanding the speech. The study additionally paired speech with both matching and mismatched gestures to test whether the match affected comprehension. The findings showed that the children relied more on gestures to understand and remember information when listening in their weaker language than in their stronger language: because they are less proficient, they use gestures to fill gaps in comprehension when a word in the less proficient language is missed.
In “Neural Correlates of the Processing of Co-speech Gestures,” Dr. Henning Holle and his research team investigate a related question, but they bring in different perspectives on gesture comprehension. The first is the action perspective, which involves the mirror neuron system. On this view, iconic gestures engage the brain’s mirror neuron system, which is involved in producing and understanding actions; if iconic gestures are processed in the same way as observed actions, then the mirror neuron system plays a role in gesture comprehension. The other perspective is the multimodal perspective, which holds that gesture comprehension is not simply action observation but instead involves multimodal integration, requiring coordination between auditory and visual processing. The study also distinguishes local from global integration. Local integration links a gesture to the specific word or idea it accompanies in the speech stream, while global integration unfolds over a longer time span and involves building up meaning from multiple words and gestures across the entire sentence.
In conclusion, both research articles highlight the important role of gestures in human interaction and communication. These studies show that gestures are integral to language processing, shaping how we interpret the meaning of what is said and how we interact with one another. The way the brain integrates gestures and speech underscores how deeply communication connects us. Language is not limited to words alone; it also includes bodily expression.
References:
Holle, H., Gunter, T. C., Rüschemeyer, S.-A., Hennenlotter, A., & Iacoboni, M. (2008). Neural correlates of the processing of co-speech gestures. NeuroImage, 39(4), 2010–2024. https://doi.org/10.1016/j.neuroimage.2007.10.055
Zielinski, N., & Wakefield, E. M. (2021). Language proficiency impacts the benefits of co-speech gesture for narrative understanding through a visual attention mechanism. Proceedings of the Annual Meeting of the Cognitive Science Society, 43(43). https://escholarship.org/uc/item/63r5d3qq