Friday, February 28, 2025

Increasing the Understanding of Co-Speech Gesture

The term “co-speech gesture” refers to gesturing while speaking as a means of better conveying ideas. This aspect of speech is receiving increasing attention as communication comes to be studied as a multimodal phenomenon. In “Language Proficiency Impacts the Benefits of Co-Speech Gesture for Narrative Understanding Through a Visual Attention Mechanism” by Dr. Natalia Zielinski and Dr. Elizabeth M. Wakefield (Zielinski & Wakefield 2021), the authors examine the effect of co-speech gesture on narrative comprehension and recall in bilingual children.

Participants were 17 Polish-English bilingual children between the ages of 6 and 8, most of whom were stronger in English than in Polish. The children were presented with the same story told in both English and Polish, accompanied by “matching” or “mismatching” gestures: matching gestures conveyed information relevant to the narrative, while mismatching gestures did not. Following the storytelling, the children were asked to recall what they could about what happened in the story. The data showed that matching gestures improved recall of the story told in the participants’ weaker language, but not in their stronger language, and this improvement was not seen with mismatching gestures (Zielinski & Wakefield 2021).

The paper “The connectivity signature of co-speech gesture integration: The superior temporal sulcus modulates connectivity between areas related to visual gesture and auditory speech processing” by Benjamin Straube et al. seeks to elucidate a neural mechanism for processing co-speech gesture (Straube et al. 2018). The study used dynamic causal modeling (DCM) for fMRI to explore connections among areas previously implicated in the processing of multisensory information: the middle temporal gyrus (MTG), the occipital cortex (OC), and the posterior superior temporal sulcus (STS). Twenty adult participants were shown videos of relevant gestures accompanied by German (GG) or Russian (GR) speech, or videos of the model speaking German sentences without gesture (SG). While undergoing fMRI, participants pressed a button to indicate whether the speech was object- or person-related. They were then given a paper-and-pencil test that required them to recognize sentences from the videos, and finally completed a self-report questionnaire on their own use of gesture and their behavior during the experiment. The collected data were used to produce a model of the connectivity between the MTG, the OC, and the STS (Straube et al. 2018).
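To make the modeling approach a little more concrete, the sketch below shows the general form of a bilinear DCM neural state equation for a hypothetical three-region network labeled OC, MTG, and STS. Everything here is an illustrative assumption for exposition: the connectivity values, the modulatory input standing in for a gesture condition, and the simple forward simulation are not the parameters or procedure reported by Straube et al., whose analysis fit and compared candidate models against measured fMRI time series.

```python
import numpy as np

# Hypothetical three-region bilinear DCM sketch: OC, MTG, STS.
# Neural state equation: dx/dt = (A + u_mod * B) @ x + C @ u_drive
# All parameter values below are illustrative, not estimates from the study.

regions = ["OC", "MTG", "STS"]

# A: intrinsic connectivity among regions, with self-inhibition on the diagonal.
A = np.array([
    [-0.5,  0.0,  0.0],   # OC
    [ 0.0, -0.5,  0.0],   # MTG
    [ 0.3,  0.3, -0.5],   # STS receives input from OC and MTG
])

# B: modulation of connectivity by an experimental condition (e.g., gesture
# present), here strengthening the OC->STS and MTG->STS connections.
B = np.array([
    [0.0, 0.0, 0.0],
    [0.0, 0.0, 0.0],
    [0.2, 0.2, 0.0],
])

# C: driving inputs (visual input to OC, auditory input to MTG).
C = np.array([
    [1.0, 0.0],
    [0.0, 1.0],
    [0.0, 0.0],
])

def simulate(T=10.0, dt=0.01):
    """Euler integration of the neural state equation (illustrative only)."""
    steps = int(T / dt)
    x = np.zeros(3)                      # neural states for OC, MTG, STS
    trace = np.zeros((steps, 3))
    for t in range(steps):
        time = t * dt
        on = 2.0 <= time <= 6.0          # stimulus window
        u_drive = np.array([1.0, 1.0]) if on else np.zeros(2)
        u_mod = 1.0 if on else 0.0       # gesture condition on/off
        dxdt = (A + u_mod * B) @ x + C @ u_drive
        x = x + dt * dxdt
        trace[t] = x
    return trace

if __name__ == "__main__":
    trace = simulate()
    for name, peak in zip(regions, trace.max(axis=0)):
        print(f"{name}: peak simulated activity = {peak:.3f}")
```

In an actual DCM analysis, several candidate models with different modulatory structures would be fit to the participants’ fMRI time series and compared, rather than simulated forward with fixed parameters as in this toy example.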

These two papers alone highlight a great deal about the function of and mechanisms behind co-speech gesture. Nonverbal communication is extremely powerful and a major aspect of how humans understand and relate to each other. A better understanding of co-speech gesture will lead to a better understanding of human communication as a whole. 


Zielinski, Natalia, and Elizabeth M. Wakefield. “Language Proficiency Impacts the Benefits of Co-Speech Gesture for Narrative Understanding through a Visual Attention Mechanism.” Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 43, no. 43, 2021, escholarship.org/uc/item/63r5d3qq. Accessed 1 Mar. 2025.


Straube, Benjamin, et al. “The Connectivity Signature of Co-Speech Gesture Integration: The Superior Temporal Sulcus Modulates Connectivity between Areas Related to Visual Gesture and Auditory Speech Processing.” NeuroImage, vol. 181, 2018, pp. 539–49, https://doi.org/10.1016/j.neuroimage.2018.07.037. Accessed 1 Mar. 2025.



 
