To make a gesture is to move part of your body, usually a hand, in order to express an idea or give meaning to something. People tend to gesture subconsciously while speaking, conveying emotions nonverbally or elaborating on ideas that are difficult to put into words. Audiences, in turn, watch these gestures to make better sense of what a speaker is trying to convey, especially when the message is hard to understand. Examples of gestures include pointing with your fingers, motioning with your hands, crossing your arms, and nodding your head. Although gestures already help with everyday conversation, recent studies suggest that gesturing while speaking matters more than it may seem.
The effectiveness of co-speech gesture in enhancing narrative comprehension was studied in “Language Proficiency Impacts the Benefits of Co-Speech Gesture for Narrative Understanding through a Visual Attention Mechanism” by Dr. Elizabeth Wakefield and researcher Natalia Zielinski. Specifically, they examined how language proficiency shapes the benefit bilingual children gain from co-speech gestures. By observing how Polish-English bilingual students attended to gestures made by actors who recited two stories in both Polish and English, and then testing the students’ recall of the stories in both languages, the researchers found that co-speech gestures did more to aid comprehension in the students’ weaker language than in their stronger one. Because the students were not as fluent in English as in Polish, they relied on the gestures the actors made during the English stories to fill in the parts they could not understand; this showed up in their recall, when students sometimes misremembered parts of a story as matching the actor’s gesture rather than what was actually said aloud.
On the flip side, in a related research article titled “Gesture in the eye of the beholder: An eye-tracking study on factors determining the attention for gestures produced by people with aphasia,” researchers Karin van Nispen, Kazuki Sekine, Ineke van der Meulen, and Basil C. Preisig studied how people with aphasia use hand gestures when speaking compared to non-brain-damaged speakers. Aphasia is a language disorder, caused by brain damage typically from a stroke, head injury, or tumor, that affects a person’s ability to understand or express speech. The researchers found that healthy observers paid more attention to the gestures made by speakers with aphasia than to those made by speakers without it, a difference attributed to the verbal production deficits that accompany the disorder. Essentially, the findings supported the idea that when speech is relatively difficult to comprehend, the listener focuses more on non-verbal gestures and the speaker produces more of them, so that the two can understand each other despite the obstacles in their verbal communication.
Though the studies by Zielinski and Wakefield and by van Nispen et al. focus on different aspects of language comprehension, both highlight the importance of gestures in bridging communication gaps. Whether in the context of multilingualism or of speech disorders, gestures serve as a necessary tool for enhancing understanding when verbal language alone is insufficient. These findings can be applied in the real world beyond individual conversations, especially as people’s attention spans continue to decline. Perhaps future research can examine how gestures improve engagement and focus in classrooms, particularly for younger students and those with learning differences. And with artificial intelligence being used more and more in school settings, developers could even apply these insights to improve online learning, speech therapy, and human-computer interaction. Ultimately, the ongoing research on gestures and language continues to improve verbal and non-verbal communication, fostering greater connection and inclusivity in society despite the presence of language barriers.
References
van Nispen, K., Sekine, K., van der Meulen, I., & Preisig, B. C. (2022). Gesture in the eye of the beholder: An eye-tracking study on factors determining the attention for gestures produced by people with aphasia. Neuropsychologia, 174. https://doi.org/10.1016/j.neuropsychologia.2022.108315
Zielinski, N., & Wakefield, E. M. (2021). Language proficiency impacts the benefits of co-speech gesture for narrative understanding through a visual attention mechanism. Proceedings of the Annual Meeting of the Cognitive Science Society, 43. https://escholarship.org/uc/item/63r5d3qq