Whether it is a mad scientist gesticulating wildly while explaining his latest experiment or a teacher pointing to a number while walking students through multiplication, hand gestures are everywhere in communication. Though often overlooked, these co-speech gestures do real work and can even bridge the gap between spoken language and comprehension.
In a study of bilingual children who spoke both Polish and English, with stronger proficiency in English, Zielinski and Wakefield determined that gesture and memory recall are interrelated. They set out to investigate the effects of co-speech gesture on communication, focusing on whether co-speech gesture is necessary for language comprehension depending on a listener's language ability. Children watched two videos describing the plots of two cartoons, accompanied by gestures that either matched what the speaker was saying exactly (a matching gesture) or required more interpretation from the viewer because they did not match the speech (a mismatched gesture). The researchers then measured the children's recall of specific story points, their concentrated eye movements, and how much their retellings differed from the original speaker's version.
The researchers found that co-speech gestures that matched the speech helped listeners comprehend when the speaker was presenting in the children's less proficient language. The same benefit did not appear when the speakers were presenting in English or using mismatched gestures.
Özçalışkan, Lucero, and Goldin-Meadow looked at differences in language and sentence structure between Turkish- and English-speaking children and analyzed the "silent gestures" that stand in for unspoken language. The children described actions such as "she runs into the house" in two ways: speaking while using co-speech gestures, and using gestures alone. The researchers found that the hand movements used while speaking differed between Turkish and English, which they attributed to the semantic differences in how the two languages package such events: English sentences typically follow a "figure-motion-ground" order, whereas Turkish sentences follow a "figure-ground-motion" order. Despite this difference, the silent gestures produced by the children were very similar across languages, which points to a consistent "language of gesture" that develops from an early age.
Zielinski and Wakefield also found that children paid more attention to the speaker's gestures when information was presented in their less proficient language than when it was presented in their more proficient one. Synthesizing these two bodies of work, one can hypothesize that because a shared "language of gesture" develops from an early age, it makes sense that children attend to gestures when listening to a language they are not proficient in: the gestures speak to them in a way they already understand. These findings open up many avenues of study in verbal and non-verbal communication across cultures, no matter how different those cultures might seem.
Sources:
Özçalışkan, Şeyda, et al. “What the Development of Gesture with and without Speech Can Tell Us about the Effect of Language on Thought.” Language and Cognition, 2023, pp. 1–22. doi:10.1017/langcog.2023.34.