Saturday, February 28, 2026

How Hand Gestures Strengthen Understanding and Persuasion



Humans have developed many ways to communicate. Whether through speech, body language, or writing, we have always strived for connection and interaction. One of our greatest inventions is language: it brings together all of these modes of communication, and we use it every day. It plays an extraordinary role in the life of every person on earth. Politicians use it to persuade, teachers use it to instruct, and healthcare professionals use it to save lives. It is a crucial instrument that we have implemented into every aspect of life. This post explores the interaction between verbal language and hand gestures and discusses its broader implications.


In a research paper titled “Language Proficiency Impacts the Benefits of Co-Speech Gesture for Narrative Understanding Through a Visual Attention Mechanism,” Natalia Zielinski and Elizabeth Wakefield study co-speech hand gestures with bilingual participants. Co-speech gestures are gestures produced alongside speech. The participants were children aged 6-8 who spoke English, with Polish as their second language. They watched videos of scripted stories in both English and Polish under three conditions for each language: speech only, gestures matching the speech, and gestures mismatching the speech. Eye tracking was used to analyze where participants' attention was focused during each trial, and the children were then asked to recall the story so comprehension could be measured. The main findings were that matching gestures significantly improved recall in the weaker language but did not significantly boost recall in the stronger language, and that mismatching gestures did not help and sometimes even hurt recall. These findings help explain why children benefit from gestures more than adults: as children develop language skills, gestures provide valuable visual support that lets them see another dimension of the information being presented to them.1
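The attention measure in a study like this reduces to simple bookkeeping: for each condition, count what fraction of eye-tracker samples landed on the storyteller's gestures versus their face. Here is a minimal Python sketch of that calculation; the region labels and sample counts are entirely hypothetical, not the study's data:

```python
from collections import Counter

def gaze_proportions(samples):
    """Fraction of eye-tracker samples falling on each region of interest."""
    counts = Counter(samples)
    total = len(samples)
    return {region: counts[region] / total for region in ("gesture", "face", "other")}

# Hypothetical gaze samples for one child in each language condition.
weaker_lang = ["gesture"] * 46 + ["face"] * 40 + ["other"] * 14
stronger_lang = ["gesture"] * 22 + ["face"] * 65 + ["other"] * 13

print(gaze_proportions(weaker_lang)["gesture"])    # 0.46
print(gaze_proportions(stronger_lang)["gesture"])  # 0.22
```

With numbers like these, the weaker-language condition shows roughly twice the gaze time on gesture, mirroring the attentional shift the study reports.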


Gestures can be useful for more than just comprehension. A news article titled “Hand gestures that illustrate speech boost persuasiveness, study shows” covers a study from the Sauder School of Business showing that gesturing along with your speech can make your message more persuasive.2 In this study, Giovanni Luca Cascio Rizzo, Jonah Berger, and Mi Zhou used multiple methods to objectively measure and classify different types of hand gestures at scale: automated video analysis of more than 2,100 TED Talks, a large multimodal AI model analyzing roughly 200,000 video segments, and preregistered controlled experiments. The findings suggest that more gesturing contributes to greater persuasion. In the video analysis, talks with more gesturing received up to 5% more likes and positive comments, and this pattern persisted when the researchers controlled for topic, length, and other confounding variables. The researchers also identified different types of hand movements that differed in their persuasive effectiveness. The most effective type is illustrators, which visually represent exactly what is being said (comparable to the matching co-speech gestures in the bilingual comprehension study); for example, a presenter describing a sales increase might trace a rising line with their hand. The next type is highlighters, which emphasize a point with more general movements, such as spreading the fingers and pushing the hands up to stress a point. The final type is unrelated movements with no communicative purpose, like fidgeting or hand motions that do not support the speech (comparable to the mismatched gestures in the bilingual comprehension study). So why do illustrators work? Illustrators make content easier to understand by supporting speech with a direct, digestible visual aid; as the audience understands better, the presenter appears more competent and credible; and that perceived credibility in turn increases the speaker's persuasiveness.3


These two articles show that co-speech gestures can be a crucial part of communication. They support comprehension of speech in a weaker or still-developing language, as the bilingual co-speech gesture research shows, and they increase persuasion, as the study of illustrative hand movements shows. This information has many implications. One example is its use in classrooms: if teachers are having trouble reaching their students, especially bilingual ones, it is worth pairing the material with more matching gestures. Likewise, in healthcare, physicians often work with patients whose first language differs from their own, and illustrative co-speech gestures may greatly benefit the patient's understanding. Whether it is politicians persuading, teachers instructing, or clinicians saving lives, communication is a necessity, and co-speech gestures should be used to support it.


References:

  1. Zielinski N, Wakefield EM. Language Proficiency Impacts the Benefits of Co-Speech Gesture for Narrative Understanding Through a Visual Attention Mechanism. Proceedings of the Annual Meeting of the Cognitive Science Society. Published online 2021.

  2. Leslie T. Hand gestures that illustrate speech boost persuasiveness, study shows. Phys.org. 2025. Accessed March 1, 2026. https://phys.org/news/2025-11-gestures-speech-boost-persuasiveness.html.

3. Cascio Rizzo GL, Berger J, Zhou M. Talking with your hands: How hand gestures influence communication. Journal of Marketing Research. Published online September 25, 2025. doi:10.1177/00222437251385922

Auditory processing as a building block to diagnose CTE

     Average viewership on an NFL Sunday is around 23.5 million. These viewers watch more than 2,000 professional football players, and hidden among that large number is a brain disease that is effectively invisible to everyone. Chronic Traumatic Encephalopathy (CTE) is a neurodegenerative disease associated with a large number of repeated head impacts (e.g., tackles in the NFL); it produces varying degrees of cognitive impairment and behavioral change and, later on, motor issues and dementia. With no current way to diagnose CTE antemortem, researchers are steadily working toward that goal. Dr. Krizman and other research groups have focused on new techniques and steps that could help achieve a diagnosis of CTE before death.

    I was fortunate enough to listen to Dr. Krizman discuss the important research in "Auditory biological marker of concussion in children." This paper shows how auditory processing, which can be disrupted by concussions, can be used to identify concussion occurrence and severity. The study compared concussed and non-concussed athletes by measuring their auditory processing with a technique known as the frequency following response (FFR). The FFR acts as a biological marker by measuring how accurately and quickly the brain encodes sound, specifically the fundamental frequency (F0), the acoustic cue responsible for the perception of pitch that lets us pick out individual speakers in noisy environments. In an experiment comparing FFR results between 20 children diagnosed with a concussion and 20 healthy children, the concussed children showed approximately 35% smaller neural responses to the F0 than the healthy control group. Although the subjects of this experiment were children, the procedure is uniform across ages, making it likely to produce similar results in adults. From these consistent results, it can be concluded that concussions affect the brain's ability to encode information at the subcortical level, which could potentially guide injury identification, injury prevention, and injury recovery.
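The headline comparison amounts to a difference in mean F0 response amplitude between the two groups. Below is a small Python sketch with fabricated amplitudes chosen only to mirror the reported ~35% reduction (n = 20 per group, as in the study; the values themselves are hypothetical):

```python
from statistics import mean, stdev
import math

# Hypothetical F0 response amplitudes (arbitrary units), 20 children per group.
concussed = [0.061, 0.058, 0.064, 0.055, 0.060, 0.063, 0.057, 0.059,
             0.062, 0.056, 0.060, 0.058, 0.061, 0.057, 0.063, 0.059,
             0.060, 0.056, 0.062, 0.058]
control = [0.093, 0.090, 0.095, 0.088, 0.092, 0.094, 0.089, 0.091,
           0.093, 0.087, 0.092, 0.090, 0.094, 0.089, 0.095, 0.091,
           0.092, 0.088, 0.093, 0.090]

reduction = 1 - mean(concussed) / mean(control)  # fractional F0 deficit

# Welch's t statistic (unequal-variance two-sample comparison)
t = (mean(control) - mean(concussed)) / math.sqrt(
    stdev(control) ** 2 / len(control) + stdev(concussed) ** 2 / len(concussed))

print(round(reduction, 2))  # ~0.35
```

A large t value on data like this simply says the group gap is far bigger than the within-group spread, which is what makes the F0 deficit usable as a marker.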

    While Dr. Krizman's study highlights the immediate neural disruption following a concussion using the FFR, broader research into repetitive head impacts suggests that even subconcussive hits (e.g., a routine NFL tackle) can lead to a greater accumulation of phosphorylated tau proteins. Triggered by repeated trauma to the head, these proteins form aggregates (also found in the neurodegenerative diseases Alzheimer's and Parkinson's) that destroy neurons, causing cognitive, mood, and behavioral issues. This protein is the hallmark of CTE and is known to accumulate within subcortical regions and the sulci of the cerebral cortex, areas that are functionally linked to the very auditory midbrain pathways the FFR measures. Because the FFR provides a high-fidelity readout of axonal integrity and timing, it may serve as a critical "functional biopsy" of the brain. If tau pathology disrupts the microsecond-level precision required for the brain to encode the F0, the FFR could detect these degenerative changes years before behavioral symptoms of dementia or motor impairment appear.

    The objectivity of the FFR is very valuable for keeping athletes safe. Unlike subjective symptom checklists, which are prone to "sandbagging," the FFR cannot be faked or influenced by an athlete's desire to return to play. By establishing a baseline FFR for every athlete at the start of their career, doctors and researchers could monitor neural responses after every game or injury, giving them earlier warning of CTE and other neurodegenerative problems. The leap from identifying acute concussions to tracking chronic degeneration would represent a significant advance toward a definitive, living diagnosis of CTE, something not currently possible.


References

LeClair J, Weuve J, Fox MP, Mez J, Alosco ML, Nowinski C, McKee A, Tripodis Y. Relationship Between Level of American Football Playing and Diagnosis of Chronic Traumatic Encephalopathy in a Selection Bias Analysis. https://pmc.ncbi.nlm.nih.gov/articles/PMC9989358/


Kraus N, Thompson EC, Krizman J, Cook K, White-Schwoch T, LaBella CR. Auditory biological marker of concussion in children.

Benefits of Co-Speech Gesture to Language Comprehension

     In her recent talk, Elizabeth Wakefield detailed her findings on the ways co-speech gesture can contribute to better understanding and processing of language. She explained that people often produce hand gestures to go along with their speech, sometimes without even realizing it. These gestures tend to be linked to our words and their meanings and are often used to better convey a message. Through her research, Wakefield found that children were more receptive to these co-speech gestures than adults. To reach this conclusion, she and Natalia Zielinski conducted a study with children who spoke both English and Polish and tested their comprehension of stories told in both languages. When the children heard stories in Polish, determined to be their weaker language, they were more likely to look at the gestures made by the storyteller in order to understand what they were being told. When the children heard a story in English, however, their attention was focused more on the speaker's face than on the gestures. This finding points toward the idea that gestures supplement a weaker understanding of language, which could also be why children tend to look at gesture more than adults: their understanding of language is less developed, so gathering visual information helps them better process what they hear.

     Similarly, a study performed by Karin van Nispen et al. found that co-speech gestures were important for the comprehension of language, especially for individuals with some form of aphasia. For this study, participants with no speech disorders were shown videos of either a person with aphasia or a person without aphasia telling a story while gesturing along with their speech. When the storyteller had a speech disorder, the listener paid more attention to the storyteller's movements and gestures than when the storyteller had fluent speech. When the verbal information became more challenging to understand, observers placed more of their attention on the non-verbal gestures to gain information about the story they were listening to.

    The work done by Karin van Nispen et al. provides further evidence that co-speech gesture improves language comprehension and builds on the results of Wakefield and Zielinski. A listener experiences deficits in comprehension when presented with unfamiliar speech, whether because it is a second language or because of a speech disorder. While Wakefield found that children were more likely to attend to gestures while listening to unfamiliar speech, the data collected by van Nispen et al. show that adults can also benefit from gesture. An adult listening to a speaker with aphasia may not be able to fully comprehend the speech, whereas they would understand much more easily when listening to a speaker with no speech disorder. This parallels the way a child might pay more attention to gesture: their comprehension of language is not as fully developed as an adult's, and gesture gives them more context for what is being said. Overall, observing co-speech gestures can help us better understand language, especially when it is unfamiliar to us.


References

Van Nispen, K., Sekine, K., Van der Meulen, I., & Preisig, B. C. (2022). Gesture in the eye of the beholder: An eye-tracking study on factors determining the attention for gestures produced by people with aphasia. Neuropsychologia, 174, 108315. https://doi.org/10.1016/j.neuropsychologia.2022.108315

Zielinski, N., Wakefield, E.M., 2021. Language proficiency impacts the benefits of Cospeech gesture for narrative understanding through a visual attention mechanism. Proceedings of the Annual Meeting of the Cognitive Science Society 43 (43). https://escholarship.org/uc/item/63r5d3qq



Auditory Processing Serving as a Powerful Biological Marker for Neurological Health

Today, people rely on their senses to experience the beauty of nature and life. The sense of hearing is essential for listening to music, to people, and to the world around us, and for making lasting memories. But sound is more than just hearing; it offers a window into the brain's overall health and function. What if auditory processing could be used to assess neurological health, whether a disruption is developmental or injury-related?
 
Not long ago, I had the opportunity to listen to Dr. Krizman's research presentation on concussions and their relevance to auditory processing. Dr. Krizman's focus was on whether brain responses to sound can identify concussions in children. This was done by studying the fundamental frequency (F0), also known as the pitch cue, using the frequency following response (FFR) to see how auditory processing changes after a concussion. The first finding concerned the effect of concussions on neural timing. A concussion reduces the coordination of firing neurons, which is essential to sound encoding. When this happens, brain responses become less synchronous, signals are weaker and delayed, and the representation of sound is degraded. The FFR was then measured to see how the auditory brainstem encodes sound. It showed measurable changes: a weakened response to the F0 pitch, a reduced overall response, and less accurate tracking of the sound. These findings indicate that concussions affect the brain's ability to encode information at the subcortical level, which could potentially serve as a marker for identifying injury and recovery.
 
A research study I came across, "Neural coding of formant-exaggerated speech and nonspeech in children with and without autism spectrum disorders" by Chen et al., discussed how children with Autism Spectrum Disorder (ASD) process speech sounds at the neural level. The researchers examined typically developing (TD) children and children with ASD, using the frequency following response (FFR) to measure how the auditory system encodes sound. The first finding was that children with ASD lacked the automatic neural enhancement for exaggerated speech seen in TD children. In TD children, exaggerated speech evoked a stronger neural response and more accurate brainstem tracking, whereas in children with ASD the neural response did not significantly increase and the brainstem enhancement was absent. This likely reflects cortical influence: normally, the cortex enhances brainstem responses to meaningful speech. In children with ASD, the lack of enhancement suggests that speech may not be prioritized neurologically in the same way as in TD children, which could ultimately have a downstream impact on language development. From these neural differences, the research suggests early sensory contributions to language and communication differences in autism.
 
Although Dr. Krizman's study concerned an injury and Chen et al.'s was developmental, both show how sensitive the auditory system is to brain disruption, leaving speech processing vulnerable. In children with concussions, delayed neural timing in the brainstem affected the encoding of speech; in children with ASD, exaggerated speech failed to produce the neural enhancement seen in TD children because of altered cortical influence. This comparison of neurodevelopment to neurotrauma, and their effects on different areas of the brain, carries real implications: findings like these in children could potentially be used to catch early indications of neurological health.
 
Kraus, N., Thompson, E. C., Krizman, J., Cook, K., White-Schwoch, T., & LaBella, C. R. (2016). Auditory biological marker of concussion in children. Scientific Reports, 6(1). https://doi.org/10.1038/srep39009

Chen, F., Zhang, H., Ding, H., Wang, S., Peng, G., & Zhang, Y. (2021). Neural coding of formant-exaggerated speech and nonspeech in children with and without autism spectrum disorders. Autism Research, 14(7), 1357–1374. https://doi.org/10.1002/aur.2509


Hand Gestures Bridge the Gap for Better Communication


The use of hand gestures is more common, and more subconscious, than we think. Hand gestures are a highly developed communicative aid that helps individuals understand a subject or topic even without the necessary language. As an essential part of human communication, hand gestures provide a crucial way to convey messages and establish connection and rapport [1]. During a Neuroscience lecture, I had the opportunity to listen to Elizabeth M. Wakefield discuss her research, “Language Proficiency Impacts the Benefits of Co-Speech Gesture for Narrative Understanding Through a Visual Attention Mechanism.” This research examines hand gestures as a compensatory mechanism that helps children better understand a topic or story, studying bilingual children trying to understand a weaker language with the help of hand gestures [2]. The study gave 17 children a specific task: to watch videos of different stories in their respective languages, in trials with matched and mismatched gestures. Matched (redundant and parallel) gestures aligned with the speech, allowing for a fluid understanding of the storyline; in the mismatched condition, the children were presented with wrong, contradictory gestures. To analyze how the children comprehended the different stories, eye-tracking technology monitored and measured their eye movements as they followed the hand movements. Listening to this study, I found it interesting that children comprehended the story better in their weaker, foreign language when it was paired with rich, matching hand gestures. The eye tracking showed that when listening to their weaker language, children paid more attention to visual cues and hand gestures, their gaze following the narrator's hands, compared with listening in their native language.

I thought these results were very interesting, especially how hand gestures bridge the gap between language and comprehension. Beyond this, I was also interested in whether hand gestures can aid understanding in more quantitative STEM classes, especially those without abundant storytelling, like math.
Through this, I came across a study that examines the correlation between math and hand gestures and how gesture interventions help children better understand math concepts. The study “Helping hands: Encouraging parent-child gestures during informal mathematics activities” investigates how two interventions affect children's ability to understand and comprehend. It assigns parents to two conditions: a gesture condition and a talk condition, which serve as the experimental and control groups, respectively. In the gesture condition, parents are taught to gesture while explaining math-related analogies and conversations, whereas the talk condition places greater emphasis on talking alone [3]. The data, including the proportion of hand gestures and the level of math talk, were analyzed in two ways: with standard frequentist statistics comparing gesture use between conditions, and with complementary Bayesian analyses that quantify the strength of the evidence, allowing the researchers to be more confident in their conclusions.
In conclusion, from the seminar talk and the research papers, I believe that hand gestures are crucial for bridging the gap between language and communication, and that their benefits extend to more intricate, quantitative tasks, allowing for a better understanding of the topic. Reading such studies makes me wonder whether we could use specific gestures to improve children's math and science achievement.

Citations 

[1] Atwell, N. (2025, September 1). Why are hand gestures important in communication? - Complete guide | WordSCR. Word SCR. https://wordscr.com/why-are-hand-gestures-important-in-communication/


[2] Zielinski, N., & Wakefield, E. M. (2021). Language Proficiency Impacts the Benefits of Co-Speech Gesture for Narrative Understanding Through a Visual Attention Mechanism. 2101-2107. Paper presented at the 43rd Annual Meeting of the Cognitive Science Society: Comparative Cognition: Animal Minds, CogSci 2021, Virtual, Online, Austria.


[3] Barkin, R., Grose, G. E., Krishnasamy, N., Scalise, N. R., & Ramani, G. B. (2025). Helping hands: Encouraging parent-child gestures during informal mathematics activities. Journal of Applied Developmental Psychology, 99, 101827. https://doi.org/10.1016/j.appdev.2025.101827


Concussions: Fundamental Frequency, Biomarkers, and Point-of-Care Testing

              Millions of concussions occur annually in the United States. They range in severity, with impacts extending from changes in neurologic function to socioeconomic wellbeing, and the physical and neural effects sometimes remain long after recovery (Kraus et al., 2016).

During her talk at Loyola, Dr. Krizman discussed her own research on concussions in athletes and children. She highlighted the fact that there is no reliable test that can accurately identify a concussion and the range of its severity. Through her research, she has found that the auditory system could offer a solution for diagnosing concussions.

Specifically, Dr. Krizman's team focuses on the fundamental frequency of sound (F0), the main cue for everyday listening (Kraus et al., 2016). The team tested this by playing a sound with a 100 Hz fundamental frequency and harmonics from 200 to 1000 Hz. Children with concussions showed weaker neural encoding of the F0 than non-concussed children, while both groups responded the same way to the harmonics, which demonstrates that the F0 is the target frequency for diagnosing concussions.
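The kind of measurement described here, picking the F0 component out of a brain response, can be illustrated with a single-frequency DFT. Below is a minimal sketch in plain Python on synthetic signals; the sampling rate, amplitudes, and the simulated "responses" are all invented for illustration (a real FFR analysis works on averaged scalp-recorded waveforms):

```python
import math

def band_amplitude(signal, fs, freq):
    """Amplitude of `signal` at `freq` Hz via a single-bin DFT."""
    n = len(signal)
    re = sum(s * math.cos(2 * math.pi * freq * i / fs) for i, s in enumerate(signal))
    im = sum(s * math.sin(2 * math.pi * freq * i / fs) for i, s in enumerate(signal))
    return 2 * math.hypot(re, im) / n

fs = 8000                        # sampling rate in Hz (illustrative)
t = [i / fs for i in range(fs)]  # one second of samples

# Simulated "healthy" response: full-strength 100 Hz F0 plus a 200 Hz harmonic.
healthy = [math.sin(2 * math.pi * 100 * x) + 0.5 * math.sin(2 * math.pi * 200 * x)
           for x in t]
# Simulated "concussed" response: F0 component reduced, harmonic left intact.
concussed = [0.35 * math.sin(2 * math.pi * 100 * x) + 0.5 * math.sin(2 * math.pi * 200 * x)
             for x in t]

print(round(band_amplitude(concussed, fs, 100) / band_amplitude(healthy, fs, 100), 2))  # 0.35
print(round(band_amplitude(concussed, fs, 200) / band_amplitude(healthy, fs, 200), 2))  # 1.0
```

The F0 ratio drops while the harmonic ratio stays at 1.0, the same dissociation used to argue that the F0 is the diagnostic target.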

Through her research, Dr. Krizman concludes that children with concussions have worse neural processing than children without concussions. This finding is based on the neural processing of the F0, which also relates to the severity of the concussion (Kraus et al., 2016).

However, while this study had reliable results, it is important to note that it was published nearly a decade ago, in 2016. Between 2016 and now, there may have been advancements in the technology for diagnosing concussions and their severity. In 2025, Kurup et al. published a review of concussion biomarkers and point-of-care testing.

In this paper, Kurup et al. (2025) reviewed concussion biomarkers and point-of-care testing from the years 2022-2023 using the National Institutes of Health database. The three main categories of diagnosis were neuroimaging, neurologic screening tools, and molecular and protein biomarkers.

Neuroimaging is known for identifying brain damage and diagnosing its severity (Kurup et al., 2025). The most common neuroimaging is done with computed tomography and magnetic resonance imaging. However, since traumatic brain injury severity spans a wide spectrum, it can be difficult to translate the collected data into clinical care.

Neurologic screening tools are tests and assessments like the Concussion Challenge Assessment and the Sport Concussion Assessment Tool 3. They provide broad results but fall short at identifying specific concussion symptoms. Neurologic assessments are best used in multifactorial models, paired with the other categories of diagnosis, to produce a more accurate outcome prediction for patients (Kurup et al., 2025).

Lastly, molecular and protein biomarkers are useful in point-of-care testing because they provide evidence of injury during specific detection windows. Biomarkers can be sampled from blood, saliva, and cerebrospinal fluid, offering insight into traumatic brain injuries at the molecular level rather than relying only on symptoms (Kurup et al., 2025). Some protein biomarkers, such as glial and axonal markers, relate to the blood-brain barrier and can indicate multiple concussions or other injury processes such as necrosis. A noted limitation, however, is that each patient has distinctive symptoms, and biomarkers can present differently and mean different things in different individuals (Kurup et al., 2025).

Together, the important conclusion from Krizman et al.'s (2016) F0 findings and Kurup et al.'s (2025) review of concussion biomarkers and point-of-care testing is that the most accurate and useful concussion evaluation is likely a multimodal, paired model. Krizman's measure shows how an individual encodes the fundamental frequency and how that encoding changes with a concussion. Kurup's review of neuroimaging, assessments, and biomarkers shows how to gather evidence of injury and how best to design point-of-care testing. By combining both approaches, concussion diagnosis and recovery decisions could be optimized with molecular data on diagnosis, injury severity, and patient treatment.

 

References

Kraus, N., Thompson, E., Krizman, J. et al. Auditory biological marker of concussion in children. Sci Rep 6, 39009 (2016). https://doi.org/10.1038/srep39009

Kurup MJ, Agrawal A, Temple SR, Galwankar S. Updated Review of Neurologic Concussion Biomarkers for Time-sensitive Point-of-care Testing. J Emerg Trauma Shock. 2025 Apr-Jun;18(2):74-89. doi: 10.4103/jets.jets_76_24. Epub 2025 Jun 19. PMID: 40666393; PMCID: PMC12258534.


Friday, February 27, 2026

Gesture's Role in Creating Speech and Enhancing How We Process and Describe Information

 Recently I had the opportunity to listen to Dr. Elizabeth Wakefield present her findings on visual attention, gestures, and learning outcomes. She concluded that children gain greater learning benefits from combined speech and gesture cues than adults with established linguistic abilities do. Gestures incorporated into speech direct attention to specific tasks and deepen understanding of the speaker's intended goal. This demonstrates the importance of multimodal teaching techniques for young learners, as gesture maintains and directs attention. It is an important insight into how gesture impacts visual attention and learning, but it raises an interesting question about gesture's role in speech and language production in addition to learning.

    Goldin-Meadow and Alibali's (2013) paper on gesture's role in speaking, learning, and creating language discusses how gesture contributes to communication, the production of speech, and the organization of thought. They contend that when speakers are asked to communicate a dense or difficult topic, they often produce more gestures as a way to organize and clearly convey their thoughts (Goldin-Meadow and Alibali 2.1). This is consistent with Wakefield's finding that gesture is not simply an addition to speech but also helps enhance learning and recognition. Goldin-Meadow and Alibali examined the Information Packaging Hypothesis to test the relationship between gesture and speech. In one study, participants were asked to describe shapes in an array of dots, ranging from easily outlined to difficult to spot, which manipulated how accessible the desired information was. Participants who had to describe information packaged in a more difficult way used more gestures to identify the needed information. Similarly, Wakefield found that children were more accurately able to describe problem-solving techniques when shown how to solve the problems with gestures (Wakefield 7). In these instances, gesture is used as a tool to access information presented in a difficult form and to enhance learning. As a speaker, conveying ideas effectively matters to the listener, and the listener benefits greatly from visual cues that connect the speech to those ideas.

    I felt that Goldin-Meadow and Alibali's work added to Wakefield's findings in a meaningful way that explains the relationship between gesture and speech. Wakefield's research focuses on how gesture's direction of visual attention aids learning, while Goldin-Meadow and Alibali explain how gesture allows for clearer organization and communication of thought. Gesture is a helpful organizational tool that supplements speech and should be encouraged; it can be mutually beneficial for speaker and listener alike, allowing a deeper understanding of what is being said. Both lines of work suggest that gesture is a valuable tool in cognition. Since gesture supports visual attention and the organization of thought, academic practice would likely benefit from meaningfully integrating gesture into the curriculum: gestures would likely assist students and professors when teaching or learning difficult topics. Gesture has proven to be an important tool, and it is fascinating to explore its impact on cognition.

Zielinski, N., & Wakefield, E. M. (2021). Language Proficiency Impacts the Benefits of Co-Speech Gesture for Narrative Understanding Through a Visual Attention Mechanism. Proceedings of the Annual Meeting of the Cognitive Science Society, 43(43). https://escholarship.org/uc/item/63r5d3qq

Goldin-Meadow, S., & Alibali, M. W. (2013). Gesture's Role in Speaking, Learning, and Creating Language. Annual Review of Psychology, 64(1), 257–283. https://doi.org/10.1146/annurev-psych-113011-143802