Thursday, February 26, 2026

TBIs: Shaking up Executive Function and Auditory Processing

We learn more every year about traumatic brain injuries and their long-lasting aftereffects. Dr. Krizman, in her presentation a couple of weeks ago, highlighted the fact that the frequency-following response (FFR) shows a distinct neural signature in young children with concussions. Further, her team found that these children's brains process the pitch of people's voices more slowly than the brains of children without traumatic brain injuries. Her research was a powerful demonstration of the lasting impact of traumatic brain injuries on auditory processing. I was interested in this research because we often study how TBI can impact cognitive function in the long term, but seeing it from an auditory perspective was fascinating.

Following this, a paper by Theodoroff et al. (2022) builds on that work, showing that even though concussions and other TBIs have lasting auditory effects, clinical practice has not caught up. Standard concussion screens include no section that tests for auditory symptoms such as tinnitus, noise sensitivity, or hearing difficulties. A new paper from the TRACK-TBI study, published in the Journal of Head Trauma Rehabilitation, continues Krizman's line of work by examining the long-term effects of untreated auditory symptoms following TBIs. Armstrong and colleagues ask: What are the long-term consequences of ignoring auditory symptoms after a TBI, and how can this affect the patient's cognitive function?

Taken together, Krizman's paper shows that the brainstem's response to sound is disrupted after a concussion and differs from that of people without concussions. Theodoroff and their team highlight the gap between these findings and the lack of clinical attention given to the auditory symptoms of TBIs. Finally, Armstrong shows how those symptoms actively affect cognition in the long term.

The TRACK-TBI study enrolled nearly 2,700 participants with various TBIs from 18 Level I trauma centers across the country; of these, the researchers analyzed data from 1,267 participants. Two weeks post-injury, participants were asked, "Since your injury, has your hearing been worse in either ear?" and approximately 17% responded yes. Six months post-injury, they were given a range of cognitive tests, including the Trail Making Test (TMT) for executive function and the WAIS-IV Processing Speed Index for processing speed.

After analyzing the data, the researchers found that participants who reported TBI-related hearing impairment at the two-week mark had significantly worse executive function at six months than those without hearing impairment. Those with the most severe presentation, hearing impairment in both ears together with intracranial pathology, showed the strongest association between hearing impairment and executive function deficits.

Although a link was established between TBI-related hearing impairment and executive function problems, no link was found with processing speed. This distinction is important because it indicates that the sensitivity is specific to higher-level cognitive functions, such as the ability to switch between tasks or to inhibit distractions.

         It was interesting to read this work and compare it with Dr. Krizman's, because her work shows that the brainstem's encoding of sound, as measured by the FFR, is damaged after a brain injury. Damage to this system makes it more difficult to pick out a single sound in a noisy environment, meaning the cortex has to work much harder to differentiate frequencies. The Armstrong paper complements this by showing that many people who experienced auditory symptoms are still experiencing executive function issues six months later. In both papers, degraded sensory input makes it more difficult for the brain to carry out higher-order cognitive tasks, such as executive function. Overall, more research is needed on the long-term effects of concussions on executive function, and more extensive auditory screening after TBI is needed to alleviate symptoms and develop better treatment protocols.

 

The Impact of Gestures on Learning and Comprehension of a Language

     I had the pleasure of listening to Dr. Elizabeth Wakefield talk about her research in a recent seminar, specifically the work in her paper with Dr. Natalia Zielinski, “Language Proficiency Impacts the Benefits of Co-Speech Gesture for Narrative Understanding Through a Visual Attention Mechanism.” In the talk, she described how language and gesture are linked. When people talk, they naturally use hand gestures; these hand and body movements, known as co-speech gestures, can help listeners understand. One example from the talk involved describing someone putting on jewelry. In the first video, the speaker simply said that the person was putting on jewelry; in the second, the speaker made the motion of putting a ring on a finger while saying the same sentence. The accompanying motion gives the listener more context, answering which piece of jewelry was being put on without the word "ring" ever being spoken.

    In the paper mentioned above, the authors studied Polish-English bilingual children, testing language proficiency to see whether there is a link between gestures and narrative understanding. Stories were told in both languages, some with gestures and some without. They found that gestures benefited comprehension in the children's weaker language (Polish): when the children knew the language less well, they relied more on the gestures, which helped them understand the story. In English, adding gestures to the narratives had a slight negative effect, likely because the children focused too much on the gestures and not enough on the story itself. In Polish, stories told without gestures were remembered worst of all the tested conditions. The study focused on children because of their lower language proficiency. Overall, the study ties language proficiency to how much listeners rely on gestures to comprehend a narrative.

    In a study by Xiaoyi Huang, Nayoung Kim, and Kiel Christianson, the researchers paired gestures with learning a foreign language. Participants were taught Mandarin words, each paired with a gesture. Some gestures were iconic, such as miming talking on the phone or drinking, while others were arbitrary, with no prior connection to the word. The thirty participants had no previous experience with Mandarin or any other Chinese language. Eighteen words were presented: six accompanied by iconic gestures, six by arbitrary gestures, and six with no gesture. After two sessions, the students took a multiple-choice test in which the instructor spoke each word along with its associated gesture. Students scored eight to ten percent better on words learned with gestures than on words learned without. According to researcher Kiel Christianson, “visualizing a gesture with each word creates multiple pathways into the semantics of new words and helps students remember them better.” The benefit fell off after about twelve words, beyond which students could not learn and retain all the word-gesture pairs. Overall, it is helpful for students learning a new language when the teacher uses hand gestures.


References

Huang, X., Kim, N., & Christianson, K. (2019). Gesture and vocabulary learning in a second language. Language Learning, 69(1), 177–197. https://doi.org/10.1111/lang.12326

Zielinski, N., & Wakefield, E. M. (2021). Language proficiency impacts the benefits of co-speech gesture for narrative understanding through a visual attention mechanism. Proceedings of the Annual Meeting of the Cognitive Science Society, 43. https://escholarship.org/uc/item/63r5d3qq

Tuesday, February 24, 2026

The role of auditory processing in concussions, ADHD and APD

         One of the most common brain injuries is the concussion, or mild traumatic brain injury (mTBI). Concussions can result from direct impact, such as a fall or a sports collision, or from indirect force, such as whiplash, where the head snaps forward or back. Despite widespread awareness of concussions, they tend to be difficult to diagnose: they do not show up on CT or MRI scans; they lack specific symptoms and can be mistaken for symptoms of other conditions; and diagnosis relies on subjective self-report, which may be unreliable or biased. It is imperative that concussions be diagnosed, and that an objective test be designed, because of their potential to impact cognition, neurologic function, socio-emotional wellbeing, and academic achievement [1]. Without a diagnosis, concussions may go unnoticed and thus untreated.
     I recently had the pleasure of listening to Dr. Krizman speak during a neuroscience seminar on her phenomenal paper, “Auditory biological marker of concussion in children.” The paper discusses how auditory processing, which can be disrupted by concussion, can be used to identify concussion occurrence and severity [1]. The study followed concussed and non-concussed child athletes and measured auditory processing using the frequency-following response (FFR). The FFR captures the fundamental frequency, harmonics, amplitude (intensity), timing, and the response in noise. Encoding of the fundamental frequency (which helps distinguish who is talking) and harmonics (which help distinguish words) was found to be diminished in concussed patients. The study also followed some of the concussed children from Lurie Children’s Hospital to observe recovery and found evidence of partial recovery. The FFR provides clues about biological processes disrupted after a concussion; applying these findings allowed the researchers to identify a signature neural profile and opens clinical potential for diagnosing and tracking concussions in patients.
      Dr. Krizman’s talk made me aware of just how expansive auditory processing is. It plays roles in language, reading, filtering, perception, and more. For those reasons, I wanted to learn more about its role in other conditions. While searching, I came across an article about attention deficit hyperactivity disorder (ADHD), auditory processing disorder (APD), and the link between them [2]. ADHD and APD can both affect attention, which contributes to misdiagnosis of both conditions. But why are they mistaken for each other? The article discusses a 2017 study [3], “Auditory Processing Assessment in Children with Attention Deficit Hyperactivity Disorder,” in which children with ADHD performed worse on auditory processing tests. However, when treated with methylphenidate (better known by brand names such as Ritalin or Concerta), children with ADHD performed as well as children without ADHD [2, 3]. Symptoms of auditory processing difficulty can therefore be associated with ADHD, and conversely, APD symptoms can be misdiagnosed as attention deficits. Both conditions require careful evaluation.
     From both the seminar talk and the Medical News Today article, I have begun to realize the broader implications of auditory processing (listening behavior) as a direct reflection of neurological conditions in the brain, rather than something involved only in hearing. Auditory processing involves auditory memory, attention, distinguishing sounds, filtering background noise, and localizing sound. It spans many neural systems and thus can be implicated in a variety of conditions. If multiple conditions exhibit similar auditory symptoms, clinicians face the challenge of diagnosing them accurately. Dr. Krizman’s study [1] introduces a biological marker for concussions, which are notoriously difficult to diagnose. It makes me wonder how an auditory biological marker could be applied to other conditions with auditory symptoms, like those mentioned in the Medical News Today article.
 
References:
1. Kraus, N., Thompson, E. C., Krizman, J., Cook, K., White-Schwoch, T., & LaBella, C. R. (2016). Auditory biological marker of concussion in children. Scientific Reports, 6, 39009.
2. Villines, Z. (2021, November 9). ADHD and auditory processing disorder: Difference, diagnosis, and more. Medical News Today. https://www.medicalnewstoday.com/articles/adhd-and-auditory-processing-disorder 

3. Lanzetta-Valdo, B. P., Oliveira, G. A., Ferreira, J. T., & Palacios, E. M. (2017). Auditory Processing Assessment in Children with Attention Deficit Hyperactivity Disorder: An Open Study Examining Methylphenidate Effects. International Archives of Otorhinolaryngology, 21(1), 72–78.