Friday, February 27, 2026

The Cost Of Current Concussion Protocol

Every weekend, thousands of kids participate in a multitude of sports, ranging from basketball to field hockey. The outcome of any given game is uncertain, but across so many games, head injuries are almost guaranteed. When a player gets hit in the head, coaches and parents alike look for obvious signs of a concussion: headache, confusion, dizziness, loss of consciousness, and so on. Internal physiological assessment is not the current standard for concussion diagnosis, but should it be? Are we missing crucial indications of brain injury by using physical symptoms as the only standard of diagnosis?


In her recent talk, Dr. Krizman revealed that there are detectable physiological changes, identifiable by the frequency-following response (FFR), in the brains of individuals who have experienced concussions, suggesting that a concussion manifests beyond physical symptoms. Dr. Krizman challenged the contemporary understanding of concussions by showing that even concussions with few reported physical symptoms may be highly detectable in the brain's signature. Are we allowing athletes to return to play while they are still neurologically unsound? More seriously, has modern concussion protocol put athletes at higher risk of traumatic brain injury (TBI)?


Further research conducted by William T. O'Brien and colleagues has identified venous blood levels of glial fibrillary acidic protein (GFAP) and neurofilament light (NfL) as candidate biomarkers of concussion. GFAP is a structural protein and a component of glial cells; following injury to the cell, it leaks out and becomes detectable in blood. GFAP is thus a crucial indicator of astrocyte damage, and high levels reveal profound trauma to astrocytes. NfL, the neurofilament light chain, is also a structural protein, residing in axons, and it serves as a hallmark sign of axonal damage. O'Brien and colleagues used these biological measures to follow athletes with sport-related concussions (SRC) at eight time points over up to 12 weeks.


O'Brien et al. found that GFAP levels spike notably within 24 hours of a concussion, and some athletes showed prolonged GFAP elevation for up to four weeks, indicating that concussions cause almost immediate injury to cells, specifically astrocytes. NfL followed a different time course, with a delayed onset: levels typically peaked between one and four weeks post-concussion, and a small subset of athletes showed ongoing NfL elevation for up to 12 weeks. These athletes had to wait notably longer to return to play. The researchers also related loss of consciousness (LOC) to the measured biomarker levels, finding that athletes who lost consciousness showed greater elevation of both NfL and GFAP.


O'Brien et al. thus identified elevated physiological biomarkers, GFAP and NfL, in concussed individuals. These biomarkers indicate traumatic injury to astrocytes and axons, respectively, and their research reveals distinctive neurological changes that should undoubtedly be considered following sport-related concussion (SRC).


While Dr. Krizman described unique brain signatures, identified via the FFR, in individuals who have had concussions, O'Brien and colleagues found blood-based biomarkers, GFAP and NfL, that persist well beyond recovery from physical symptoms. Both lines of research indicate that cellular injury remains despite reported physical recovery, prompting conversations about current concussion protocol and the safest, most effective route to ensuring full recovery for athletes.



References


Kraus N, Thompson EC, Krizman J, Cook K, White-Schwoch T, LaBella CR. Auditory biological marker of concussion in children. Sci Rep. 2016 Dec 22;6:39009. doi: 10.1038/srep39009. PMID: 28005070; PMCID: PMC5178332.


O’Brien WT, Spitz G, Xie B, et al. Biomarkers of Neurobiologic Recovery in Adults With Sport-Related Concussion. JAMA Netw Open. 2024;7(6):e2415983. doi:10.1001/jamanetworkopen.2024.15983


Thursday, February 26, 2026

TBIs: Shaking up Executive Function and Auditory Processing

We learn more every year about traumatic brain injuries and their long-lasting aftereffects. Dr. Krizman, in her presentation a couple of weeks ago, highlighted that the frequency-following response (FFR) has a distinct neural signature in young children with concussions. She also found that concussed children's brains process the pitch of people's voices more slowly than those of children without traumatic brain injuries. Her research was a powerful demonstration of the lasting impact of traumatic brain injuries on auditory processing. I was interested in this research because we often study how TBI affects cognitive function in the long term, but seeing it from an auditory perspective was fascinating.

Building on this, a paper by Theodoroff et al. (2022) shows that even though concussions and other TBIs have lasting auditory effects, those effects are largely absent from clinical practice: standard concussion screens include no section testing auditory symptoms such as tinnitus, noise sensitivity, or hearing difficulty. A new paper from the TRACK-TBI study, published in the Journal of Head Trauma Rehabilitation, continues Krizman's line of work by examining the long-term effects of untreated auditory symptoms following TBI. Armstrong and colleagues ask: what are the long-term consequences of ignoring auditory symptoms after any TBI, and how does this affect the patient's cognitive function?

Using Krizman's paper as a basis, they observed how the disrupted brainstem response to sound differs from that of people without concussions. The work of Theodoroff and their team highlights the disparity and the lack of clinical knowledge applied to treating the auditory symptoms of TBIs, and Armstrong shows how those symptoms actively affect cognition in the long term.

The TRACK-TBI study enrolled nearly 2700 participants with various TBIs from 18 Level I trauma centers across the country; of these, the researchers analyzed data from 1267 participants. Two weeks post-injury, participants were asked, "Since your injury, has your hearing been worse in either ear?", and approximately 17% responded yes. Six months post-injury, they completed a battery of cognitive tests, assessing executive function with the Trail Making Test (TMT) and processing speed with the WAIS-IV Processing Speed Index.

After analyzing these data, the researchers found that participants with TBI-related hearing impairment at the two-week mark had significantly worse executive function at six months than those without hearing impairment. Those with the most severe presentations, hearing loss in both ears together with intracranial pathology, showed the strongest association between hearing impairment and executive function deficits.

Although a connection was found linking hearing impairment from a TBI to executive function problems, no link emerged with processing speed. This distinction is important because it indicates that the sensitivity is specific to higher-level cognitive functioning, such as the ability to switch between tasks or to inhibit distractions, rather than to general slowing.

         It was interesting to read this work alongside Dr. Krizman's, because her work shows that the brainstem's ability to encode sound, as measured by the FFR, is damaged after a brain injury. Damage to this system makes it more difficult to pick out one sound in a busy environment, meaning that the cortex has to work much harder to differentiate frequencies. The Armstrong paper complements this by showing that many people who experienced auditory symptoms still have executive function issues six months later. In both papers, degraded sensory input makes it more difficult for the brain to carry out higher-order cognitive tasks, such as executive function. In short, more research is needed on the long-term effects of concussions on executive function, and more extensive screening for auditory symptoms after TBI is needed to alleviate symptoms and develop better treatment protocols.

 

The Impact of Gestures on Learning and Comprehension of a Language

     I had the pleasure of listening to Dr. Elizabeth Wakefield talk about her research in a recent seminar, specifically the work in her and Dr. Natalia Zielinski's paper, "Language Proficiency Impacts the Benefits of Co-Speech Gesture for Narrative Understanding Through a Visual Attention Mechanism." In the talk, she described how language and gesture are linked. When people talk, they naturally use hand gestures; gestures produced alongside speech are known as co-speech gestures, and they can help listeners understand. One example from the talk involved a person describing someone putting on jewelry. In the first video, the speaker simply said that someone was putting on jewelry; in the second, the speaker made the motion of putting a ring on a finger while saying the same sentence. The motion gives more context to what is being said, telling the listener which specific piece of jewelry was being put on without the word "ring" ever being spoken.

    In the paper mentioned above, the authors studied Polish-English bilingual children, testing language proficiency to see whether there is a link between gestures and understanding a narrative. Stories were spoken in both languages, some with and some without gestures. The study found that in the children's weaker language (Polish), hand gestures were beneficial: when the children knew the language less well, they relied more on gestures, which helped them understand the story. In English, adding gestures to the narratives had a slight negative impact, likely because the children focused too much on the gestures and not enough on the story itself. In Polish, stories without gestures were remembered worst of all the tested categories. The study focused on children because of their lower language proficiency. Overall, it shows that language proficiency shapes how much gestures help in comprehending a narrative.

    In a recent study, Xiaoyi Huang, Nayoung Kim, and Kiel Christianson looked at pairing gestures with learning a foreign language. Participants were taught words in Mandarin, each paired with a gesture. Some gestures were iconic, such as talking on the phone or drinking, and others were arbitrary, with no prior connection to the word. The thirty participants had no previous experience with Mandarin or any other Chinese language. Eighteen words were presented: six accompanied by iconic gestures, six by arbitrary gestures, and six without gestures. After two sessions, the students took a multiple-choice test in which the instructor spoke each word along with its associated gesture. Students scored eight to ten percent better on words learned with gestures than without. According to researcher Kiel Christianson, "visualizing a gesture with each word creates multiple pathways into the semantics of new words and helps students remember them better." There is a fall-off after about twelve words, beyond which students cannot learn and retain all of the word-gesture pairs. Overall, it is helpful for students learning a new language when the teacher uses hand gestures.


References

Huang, X., Kim, N. and Christianson, K. (2019), Gesture and Vocabulary Learning in a Second Language. Language Learning, 69: 177-197. https://doi.org/10.1111/lang.12326

Zielinski, Natalia, and Elizabeth M. Wakefield. “Language Proficiency Impacts the Benefits of Co-Speech Gesture for Narrative Understanding through a Visual Attention Mechanism.” Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 43, no. 43, 2021, escholarship.org/uc/item/63r5d3qq.

Tuesday, February 24, 2026

The role of auditory processing in concussions, ADHD and APD

         One of the most common brain injuries is the concussion, or mild traumatic brain injury (mTBI). Concussions can result from direct impact, such as a fall or sports collision, or from indirect force, such as whiplash, where the head snaps forward or back. Despite widespread awareness of concussions, they tend to be difficult to diagnose: they do not show up on CT or MRI scans, they lack specific symptoms and can be mistaken for other conditions, and diagnosis relies on subjective self-report, which may be unreliable or biased. It is imperative that concussions be diagnosed, and that an objective test be designed, because of their potential to impact cognition, neurologic function, socio-emotional wellbeing, and academic achievement [1]. Without a diagnosis, concussions may go unnoticed and thus untreated.
     I recently had the pleasure of listening to Dr. Krizman speak during a neuroscience seminar on her phenomenal paper, "Auditory biological marker of concussion in children." The paper discusses how auditory processing, which can be disrupted by concussions, can be used to identify concussion occurrence and severity [1]. The study followed concussed and non-concussed child athletes and measured auditory processing using the frequency-following response (FFR). The FFR can capture the fundamental frequency, harmonics, amplitude (intensity), timing, and response to noise. Encoding of the fundamental frequency (which helps us distinguish who is talking) and of the harmonics (which help us distinguish words) was found to be diminished in concussed patients. The study also followed some of the concussed children from Lurie Children's Hospital over time and found partial recovery. The FFR provides clues about the biological processes disrupted after a concussion; applying these findings allowed the researchers to identify a signature neural profile and opens clinical potential for diagnosing and tracking concussions in patients.
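To make one of these FFR measures concrete, here is a minimal, hypothetical sketch (my own illustration, not the study's analysis pipeline) of how a fundamental frequency could be estimated from a response waveform via its spectrum. The sampling rate, F0, harmonic amplitudes, and noise level are all invented for the example:

```python
import numpy as np

# Toy sketch: estimate the fundamental frequency (F0) of a simulated
# brainstem response from its frequency spectrum. All parameters below
# are illustrative assumptions, not values from the paper.
fs = 2000                        # sampling rate in Hz (assumed)
t = np.arange(0, 1.0, 1 / fs)    # 1 second of signal
f0 = 100.0                       # simulated "speaker" fundamental (assumed)

# Response = fundamental + weaker harmonics + a little noise
sig = (np.sin(2 * np.pi * f0 * t)
       + 0.5 * np.sin(2 * np.pi * 2 * f0 * t)
       + 0.25 * np.sin(2 * np.pi * 3 * f0 * t)
       + 0.1 * np.random.default_rng(0).normal(size=t.size))

spectrum = np.abs(np.fft.rfft(sig))             # magnitude spectrum
freqs = np.fft.rfftfreq(sig.size, d=1 / fs)     # frequency axis in Hz
est_f0 = freqs[np.argmax(spectrum)]             # strongest spectral peak
print(f"estimated F0: {est_f0:.1f} Hz")         # peak lands near 100 Hz
```

A weakened FFR, as described in the talk, would correspond to smaller spectral peaks at F0 and its harmonics, which is what makes the measure quantifiable rather than subjective.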
      Dr. Krizman's talk made me aware of just how expansive auditory processing is. It plays roles in language, reading, filtering, perception, and more. For those reasons, I wanted to know more about the role of auditory processing in other conditions. While searching, I came across an article about attention-deficit/hyperactivity disorder (ADHD), auditory processing disorder (APD), and the link between them [2]. ADHD and APD can both affect attention, which contributes to inaccurate diagnosis of both conditions. But why are they confused? The article discusses a 2017 study [3], "Auditory Processing Assessment in Children with Attention Deficit Hyperactivity Disorder," in which children with ADHD performed worse on auditory processing tests. However, when treated with methylphenidate (common brand names include Ritalin and Concerta), children with ADHD performed as well as children without ADHD [2, 3]. Symptoms of impaired auditory processing can therefore be associated with ADHD, and conversely, APD symptoms can be misinterpreted as attention deficits. Both conditions require careful evaluation.
     From both the seminar talk and the Medical News Today article, I have begun to realize the broader implications of auditory processing (listening behavior) being a direct reflection of neurological conditions in the brain rather than only being involved in hearing. Auditory processing involves auditory memory, attention, distinguishing sounds, blocking background noise, and localizing sound. It spans many neural systems and thus can be implicated in a variety of conditions. If multiple conditions exhibit similar auditory symptoms, clinicians face the challenge of accurate diagnosis. Dr. Krizman's study [1] introduces a biological marker for concussions, which are notoriously difficult to diagnose. It makes me wonder how an auditory biological marker might be applied to other conditions with auditory symptoms, like those mentioned in the Medical News Today article.
 
References:
1. Kraus, N., Thompson, E. C., Krizman, J., Cook, K., White-Schwoch, T., & LaBella, C. R. (2016). Auditory biological marker of concussion in children. Scientific Reports, 6, 39009.
2. Villines, Z. (2021, November 9). ADHD and auditory processing disorder: Difference, diagnosis, and more. Medical News Today. https://www.medicalnewstoday.com/articles/adhd-and-auditory-processing-disorder 

3. Lanzetta-Valdo, B. P., Oliveira, G. A., Ferreira, J. T., & Palacios, E. M. (2017). Auditory processing assessment in children with attention deficit hyperactivity disorder: An open study examining methylphenidate effects. International Archives of Otorhinolaryngology, 21(1), 72–78.


Thursday, December 11, 2025

What we don't know about dreams

I had the good luck to attend a class where Dr. Gabriela Torres spoke about her research on dreaming and what happens in the human brain during dreams. Because of my interest in this subject, I decided to learn more about it and present my findings here. Because dreams could only be studied after awakening, dream science was unreliable and incomplete for most of history. Dr. Torres's research and other studies, however, are changing this by showing that dreams are not closed-off experiences but rather measurable and, more significantly, interactive brain states, which I believe is fantastic.

Speaking in my Neuro 300 seminar class, Dr. Torres presented a study showing that experimenters can use eye movements and facial muscle signals to exchange questions and answers with REM sleepers in real time. This implies that, while completely asleep, the dreaming brain is capable of understanding speech, carrying out mental tasks, and responding purposefully. The research demonstrates that dream cognition is more complex, well-organized, and connected to reality than previously thought.

I became interested in this because, while doing my own research, I found that a complementary study by Horikawa, Tamaki, Miyawaki, and Kamitani (2013) further supports this. They used fMRI and machine-learning models to decode the visual content of dreams directly from brain activity. Because the same visual regions that are active during waking perception also become active during dream imagery, the researchers were able to predict what participants were dreaming about before they woke up. This shows that the brain's neuronal architecture for dream imagery overlaps substantially with that of real perception.
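To illustrate the decoding logic (this is a toy sketch, not Horikawa et al.'s actual model, which used far richer data and methods), imagine training a classifier on activity patterns recorded during waking perception and then applying it to a pattern recorded during sleep. All patterns, categories, and dimensions below are synthetic assumptions:

```python
import numpy as np

# Hypothetical miniature of "dream decoding": train on waking-perception
# patterns, test on a sleep pattern. Everything here is simulated.
rng = np.random.default_rng(1)
n_voxels = 50
# Assume each category evokes a characteristic activity pattern.
templates = {"face": rng.normal(size=n_voxels),
             "house": rng.normal(size=n_voxels)}

def simulate_trials(category, n=20, noise=0.5):
    """Noisy copies of a category's activity pattern (fake fMRI trials)."""
    return templates[category] + rng.normal(scale=noise, size=(n, n_voxels))

# "Training": average waking-perception trials into one centroid per category.
centroids = {c: simulate_trials(c).mean(axis=0) for c in templates}

def decode(pattern):
    """Nearest-centroid prediction for a single activity pattern."""
    return min(centroids, key=lambda c: np.linalg.norm(pattern - centroids[c]))

# "Test": a pattern recorded during sleep that resembles the face template.
dream_pattern = templates["face"] + rng.normal(scale=0.5, size=n_voxels)
print(decode(dream_pattern))  # the classifier labels the dream "face"
```

The key point the sketch captures is that no sleep data are used for training: decoding works only because dream imagery reuses the same activity patterns as waking perception.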

Taken as a whole, these investigations support a convincing conclusion: dreams are not private hallucinations but rather readable, interactive cognitive states. While Konkoly and colleagues (in the study Dr. Torres presented) showed that dreamers can communicate outwardly, Horikawa, Tamaki, Miyawaki, and Kamitani (2013) (in the study I found) showed that researchers can decode dream content from the inside. These findings support the idea that consciousness is not an "on and off" switch but rather a fluid spectrum, with elements of waking perception, memory, and reasoning remaining partially active even during REM sleep. This new science might one day be used to treat PTSD and nightmares, as well as to enhance creativity, memory, and emotional processing during sleep. Together, these investigations forge a link between neural decoding and dialogue, transforming dreams from a mysterious internal experience into a measurable and reachable part of human thought.

 

APA Citations  

Horikawa, T., Tamaki, M., Miyawaki, Y., & Kamitani, Y. (2013). Neural decoding of visual imagery during sleep. Science, 340(6132), 639–642. https://pubmed.ncbi.nlm.nih.gov/23558170/

Konkoly, K. R., Appel, K., Chabani, E., Mangiaruga, A., Gott, J., Mallett, R., Caughran, B., Witkowski, S., Whitmore, N. W., Mazurek, C. Y., Berent, J. B., Weber, F. D., Türker, B., Leu-Semenescu, S., Maranci, J.-B., Pipa, G., Arnulf, I., Oudiette, D., Dresler, M., & Paller, K. A. (2021). Real-time dialogue between experimenters and dreamers during REM sleep. Current Biology, 31(7), 1417–1427.e6. 


Temperature as a Universal Regulator of Sleep

    Humans spend around one-third of their lives sleeping. Sleep plays a huge role in memory consolidation, waste removal, cellular repair, and much more, and poor-quality sleep is associated with problems such as dementia and cardiovascular disease (CVD). Even with sleep playing such a big role in our lives, most of us neglect it, getting about three and a half ultradian sleep cycles per night or even fewer. Sleep is regulated by many factors, with temperature among the biggest. For humans to fall asleep, core body temperature typically must drop by 1–3 °F. The timing of sleep is controlled by a circadian clock: the suprachiasmatic nucleus (SCN) in humans and the lateral posterior clock neurons (LPNs) in Drosophila. These structures are strongly influenced by temperature; as they receive temperature information, they release signals that regulate sleep.

    The paper "The cell-intrinsic circadian clock is dispensable for lateral posterior clock neuron regulation of Drosophila rest-activity rhythms," written by Charlene Y. P. Guerrero and colleagues in Daniel J. Cavanaugh's lab, examines the role of LPNs in Drosophila circadian and sleep regulation. LPNs receive input from thermosensory pathways, which is essential for circadian rhythm regulation. Guerrero reports that reduced LPN excitability causes a slight reduction in sleep, while silencing LPNs reduces total sleep, fragments it, and shortens sleep bouts. The researchers then used CRISPR to delete the clock genes per and tim inside LPNs; these genes normally generate a 24-hour molecular cycle that acts as an internal clock. Flies with the LPN clock genes deleted still received timing information from other clock neurons with functional per and tim cycles and maintained a normal 24-hour rhythm. This shows that LPNs do not keep their own time but instead receive timing signals from other structures, functioning as driven oscillators. Guerrero's article demonstrates that LPNs are important for regulating sleep but do not require their own internal clocks, knowledge that may eventually inform therapeutic strategies for circadian rhythm disorders.
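The idea of a driven oscillator can be sketched with a toy model (my own illustration, not the paper's): a unit with no intrinsic rhythm of its own still shows a 24-hour oscillation when it receives periodic input from upstream clock cells, and goes flat when that input is removed. All time constants and signal shapes below are invented for the example:

```python
import numpy as np

# Toy "driven oscillator": a leaky unit with no internal clock.
# It oscillates only because its input does.
dt = 0.1                                  # hours per simulation step (assumed)
t = np.arange(0, 96, dt)                  # four simulated days
drive = 1 + np.sin(2 * np.pi * t / 24)    # upstream clock input, 24 h period

def simulate(inp, tau=2.0):
    """Leaky integrator: dx/dt = (input - x) / tau. No clock of its own."""
    x = np.zeros_like(inp)
    for i in range(1, len(inp)):
        x[i] = x[i - 1] + dt * (inp[i - 1] - x[i - 1]) / tau
    return x

with_drive = simulate(drive)               # rhythmic input -> rhythmic output
without_drive = simulate(np.ones_like(t))  # constant input -> rhythm lost

# After the first day (transient), the driven unit swings strongly
# while the undriven unit sits flat.
print(round(float(np.ptp(with_drive[240:])), 2))     # large day/night swing
print(round(float(np.ptp(without_drive[240:])), 2))  # essentially zero
```

This mirrors the paper's logic: knocking out the unit's own clock machinery (here, it never had any) does not abolish the rhythm, because the rhythm is inherited from the input; removing the rhythmic input does.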

    In a study titled "Skin deep: enhanced sleep depth by cutaneous temperature manipulation," Roy J. E. M. Raymann and colleagues examined how skin temperature controls sleep quality in humans. Small increases in skin temperature have significant effects on sleep. Raymann et al. argue that adjusting a room's thermostat has limited impact on sleep, while manipulating skin temperature has powerful effects; previous experiments show that even a 0.4–1.0 °C increase in skin temperature affects sleep. The researchers used slight skin warming and adjustments to the bedding microclimate to observe the influence on sleep, recording sleep onset time, deep sleep duration, interruptions, and comfort. Participants also slept in cool and hot rooms to test whether air temperature made a difference. In hot rooms, deep sleep decreased, wakefulness increased, and overall sleep quality worsened. But when skin temperature was raised independently of room temperature, results changed dramatically: participants fell asleep faster, experienced more deep sleep, and slept more restoratively overall. The study concludes that skin temperature plays a larger role in sleep quality than room temperature.

    Together, these findings show that temperature plays a major role in regulating sleep. In Drosophila, LPNs integrate thermosensory information to regulate sleep even without their own internal clocks. In humans, manipulation of skin temperature affects sleep more strongly than room temperature. Both articles highlight that sleep systems depend heavily on how the brain interprets thermal information. As research in this field continues, new strategies and treatments for sleep and circadian rhythm disorders may emerge.


References

1.) Guerrero, C. Y. P., Lam, V. H., Cusumano, P., Van Doren, M., & Cavanaugh, D. J. (2025). The cell-intrinsic circadian clock is dispensable for lateral posterior clock neuron regulation of Drosophila rest–activity rhythms. Neurobiology of Sleep and Circadian Rhythms. https://doi.org/10.1016/j.nbscr.2025.100198

2.) Raymann, R. J. E. M., Swaab, D. F., & Van Someren, E. J. W. (2008). Skin deep: Enhanced sleep depth by cutaneous temperature manipulation. Brain, 131(2), 500–513. https://doi.org/10.1093/brain/awm315