Thursday, October 12, 2023

How Gestures are Being Studied to Find Ways to Improve BCIs

Gesture is part of everyday language; we see it all the time. While we may not realize how much of an impact gesture has on us, it plays an important role in how we learn and comprehend language. Its role in communication has become a popular topic of study among researchers lately, so much so that gesture is now being studied with AI in hopes of improving brain-computer interfaces (BCIs).

In the article "Language Proficiency Impacts the Benefits of Co-Speech Gesture for Narrative Understanding Through a Visual Attention Mechanism," Zielinski and Wakefield asked two questions: whether children, whose language proficiency is weaker than adults', benefit more from gesture in communication because they rely on it to aid comprehension, and whether children are more likely to attend to gesture when they are less proficient in a language. The broader goal of their study was to continue exploring the impact of co-speech gesture on communication, focusing on the listener's language proficiency and its effect on memory. The researchers found that gesture boosts a child's likelihood of remembering information from a story told in the child's weaker language when the gesture matches the speech.

So, what does the brain look like when we gesture? The article "AI Deep Learning Decodes Hand Gestures from Brain Images" describes a study, published in the journal Cerebral Cortex, showing how AI can decode hand gestures from brain images recorded with MEG, a noninvasive imaging method. This study is clearly different from Zielinski and Wakefield's: rather than asking how gestures affect memory and learning of a story, it focuses on training an algorithm to classify the gestures being made.

Magnetoencephalography (MEG) is a noninvasive neuroimaging method used to map brain activity. It works by measuring the magnetic fields produced by the brain's electrical currents, which lets researchers track brain activity in real time. In this study by UC San Diego researchers, 12 participants wore a helmet that sensed the magnetic fields produced by their brains while they were prompted, in random order, to make a rock, paper, or scissors hand gesture. The helmet recorded images of the brain activity while the gestures were made.
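To make that setup a little more concrete, here is a minimal Python sketch of how a continuous MEG recording might be cut into labeled trials, one per gesture cue. The sensor count, sampling rate, and window length are assumptions chosen for illustration; the article does not report these details.

```python
import numpy as np

# Hypothetical recording parameters -- the article only says 12 participants
# wore an MEG helmet, so these numbers are illustrative assumptions.
N_SENSORS = 306      # sensor count on a typical whole-head MEG array (assumed)
SFREQ = 1000         # sampling rate in Hz (assumed)
TRIAL_SECONDS = 1.0  # window of activity kept around each gesture cue (assumed)

GESTURES = {0: "rock", 1: "paper", 2: "scissors"}

def epoch_recording(raw, cue_samples, labels):
    """Cut a continuous recording (sensors x samples) into fixed-length
    trials, one per gesture cue, each paired with its gesture label."""
    window = int(SFREQ * TRIAL_SECONDS)
    trials, kept = [], []
    for cue, label in zip(cue_samples, labels):
        if cue + window <= raw.shape[1]:
            trials.append(raw[:, cue:cue + window])
            kept.append(label)
    return np.stack(trials), np.array(kept)

# Fake continuous data standing in for one participant's 60-second session.
rng = np.random.default_rng(0)
raw = rng.standard_normal((N_SENSORS, 60 * SFREQ))
cues = np.arange(0, 55 * SFREQ, 2 * SFREQ)   # a gesture cue every 2 seconds
labels = rng.integers(0, 3, size=len(cues))  # random rock/paper/scissors

X, y = epoch_recording(raw, cues, labels)
print(X.shape, y.shape)  # (n_trials, 306, 1000) and (n_trials,)
```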

Using a convolutional neural network, a deep learning algorithm, the researchers trained a model to classify the gestures from patterns in the brain-activity images. The results give researchers data that can help make brain-computer interfaces more practical, which could greatly benefit paralyzed people in the future. Together, these two studies show that gesture is an important part of our everyday language, and one day it may help more people communicate more efficiently.
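As a rough illustration of this kind of classifier, below is a minimal PyTorch sketch of a small convolutional network that maps a single-channel brain-activity image to three gesture classes (rock, paper, scissors). The architecture, input size, and layer widths are assumptions; the article does not describe the actual model used in the study.

```python
import torch
import torch.nn as nn

class GestureCNN(nn.Module):
    """Toy convolutional classifier for brain-activity images.

    Everything here (two conv blocks, 64x64 single-channel input,
    hidden width 64) is an illustrative assumption, not the study's model.
    """
    def __init__(self, n_classes: int = 3):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1),  # single-channel image in
            nn.ReLU(),
            nn.MaxPool2d(2),                             # 64x64 -> 32x32
            nn.Conv2d(16, 32, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.MaxPool2d(2),                             # 32x32 -> 16x16
        )
        self.classifier = nn.Sequential(
            nn.Flatten(),
            nn.LazyLinear(64),  # infers the flattened size at first call
            nn.ReLU(),
            nn.Linear(64, n_classes),
        )

    def forward(self, x):
        return self.classifier(self.features(x))

# One fake batch of 8 single-channel 64x64 brain-activity images.
model = GestureCNN()
logits = model(torch.randn(8, 1, 64, 64))
print(logits.shape)  # torch.Size([8, 3]) -- one score per gesture class
```

In practice, a network like this would be trained on the labeled trials with a cross-entropy loss, and its test accuracy would indicate how reliably the gestures can be decoded from the brain images.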


References:  

Zielinski, N., & Wakefield, E. (n.d.). Language Proficiency Impacts the Benefits of Co-Speech Gesture for Narrative Understanding Through a Visual Attention Mechanism.  

Rosso, C. (2023, May 21). AI Deep Learning Decodes Hand Gestures from Brain Images. Psychology Today. https://www.psychologytoday.com/us/blog/the-future-brain/202305/ai-deep-learning-decodes-hand-gestures-from-brain-images

  
