
Wednesday, December 15, 2021

BCI Using RNNs and ECoG Devices to Produce Sounds Related to the Computational Simulation of the Human Auditory Pathway in Understanding Phonetic Acquisition

The auditory system is a complex and fragile network of structures working together to perceive, process, and encode sound. Deciphering different phonetic sounds and classifying those units has long been a goal of many researchers. In the research article “Phonetic acquisition in cortical dynamics, a computational approach,” Dematties et al. examine how linguistic units like phonemes are encoded and classified from the complex acoustic streams found in speech. Human infants, after all, can pick individual words out of a continuous audio stream by recognizing patterns in speech. The researchers built a corpus of 500 words with different sounds and lengths and fed it to a computational model, the CSTM, which simulates cortical tissue responding to the respective sounds. The model includes processes that simulate the growth of synapses on distal dendritic branches, so that synapses form only on pyramidal cells, biasing the activation of each respective neuron. This approach lets the researchers control variation in levels of reverberation, noise, and pitch. Furthermore, multiple algorithms activate the model's auditory neurons so that the correct phonemes, words, and sounds can be encoded. The research concludes that a computational simulation grounded in neurophysiological and neuroanatomical data on the human auditory pathway can mimic the incidental phonetic acquisition observed in human infants, a key mechanism in early language learning. The authors propose that these algorithms could be used to create more efficient and capable AI speech generators and programs that recognize or translate speech.
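The paper's full model is much richer than this, but one ingredient the summary above leans on — only a small, sparse subset of cortical units activating for each input — can be sketched in a few lines. Everything below (the dimensions, the random weights, and the k-winners-take-all rule standing in for sparse pyramidal-cell activation) is a hypothetical illustration, not the authors' implementation:

```python
import numpy as np

rng = np.random.default_rng(0)

def kwta(x, k):
    """k-winners-take-all: keep the k most active units, zero the rest.
    A crude stand-in for sparse activation of pyramidal cells."""
    out = np.zeros_like(x)
    winners = np.argsort(x)[-k:]   # indices of the k largest activations
    out[winners] = 1.0
    return out

# Toy "spectro-temporal" input: 64 frequency-like features per time frame.
n_features, n_columns, k_active = 64, 256, 8
W = rng.standard_normal((n_columns, n_features))  # random feed-forward weights

frame = rng.standard_normal(n_features)           # one input frame
sdr = kwta(W @ frame, k_active)                   # sparse distributed code

print(int(sdr.sum()))  # exactly k_active units are on
```

The payoff of coding inputs this way is that each frame maps to a sparse binary pattern, and similar inputs tend to share active units — the kind of representation the summary describes phoneme classification being built on.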

Through the use of new technology and AI algorithms such as those Dematties et al. produced, neurologists can create brain-computer interfaces (BCIs) for people who cannot speak, translating neurological and cortical language signals into synthetic speech. Dematties' work could extend this research to achieve similar results in deaf patients as well. To this end, Anumanchipalli et al. used a two-stage decoding approach: recurrent neural networks (RNNs) first translate neural signals recorded by an electrocorticography (ECoG) device into representations of the movements of vocal-tract articulators, which are then decoded into spoken sentences. The two-stage approach resulted in less acoustic distortion than direct decoding of acoustic features. Even so, “If massive data sets spanning a wide variety of speech conditions were available, direct synthesis would probably match or outperform a two-stage decoding approach” (Pandarinath et al., 2019). With algorithms like those of Dematties et al., direct synthesis becomes a greater possibility for AI in speech and auditory processing.
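The two-stage pipeline described above — neural features decoded first into articulator kinematics, then into acoustic features — can be sketched with toy Elman RNNs. All dimensions and weights here are hypothetical and untrained; the real decoder in Anumanchipalli et al. was trained on actual ECoG recordings, and this sketch only shows the shape of the data flow:

```python
import numpy as np

rng = np.random.default_rng(1)

def rnn_forward(x, W_in, W_rec, W_out):
    """Minimal Elman RNN forward pass over a (time, features) sequence."""
    h = np.zeros(W_rec.shape[0])
    outputs = []
    for x_t in x:
        h = np.tanh(W_in @ x_t + W_rec @ h)   # recurrent hidden state
        outputs.append(W_out @ h)
    return np.array(outputs)

# Hypothetical sizes: 100 time steps, 128 ECoG channels,
# 33 articulator features, 32 acoustic features.
T, n_ecog, n_hidden, n_artic, n_acoustic = 100, 128, 64, 33, 32

# Stage 1: ECoG features -> vocal-tract articulator kinematics
W1_in  = rng.standard_normal((n_hidden, n_ecog)) * 0.1
W1_rec = rng.standard_normal((n_hidden, n_hidden)) * 0.1
W1_out = rng.standard_normal((n_artic, n_hidden)) * 0.1

# Stage 2: articulator kinematics -> acoustic features
W2_in  = rng.standard_normal((n_hidden, n_artic)) * 0.1
W2_rec = rng.standard_normal((n_hidden, n_hidden)) * 0.1
W2_out = rng.standard_normal((n_acoustic, n_hidden)) * 0.1

ecog = rng.standard_normal((T, n_ecog))      # fake neural recording
kinematics = rnn_forward(ecog, W1_in, W1_rec, W1_out)
acoustics  = rnn_forward(kinematics, W2_in, W2_rec, W2_out)

print(kinematics.shape, acoustics.shape)     # (100, 33) (100, 32)
```

The design point of staging is visible even in the toy version: the intermediate articulator representation gives the second stage a physically constrained input, which is what reduced acoustic distortion relative to decoding acoustics directly.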

Furthermore, the development of BCIs built on AI and computational analytical algorithms has opened up new uses for this technology, including the control of arm and hand movements in humans with paralysis. Trials have successfully demonstrated that rapid communication, control of robotic arms, and restoration of sensation and movement in paralyzed limbs are possible with these BCIs.



References


Dematties, D., Rizzi, S., Thiruvathukal, G. K., Wainselboim, A., & Zanutto, B. S. (2019). Phonetic acquisition in cortical dynamics, a computational approach. PLoS ONE, 14(6), e0217966. https://doi.org/10.1371/journal.pone.0217966


Anumanchipalli, G. K., Chartier, J., & Chang, E. F. (2019). Speech synthesis from neural decoding of spoken sentences. Nature, 568, 493–498. https://doi.org/10.1038/s41586-019-1119-1


Friday, October 22, 2021

There Are More Neural Circuits for Inference-Making Than Was Previously Thought

    Since the dawn of time, organisms have been making snap decisions based on what they can infer from the environment around them. Humans are no exception, as our world is constantly changing around us. New research has looked at how we make those decisions based on what we can infer. In "Neural circuits for inference-based decision-making," Wang and Kahnt examine which areas of the brain could be used for inference during the learning of associations. They found three areas that play a role in inference-making: the orbitofrontal cortex, the hippocampus, and the amygdala. Circuits from the orbitofrontal cortex to the hippocampus seem to be used for inferring the value of outcomes. Wang and Kahnt's research has given us a glimpse into which parts of our brain make snap decisions involving inferences.
    In "A cortical circuit mechanism for structural knowledge-based flexible sensorimotor decision-making," Liu et al. suggested that another area of the brain is involved in inference-making: the auditory cortex, which they showed encodes information involved in stimulus categorization. The researchers used two-photon imaging to find the area of the auditory cortex responsible for stimulus categorization. This area receives projections from the orbitofrontal cortex discussed above, and these projections played an important role specifically in stimulus re-categorization, which allows for behavioral flexibility when it comes to inferring. The same area does not, however, seem to play a role in discriminating between stimuli.
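The kind of flexibility described in the categorization task above can be illustrated with a toy rule: the stimuli keep their physical identities, but shifting the category boundary re-maps some of them to the other category. The tone frequencies and boundary values below are hypothetical, not taken from the study:

```python
import numpy as np

def categorize(freqs_khz, boundary_khz):
    """Category choice: 1 if the tone is above the boundary, else 0."""
    return (np.asarray(freqs_khz) > boundary_khz).astype(int)

tones = np.array([6, 8, 10, 14, 18, 24])   # kHz, hypothetical stimulus set

# Initial rule: category boundary at 12 kHz
before = categorize(tones, boundary_khz=12)

# Re-categorization: the boundary shifts, so the same stimuli
# keep their identities but some change category membership.
after = categorize(tones, boundary_khz=9)

print(before.tolist())  # [0, 0, 0, 1, 1, 1]
print(after.tolist())   # [0, 0, 1, 1, 1, 1]
```

Note that the 10 kHz tone flips category while the animal's ability to tell 10 kHz from 8 kHz is untouched — mirroring the dissociation above between re-categorization (affected by orbitofrontal projections) and discrimination (not affected).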
    The finding of another brain area that affects our inferencing abilities has a couple of possible implications. One is that all of our senses have circuits that allow for stimulus categorization, which would help us keep all of the stimuli straight inside our heads. Another is that organisms evolved multiple circuits for inference so that we have other ways of processing in case one gets damaged. This redundancy would also help us process information faster, which makes sense, as sometimes the ability to infer means the difference between life and death. The possibility that other neural circuits are used for inference-making should be looked at more in the future. Who knows, we might be able to speed up our inference-making even more.

Works Cited

Liu, Y., Xin, Y., & Xu, N.-L. (2021). A cortical circuit mechanism for structural knowledge-based flexible sensorimotor decision-making. Neuron. https://www.cell.com/neuron/fulltext/S0896-6273(21)00280-4

Wang, F., & Kahnt, T. (2021). Neural circuits for inference-based decision-making. Current Opinion in Behavioral Sciences, 41, 10–14. https://doi.org/10.1016/j.cobeha.2021.02.004