It is remarkable how much the brain does for us. It lets us experience the world around us through our senses. Yet one sense we often take for granted is hearing. Without it, we could not listen to music, communicate efficiently, be warned of potential danger, or tell where a sound is coming from. And if we could not rely on vision to see and localize objects, we would depend primarily on our hearing.
How are we able to localize a sound? Because the head is bilateral, with an ear on each side, a sound from a given position reaches one ear slightly earlier than the other. This time difference, the interaural time difference (ITD), is detected and processed to give the direction of the sound source. The difference in sound pressure between the two ears, the interaural level difference (ILD), is used alongside it to find the direction of the source. According to research conducted by Dr. Dye and his colleagues, echoes have a masking effect on the original source when the interval between the two falls between 8 and 64 ms. The echoes themselves are treated as new sources of sound, which demands processing resources; the original source is essentially discarded so that the brain can process these "new" sources instead.
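To get a feel for how small the ITD really is, here is a minimal sketch using the Woodworth spherical-head approximation; the head radius and speed of sound are illustrative values I am assuming, not parameters from Dr. Dye's study.

    import math

    def itd_woodworth(azimuth_deg, head_radius_m=0.0875, speed_of_sound=343.0):
        """Approximate interaural time difference (seconds) for a source at the
        given azimuth, using the Woodworth spherical-head model. The default
        head radius and speed of sound are illustrative, not measured values."""
        theta = math.radians(azimuth_deg)
        # Extra path length around a rigid sphere: a * (theta + sin(theta))
        return head_radius_m * (theta + math.sin(theta)) / speed_of_sound

    # A source 90 degrees to one side gives roughly 0.66 ms of delay, which is
    # about the largest ITD an average adult head can produce.
    print(f"{itd_woodworth(90) * 1000:.2f} ms")

Even that maximum delay is well under a millisecond, which is part of what makes the brain's localization machinery so impressive.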
So how can someone use this information? Many people who have lost their sight rely on their hearing to map out their surroundings. Some develop it to the point of using echolocation. One notable person who does is Daniel Kish. He was born with bilateral retinoblastoma, a cancer of the eye, and both of his eyes were removed when he was an infant. As time went on, he learned to move around his environment by making clicking sounds with his tongue. Each click creates a new source of sound in his mouth, and it also produces echoes that bounce off the environment and back to his ears. Because different surfaces absorb sound to different degrees, the echoes themselves are treated as new sources of sound. The resulting patterns give him relative information about the distance of whatever the echoes are bouncing off.
Echolocation works especially well for blind people because of their heightened sense of hearing. Dr. Dye's research showed that the echo dominates over the source of the sound when the interval between them falls between 8 and 64 ms. Since the source here is the click coming from the individual's own mouth, right next to the ears, it is actually beneficial for the echoes to be processed over the source. The echoes therefore provide relative information about the depth of the environment, helping the individual stay spatially aware.
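As a rough illustration of the distances involved, the delay between a click and its echo maps onto the distance of the reflecting surface through the round trip the sound makes. The short sketch below assumes a nominal speed of sound in air; under that assumption, the 8-64 ms window from Dye et al. corresponds to surfaces roughly 1.4 to 11 meters away.

    SPEED_OF_SOUND = 343.0  # m/s, nominal value for air at room temperature

    def echo_distance_m(delay_s, speed_of_sound=SPEED_OF_SOUND):
        """Distance to a reflecting surface, given the click-to-echo delay.
        The click travels out and back, so the one-way distance is half
        the round-trip path."""
        return speed_of_sound * delay_s / 2

    # The 8-64 ms interval from Dye et al. maps to surfaces about 1.4-11 m away.
    for delay_ms in (8, 64):
        print(f"{delay_ms} ms -> {echo_distance_m(delay_ms / 1000):.1f} m")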
References:
"Human Echolocation: Using Tongue-clicks to Navigate the World." BBC News. BBC, n.d. Web. 17 Oct. 2015.
Dye, Raymond H., Christopher A. Brown, José A. Gallegos, William A. Yost, and Mark A. Stellmack. "The Influence of Later-Arriving Sounds on the Ability of Listeners to Judge the Lateral Position of a Source." Journal of the Acoustical Society of America 120.6 (2006): 3946-3956.
Image:
http://media1.s-nbcnews.com/j/MSNBC/Components/Photo/_new/090707_ear-echolocation.grid-6x2.jpg