Locating sound is not so simple
By David Welsh
It appears the human brain adopts different strategies for locating sounds depending on their frequency, rather than the single method, based on the difference in a sound's time of arrival at each ear, that was previously thought to explain this ability. Humans behave like small mammals when tracing the source of a low-pitched sound but like birds for higher frequencies, according to a study at University College London (UCL).
The researchers have devised a new model of how the human brain tracks sound, one that could eventually help engineers develop technology for tracking sound sources in noisy environments and help restore "spatial hearing" in deaf people fitted with cochlear implants.
The study, published recently in the journal Nature, was funded by the UK's Medical Research Council. UCL's Dr David McAlpine and Nicol Harper asked volunteers to wander the streets of London wearing microphones in their ears. The microphones recorded the difference in the time at which sounds arrived at each ear, for the range of noises that people typically encounter in the city.
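The article does not describe the researchers' analysis, but the underlying measurement is easy to picture: the time difference is the lag at which the two ear signals line up best. The Python sketch below illustrates the idea with invented numbers (a 500Hz tone and a 300-microsecond delay); it is an illustration of the principle, not the study's method.

```python
import numpy as np

def estimate_itd(left, right, fs):
    """Return the interaural time difference in seconds, positive when
    the sound reaches the left ear first, by finding the lag at which
    the two ear signals are most strongly cross-correlated."""
    corr = np.correlate(left, right, mode="full")
    lags = np.arange(-(len(right) - 1), len(left))
    return -lags[np.argmax(corr)] / fs

# Toy binaural "recording": a 500 Hz tone reaching the right ear ~300 µs late.
fs = 96_000                              # high rate, to resolve tens of microseconds
t = np.arange(0, 0.05, 1 / fs)
left = np.sin(2 * np.pi * 500 * t)
right = np.roll(left, int(round(300e-6 * fs)))

print(f"estimated ITD: {estimate_itd(left, right, fs) * 1e6:.0f} µs")  # ~300 µs
```

At a 96kHz sample rate one sample is roughly 10 microseconds, which is about the smallest difference the article says the brain can detect; a real measurement system would sample faster or interpolate around the correlation peak.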
Although it was already known that animals and humans use small differences in the arrival time of sound at each ear to locate its source, the UCL study found that the human brain adopts a strategy similar to a barn owl's for sound pitches above middle C (about 262Hz) and a gerbil's for those below.
Dr McAlpine said: "For animals and humans, locating the source of a sound can mean the difference between life and death, such as escaping a pursuer or crossing a busy street. Our study suggests that the brain adopts an efficient strategy for doing this, adapting to different frequencies, or pitches, of sound.
"Knowing how the brain creates a sense of sound space is the first step to recreating spatial hearing in the deaf. Recent advances in cochlear implants allow people to have implants in both ears, with the potential to restore spatial hearing."
For more than 50 years, a single model has been used to explain how brain cells represent the time difference between the ears. This "classic" model assumes that specific brain cells are allocated to specific time differences, with the relevant cells firing according to the direction a sound is coming from. The human brain can detect differences in the arrival time of a sound at the two ears of about 10 millionths of a second, around 100 times shorter than the duration of the electrical impulses that carry information within the brain.
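One long-standing way of picturing this classic model is the textbook delay-line account: each cell has a built-in internal delay, and the cell whose delay exactly cancels the actual arrival-time difference responds most, so the identity of the most active cell encodes direction. The sketch below is a caricature of that idea, not a neural simulation, and all of its numbers are invented.

```python
import numpy as np

fs = 96_000
t = np.arange(0, 0.05, 1 / fs)
left = np.sin(2 * np.pi * 500 * t)
right = np.roll(left, int(round(300e-6 * fs)))    # sound leads at the left ear

# One "cell" per candidate time difference: an internal delay on the
# left-ear input realigns the two signals, and the averaged product of
# the inputs stands in for counting coincident spikes.
preferred_itds = np.linspace(-600e-6, 600e-6, 25)  # candidates at 50 µs spacing
responses = [np.mean(np.roll(left, int(round(itd * fs))) * right)
             for itd in preferred_itds]

best = preferred_itds[int(np.argmax(responses))]
print(f"most active cell prefers {best * 1e6:.0f} µs")  # ~300 µs: direction is read off "which cell"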
Because different animals need to detect sounds relevant to their own environment, their brain cells shift their tuning until they code most accurately for sounds the animal is likely to encounter. Recordings from the brains of barn owls - a species that hunts at night using only sound - appear to confirm this. But the classic model could not account for recent evidence that the brain cells of small mammals appear to respond most to time differences that the animal is never likely to hear.
The new sound-location model developed by Nicol Harper in Dr McAlpine's lab explains this anomaly. Small mammals such as gerbils and guinea pigs track mainly low-pitched sounds, and, surprisingly, their ability to do so is enhanced when their brain cells are organised to respond most to time differences outside the range the animal naturally encounters.
This strategy does not suit higher frequencies, however. For higher-pitched sounds, barn owls' brains follow the classic model, with cells firing most for time differences within the animal's natural range. Human brains appear to "pick and mix" between the two strategies, depending on the frequency of the sound.
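The article gives no figures for where this crossover lies, but a rough back-of-the-envelope version of the argument (an assumption of this sketch, not a calculation from the study) goes like this: a cell distinguishes nearby time differences best on the steep slope of its tuning curve, and for roughly sinusoidal tuning that slope sits a quarter of a cycle from the peak, so centring the slope on zero pushes the preferred time difference out to about 1/(4f) seconds. At low frequencies that is further than any time difference a small head can produce:

```python
# Quarter-cycle rule of thumb: put the steep, informative slope of a
# cell's tuning curve over zero time difference, and its *peak* lands
# at roughly 1/(4f).  The head-size limits below are approximate
# textbook figures assumed for illustration; they are not from the article.
ranges_us = {"human": 700, "gerbil": 135}   # largest natural ITD, in µs

for f_hz in (200, 500, 2_000):
    peak_us = 1e6 / (4 * f_hz)              # preferred ITD in microseconds
    verdict = ", ".join(
        f"{name}: {'outside' if peak_us > limit else 'inside'} range"
        for name, limit in ranges_us.items())
    print(f"{f_hz:>5} Hz -> preferred ITD ≈ {peak_us:4.0f} µs ({verdict})")
```

On these rough numbers, a gerbil listening at 500Hz does best with cells tuned beyond its natural range, while at 2,000Hz the owl-like, within-range arrangement works, matching the frequency-dependent "pick and mix" the study describes.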
Dr McAlpine hopes the findings will help engineers to develop technology to a standard similar to that of the human brain. Current sound-tracking devices work well in quiet places but suffer considerably in the sort of noisy environments in which humans have little trouble following a conversation.
London Press Service