Sound Separated into Sound Frequencies

It has long remained a mystery how, even in loud environments - like a factory with rattling machines or a party with music and a cacophony of people talking - we can hear the voice of our conversational partner. What is clear is that the brain chooses what to hear.

Neurophysiologists still do not fully understand how we do this. They do know, however, that the cochlea is mapped onto certain areas of the brain and that, like the retina's mapping, this mapping is point-for-point. To interpret our environment, the brain must separate sound into its constituent frequencies, which means that particular sound frequencies best activate particular groups of neurons in the auditory cortex. Scientists have used electrophysiological and anatomical studies to determine which areas of the brain are responsible for which frequencies - but mainly in animals such as the macaque monkey.
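As a signal-processing analogy only - not a model of how the cortex itself does it - the short Python sketch below decomposes a mixture of two tones into its constituent frequencies with a Fourier transform; the sample rate and tone frequencies are arbitrary example values.

```python
import numpy as np

# Illustrative only: decompose a mixture of two tones into its
# constituent frequencies with a discrete Fourier transform.
# Sample rate and tone frequencies are arbitrary example values.
fs = 8000                                   # sampling rate in Hz
t = np.arange(0, 1.0, 1.0 / fs)             # one second of samples

# A "sound mixture": 440 Hz and 1760 Hz tones (two octaves apart)
signal = np.sin(2 * np.pi * 440 * t) + 0.5 * np.sin(2 * np.pi * 1760 * t)

spectrum = np.abs(np.fft.rfft(signal))
freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)

# Report the two strongest frequency components
peaks = freqs[np.argsort(spectrum)[-2:]]
print(sorted(peaks.tolist()))               # -> [440.0, 1760.0]
```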

These kinds of studies are only rarely conducted on humans; much of our knowledge therefore comes from work with animals. Neurologists primarily use functional magnetic resonance imaging (fMRI) to see "through" the human brain - including the auditory cortex. Looking at hearing, they saw a pattern of activity comparable to that of monkeys. "But the comparisons were indirect," says Christopher Petkov, who led the research at the Max Planck Institute. Until now, no fMRI had been conducted on the auditory cortex of monkeys for comparison. "We have now closed that gap," he explains. The scientists have compared the results that various methods produce in the macaque auditory cortex, and these can now be linked to human imaging performed with the same technique. They can also investigate more thoroughly to what extent a monkey’s neuronal auditory centres resemble - and differ from - humans’. This will help advance research into how the primate brain separates sound mixtures in our typically noisy listening environment.

In the new fMRI study, the scientists went beyond identifying individual auditory cortex fields (ACFs), which earlier studies had predicted. They also created frequency maps for most of these fields. At first they mapped several ACFs, and eventually a total of eleven, organised like a mosaic on the surface of the brain. They observed a periodic pattern: a topographic preference for certain frequencies that either increases or decreases as one progresses across a field. In certain neighbouring fields the frequency gradient runs in exactly the opposite direction, revealing many mirror reversals in the mosaic pattern. Each sound frequency can thus be found in each ACF. Petkov explains that "in the context of such similar organization for so many fields, certainly different fields have different tasks, but we are only beginning to understand what those differences are."
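To make the mirror-reversal idea concrete, here is a toy sketch - invented values, not the study's data - in which the preferred frequency rises across one field and falls across its neighbour, so the shared border shows up as a reversal of the gradient.

```python
import numpy as np

# Toy illustration (invented values) of a tonotopic gradient with a
# mirror reversal between two neighbouring fields.
n_positions = 10                                 # sample points across the cortical surface
low_khz, high_khz = 0.25, 16.0                   # example range of preferred frequencies

# Field A: preferred frequency rises from low to high across the field
field_a = np.geomspace(low_khz, high_khz, n_positions)
# Field B: the same gradient runs in the opposite direction (mirror reversal)
field_b = field_a[::-1]

for pos, (fa, fb) in enumerate(zip(field_a, field_b)):
    print(f"position {pos}: field A prefers {fa:6.2f} kHz, field B prefers {fb:6.2f} kHz")
```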

The researchers have, however, already divided the ACFs into two groups, using hints from electrophysiological work in these primates, with each group responding best to a different kind of sound signal. Three of the fields, which together form a kind of "core" of the auditory cortex, react to the individual frequencies of simple sounds such as tones. The other eight - including newly described ones - respond better to sounds that are a mixture of different frequencies, like many of the sounds in our environment. These eight ACFs enclose the three core fields like a belt.
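As a rough illustration of the two stimulus types described above - a single-frequency tone versus a mixture of frequencies - the following sketch synthesises one of each; the frequencies, duration, and sampling rate are arbitrary example choices.

```python
import numpy as np

# Illustrative versions of the two stimulus types described above:
# a pure tone (single frequency) and a mixture of frequencies.
# All values are arbitrary example choices.
fs = 16000                                   # sampling rate in Hz
t = np.arange(0, 0.5, 1.0 / fs)              # half a second of samples

pure_tone = np.sin(2 * np.pi * 1000 * t)     # a simple 1 kHz tone

components_hz = [500, 1200, 2700, 4100]      # several frequencies mixed together
mixture = sum(np.sin(2 * np.pi * f * t) for f in components_hz) / len(components_hz)

print(pure_tone.shape, mixture.shape)        # both are 8000-sample waveforms
```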

The pitch pattern within each individual ACF was not as finely differentiated as, for example, the keys on a piano keyboard. The organisation of the topography could best be observed when sounds lay four octaves apart from one another. Petkov explains that "this is due to the conditions necessary for the imaging technique." To see clear signals at all with fMRI, the scientists had to present tones that were louder than the soft test tones commonly used in electrophysiological studies. "Larger and larger areas of the auditory cortex become active when we do this, but our challenge was to preserve the broad topography by not presenting sounds too loudly," Petkov explains. This was an interesting observation for the Max Planck researchers, because noise affects the auditory cortex, leading to hearing loss, which probably also disrupts such organised patterns in the brain. Now that many of these fields can be functionally identified, studies can focus on how their responses change with hearing loss and on how the functionality of these regions might be restored.
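For reference, each octave doubles the frequency, so tones four octaves apart differ in frequency by a factor of 2^4 = 16; the 250 Hz starting tone below is just an example value.

```python
# Each octave doubles the frequency, so a tone four octaves above an
# example 250 Hz tone lies at 250 * 2**4 = 4000 Hz.
base_hz = 250.0
octave_series = [base_hz * 2 ** n for n in range(5)]
print(octave_series)                          # [250.0, 500.0, 1000.0, 2000.0, 4000.0]
print(octave_series[-1] / octave_series[0])   # 16.0, i.e. a 16-fold difference
```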

Source: Eurekalert

