University of Chicago researchers say that they have identified brain regions where speech sounds are perceived as having abstract meaning, rather than as just a stream of sensory input.
Lead researcher Steven Small says the new findings suggest that understanding speech does not simply emerge from lower-level processing of speech sounds but involves a specialized perceptual region. In the study, published in the Cell Press journal Neuron, the researchers asked volunteer subjects to listen to a series of simple speech sounds while watching video of people pronouncing sounds; the spoken sounds either matched the video or did not.
The subjects’ brains were scanned using functional magnetic resonance imaging, in which harmless magnetic fields and radio waves image blood flow, reflecting activity in different brain regions.
The researchers manipulated the sequences of the various combinations of speech sounds and video, which enabled them to distinguish brain regions active in abstract processing of the speech sounds from those responding only to their sensory properties.
On analyzing the results of their experiments, the researchers concluded that two areas within known left-hemisphere speech-processing regions, the pars opercularis and the planum polare, code speech at an abstract level.
The researchers said: “We have shown that there are neurophysiological substrates that code properties of an audiovisual utterance at a level of abstraction that corresponds to the speech category that is ‘heard,’ which can be independent of its sensory properties.”
Source-ANI
SRM/P