Sounds and images share a similar neural code in the human brain, according to a new Canadian study. In the online edition of the Proceedings of the National Academy of Sciences (PNAS), scientists from the Université de Montréal and the Montreal Neurological Institute at McGill University explain how the same neural code in the brain allows people to distinguish between different types of sounds, such as speech and music, or different images.
Participants were recruited to undergo functional magnetic resonance imaging (fMRI), a non-invasive form of brain mapping, used here to determine how the brain recognizes different characteristics in musical instruments, words from conversations or environmental sounds. Subjects underwent an exhaustive three hours of fMRI exams to provide precise information about how the brain reacts when different sounds are played.
“It turns out that the brain uses the same strategy to encode sounds as it uses to encode different images,” explains lead author Marc Schönwiesner, a Université de Montréal psychology professor.
“This may make it easier for people to combine sounds and images that belong to the same object, such as the dribbling of a basketball.”
The next step for the researchers is to determine exactly how the brain distinguishes between rock drum beats and the strings of a symphony, or between a French conversation and an English one. “Our goal is to disentangle exactly how the brain extracts these different types of sounds. This step may eventually let us reconstruct a song that a person has heard according to the activity pattern in their brain,” explains Dr. Schönwiesner, who is also a member of the International Laboratory for Brain, Music and Sound Research (BRAMS), a joint Université de Montréal and McGill University think-tank on music and the mind.
As scientists advance in decoding brain activation patterns, says Dr. Schönwiesner, mind-boggling applications can be envisaged. “If researchers can reconstruct a song a person has heard according to an fMRI reading, we're not far from being able to record brain patterns during sleep and reconstruct dreams,” he predicts. “That would be really cool, although this possibility is decades of research away.”
About the study:
The article “Spectro-temporal modulation transfer function of single voxels in the human auditory cortex measured with high-resolution fMRI,” published in the Proceedings of the National Academy of Sciences, was authored by Marc Schönwiesner of the Université de Montréal and Robert Zatorre of the Montreal Neurological Institute at McGill University.
Partners in research:
This study was funded by the Canadian Institutes of Health Research and the German Academy of Sciences.
On the Web:
• Proceedings of the National Academy of Sciences journal
• Université de Montréal
• Montreal Neurological Institute and Hospital
• McGill University
For more information, please contact:
International press attaché
Université de Montréal