
Musicians hear songs when they read music, non-musicians seek visual patterns

In some cases, brain training actually changes our brains.


A new study published in PNAS uses brain scans of musicians and non-musicians to demonstrate that humans undergo what's called training-related neuroplasticity: training in music fundamentally changes our brains.

Recent advances in neuroscience have allowed scientists to examine what’s termed multi-sensory integration. Specific networks of neurons have been linked to senses like vision and hearing. Multi-sensory integration involves making sense out of input from several of these systems. It’s required for humans to interact with and interpret their surroundings.

For a musician, reading music notation is an activity that combines auditory, visual, and motor information. Consequently, it's a useful activity for studying the interaction of multiple senses. This study examined the cortical network that integrates visual and auditory processing using magnetoencephalography (MEG).

MEG is a non-invasive neuroimaging technique that records the magnetic fields produced by the brain's electrical currents. The brain, like a computer, functions by relaying electrical signals, and those currents in turn generate tiny magnetic fields. By recording these fields from outside the skull, MEG allows scientists to map brain activity as populations of neurons are activated.
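For readers curious what "mapping brain activity" from an MEG recording looks like in practice, here is a minimal sketch using the open-source MNE-Python library. The study does not describe its analysis pipeline in these terms, and the file name, trigger channel, and event label below are placeholders, but filtering a recording and averaging it around stimulus events are typical first steps in MEG work.

import mne

# Load a raw MEG recording (FIF is a common MEG file format).
# The file name is a placeholder, not data from the study.
raw = mne.io.read_raw_fif("subject01_raw.fif", preload=True)

# Band-pass filter to remove slow drifts and high-frequency noise.
raw.filter(l_freq=1.0, h_freq=40.0)

# Find stimulus triggers recorded alongside the MEG channels.
events = mne.find_events(raw, stim_channel="STI 014")

# Cut the continuous recording into short epochs around each stimulus,
# then average them so random noise cancels and the evoked field remains.
epochs = mne.Epochs(raw, events, event_id={"note_onset": 1},
                    tmin=-0.2, tmax=0.5, baseline=(None, 0))
evoked = epochs["note_onset"].average()

# Plot the averaged magnetic field measured at the sensors.
evoked.plot()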

Subjects were asked to read music while their brains were being scanned using MEG. The behavioral results of the study indicated that, as expected, the musicians' training was linked to higher performance in this task.

The MEG analysis allowed the researchers to identify the cortical networks involved in multi-sensory perception. The results suggest that multi-sensory integration engages cortical areas spread across the brain rather than a single dedicated region.

The data indicate that, while reading music, musicians engage a part of the brain adept at detecting deviations from an auditory pattern, whereas non-musicians rely on visual cues. In other words, non-musicians lean on visual processing, while musicians hear the corresponding auditory information conveyed by the notes on the page.

These findings present evidence that connectivity within the brain is reorganized according to expertise, indicating that humans experience what’s called “training-related neuroplasticity”—our brains are fundamentally changed when we choose to specialize and train in a specific area of expertise.

The study makes some important contributions. It imaged the functional network underpinning audiovisual integration, as well as the effect of musical expertise on reorganizing this network. The MEG data showed that non-musicians rely heavily on processing visual cues when given sheet music.

By contrast, the musicians used a large-scale cortical network including the temporal areas and the left inferior frontal gyrus. This finding indicates that their training fosters the development of this network, forming a dynamic system for integrating information from multiple senses.

So new students learning music or a similar skill can take heart. Through sufficient training, the brain will adapt to these new skills by making adjustments to its very structure.

PNAS, 2015. DOI: 10.1073/pnas.1510662112
