Functional MR imaging (fMRI) shows areas of heightened activity in the brains of jazz musicians as they improvise, introducing and modifying melodies in response to each other's ideas, according to a new study published online February 19 in PLOS One.
Researchers from Johns Hopkins University School of Medicine found increased activation in brain areas traditionally associated with spoken language and syntax. However, musical improvisation lessened activity in areas that process the meaning of spoken language (i.e., semantics).
The results suggest that brain regions that process syntax -- or the structure of words and phrases in sentences -- are not limited to spoken language. Rather, the brain uses the syntactic areas to process communication in general, whether through language or music, noted senior author Dr. Charles Limb, an associate professor in the department of otolaryngology/head and neck surgery. Limb is also a musician and holds a faculty appointment at the Peabody Conservatory.
"These findings support the hypothesis that musical discourse engages language areas of the brain specialized for processing of syntax, but in a manner that is not contingent upon semantic processing," wrote lead author Gabriel Donnay and colleagues. "Therefore, we argue that neural regions for syntactic processing are not domain-specific for language, but instead may be domain-general for communication" (PLOS One, February 19, 2014).
Language vs. music
Until now, studies of how the brain processes auditory communication between two people have been performed only in the context of spoken language, according to Limb. The combination of musical improvisation and fMRI "provides a means of investigating the neurobiology of interactive musical communication as it occurs outside of spoken language," the authors wrote.
The study included 11 right-handed, healthy male musicians with a mean age of 38.8 years (range, 25 to 58 years). They were all professional musicians and highly proficient in jazz piano. None of the subjects had a history of neurologic, auditory, or psychiatric disorders.
Blood oxygen level-dependent (BOLD) imaging data were acquired using a 3-tesla whole-body MRI scanner with a standard head coil and a gradient-echo echo-planar imaging (EPI) sequence.
During each 10-minute session of improvisation, known as "trading fours," one musician lay on his back inside the MRI machine with a specially designed plastic piano keyboard resting on his lap and his legs elevated with a cushion. While in the MRI system, the musician could use two mirrors to view where his fingers were on the keyboard.
The participants were asked to use only their right hand during scanning, and they were monitored visually to ensure they did not move their head, body, or other extremities during a performance.
During the session, a second musician played a keyboard in a control room. The two musicians were able to hear each other's performance, along with a prerecorded rhythm section accompaniment over their headphones.
Musical tasks
The researchers used two scenarios to measure brain activity: The first scenario (Scale) assessed brain activity during a minimally complex musical task, while the second scenario (Jazz) examined a more complex musical interaction between the pair.
In the Scale scenario, musicians were assigned two tasks. During the control task (Scale-Control), the two musicians alternated playing a musical scale in quarter notes with their right hand. For the interactive task (Scale-Improv), the musicians took turns improvising four musical measures (trading fours). The musicians were asked to listen to and react to each other's musical improvisations.
There were also two tasks in the Jazz scenario. In the first task (Jazz-Control), the two musicians alternated playing four-measure segments of a jazz composition (written by two of the study authors) that they had memorized before scanning. For the interactive task (Jazz-Improv), both men again traded fours; this time, improvisation was melodically and rhythmically unrestricted, whereas it was more constrained in the Scale tasks.
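The paper does not spell out its analysis code, but for readers curious how a block-design contrast such as Jazz-Improv versus Jazz-Control is typically computed from BOLD data, the short Python sketch below uses the open-source nilearn library. The file names, repetition time, and block onsets and durations are illustrative assumptions, not details taken from the study.

```python
# A minimal sketch (not the authors' actual pipeline) of a first-level
# GLM contrast for a block design like the one described above.
import pandas as pd
from nilearn.glm.first_level import FirstLevelModel

# Hypothetical event table: four-measure turns alternate between the
# two Jazz conditions within a run (onsets and durations are made up).
events = pd.DataFrame({
    "onset":      [0, 12, 24, 36, 48, 60],          # seconds
    "duration":   [12] * 6,
    "trial_type": ["Jazz_Control", "Jazz_Improv"] * 3,
})

# Fit a standard hemodynamic-response GLM to one BOLD run
# (a TR of 2 s is an assumption, not a detail from the paper).
model = FirstLevelModel(t_r=2.0, hrf_model="glover")
model = model.fit("subject01_bold.nii.gz", events=events)

# Contrast improvised vs. memorized playing: positive voxels are more
# active during improvisation; negative voxels reflect deactivation,
# as reported below for the angular gyrus.
z_map = model.compute_contrast("Jazz_Improv - Jazz_Control")
z_map.to_filename("jazz_improv_vs_control_zmap.nii.gz")
```

In studies of this kind, per-subject contrast maps like this one are then combined in a group-level analysis, which is how activation and deactivation patterns across all 11 musicians would be reported.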
'Intense engagement'
In reviewing the fMRI results, the group found that the improvised communication between the two musicians resulted in an "intense engagement" of the brain's left hemispheric cortical areas, which are associated with language. The improvisation activated the left inferior frontal gyrus and left posterior superior temporal gyrus, which are linked to syntactic processing for language.
However, fMRI also showed deactivation during improvisation in the parietal lobe's angular gyrus and supramarginal gyrus, which are involved in the semantic processing of language.
"The correlation between deactivation of the angular gyrus and improvisation may be indicative of the lesser role semantic processing has in moment-to-moment recall and improvisatory musical generation, whereby only musical syntactic information is exchanged and explicit meaning is intangible and possibly superfluous," they wrote.
The results "provide important insights into the neural overlap between music and language processing and support the view that these systems rely in part on a common network of prefrontal and temporal cortical processing areas," Donnay and colleagues added.
The study also offers evidence that the brain's parietal cortex structures, which are associated with semantic processing for language, "are not involved in spontaneous musical exchange, suggesting a fundamental difference between how meaning is conveyed in music and language," they concluded.