The placid chords of a Debussy prelude splashed through a darkened auditorium during a recital by the pianist Nicolas Namoradze at the University of California, San Francisco, on a November evening.
A translucent image of Namoradze’s brain appeared above him on a screen: Electrical oscillations of different frequencies, associated with varying levels of alertness, registered as colorful activity coursing through the model like storm fronts on a weather map. With each chord, clouds of green and blue bloomed, then faded as the sound receded. As the recital progressed with works by Bach, Beethoven and Scriabin, the image of the gently rotating brain showed a complex choreography of signals that sometimes ping-ponged between different areas or flickered simultaneously across the organ’s hemispheres.
As a visual spectacle accompanying Namoradze’s pellucid playing, it was mesmerizing: an X-ray, seemingly, of virtuosity at work.
But to the scientists in the audience, attendees at a conference on the neuroscience of music and dance, it was more than entertainment. It was evidence of a breakthrough in experiment design — one that opens up possibilities in an area that has long eluded scientific study: how music activates the brain, not in listeners, but in performers. It was also a reminder of the value artists can bring to scientific inquiry as active participants shaping studies of their craft.
The neuroscientist Theodore Zanto, a member of the Neuroscape lab at U.C.S.F. that created the “Glass Brain” animations, said in an interview the next day that he was surprised — and moved — by the result. “It’s probably the cleanest real-time representation of what’s happening inside the brain during a piano performance,” he said.
Scientists have long been drawn to music as a window onto the brain because it compresses so many human capacities into a single activity. It involves at once perception, movement, memory, attention and emotion. It unfolds over time. And it requires constant prediction and adjustment. Rhythm, in particular, has become a focus of research because of its links to language development, motor coordination and brain health.
Yet one central question has remained out of reach: What happens in a performer’s brain while playing? Traditional brain-imaging tools like functional magnetic resonance imaging (fMRI) require subjects to lie motionless in a scanner. Newer wearable technologies, including EEG (electroencephalography) caps fitted with electrodes, make it possible to study musicians in more natural settings. But capturing meaningful data requires dozens of repetitions of the same performance, aligned in time to the millisecond.
This is where Namoradze, 33, an award-winning pianist who obtained a degree in neuropsychology during the pandemic years, contributed a breakthrough in methodology. The result is imaging of startling clarity, as well as a host of new questions for scientists to consider.
Namoradze is not the first musician to turn to the Glass Brain, which was developed to monitor and steer certain cognitive functions in closed-loop video games. The drummer Mickey Hart of the Grateful Dead has used it to improvise live in settings like the Sphere in Las Vegas and the Hayden Planetarium in New York. Namoradze contacted the Neuroscape team hoping to record videos visualizing how his brain activity changed as he played works with different moods and structures.
Addressing a conference of researchers the day of his recital in San Francisco, Namoradze recalled the first, sobering reaction from Zanto. “This is a kind of pop-science project,” Namoradze said Zanto told him. “Fun, but not real research.”
From an experimental standpoint, the difficulty would not be in rendering the brain activity of a pianist in action. Rather, obtaining data that would support real research meant separating processes related to the music making from the jumble of electrical currents caused by other things, like digestion. To extract the signal from the noise, researchers would have to capture multiple EEG readings of Namoradze playing the same piece so that meaningful neural signals could emerge. And those measurements would need near-perfect alignment across time.
Andrea Protzner, a neuroscientist at the University of Calgary who eventually collected Namoradze’s data, said in a phone interview that the required precision is the reason EEG studies have so far centered on music listening in the lab. “EEG is precise to the millisecond,” she said. Only that level of precision allows event-related signals — in this case, activity related to the music — to stand out clearly from the noise of accidental muscle movements and other stimuli.
“That’s easy with listeners, who can hear the same recording over and over,” she said. “With a performer, it’s incredibly hard.”
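Protzner’s insistence on millisecond alignment comes down to signal averaging: when repetitions are time-locked precisely, the event-related signal adds up across trials while uncorrelated noise cancels out. A minimal sketch in Python (with invented numbers, not real EEG data) of why repetition helps:

```python
import numpy as np

rng = np.random.default_rng(0)

# A hypothetical "neural signal": the same waveform evoked on every trial.
t = np.linspace(0, 1, 500)
signal = np.sin(2 * np.pi * 5 * t)  # a 5 Hz event-related component

# Each repetition adds independent noise several times larger than the signal.
n_trials = 50
trials = signal + rng.normal(scale=3.0, size=(n_trials, t.size))

# Averaging time-locked repetitions preserves the signal but shrinks the
# noise by roughly the square root of the number of trials.
average = trials.mean(axis=0)

def snr(x):
    """Ratio of signal strength to residual noise strength."""
    noise = x - signal
    return signal.std() / noise.std()

print(f"single-trial SNR: {snr(trials[0]):.2f}")
print(f"{n_trials}-trial average SNR: {snr(average):.2f}")
```

If the trials were even slightly misaligned in time, the signal itself would start to cancel along with the noise, which is why a performer who cannot reproduce a piece identically defeats the method.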
At first, Namoradze considered studying his brain while listening to a recording of himself, or while visualizing himself performing. But then he alighted on a solution to the reproducibility problem that was at his fingertips.
A Steinway Spirio player piano can capture every detail of a performance and reproduce it, keystroke for keystroke. In Protzner’s lab, Namoradze donned an EEG cap as he recorded his program and then played through it multiple times, essentially “finger-syncing” to the piano’s playback of his recording.
“By the end I would actually forget I wasn’t playing,” Namoradze said. “My muscles were doing the same thing; I was hearing the same thing. I was able to embody my own ghost.”
Namoradze’s project is part of a deepening partnership between scientists and musicians to study performance in conditions closer to real life; artists help formulate the questions and often design the study as educational theater. In one example, the composer Anthony Brandt of Rice University collaborated with José Luis Contreras-Vidal, a neuroscientist at the University of Houston, to bring EEG imaging onto the concert stage, outfitting a pianist and a conductor with brain sensors during a live performance. The results revealed both synchrony and divergence between their brains — patterns that matched their distinct musical roles.
“The planning center of the conductor was working independently of the pianist,” Brandt said, while the pianist’s motor regions followed a different timetable.
Performance-based experiments like this pose challenges for scientific control, Brandt acknowledged, but he argues that the alternative can be an anemic shadow of the real-life practice: Pianists asked to improvise on small plastic keyboards inside an fMRI scanner, for example, or improvisation studies in which participants tap out rhythms for 15 seconds at a time. “Scientists often want the music to be well behaved,” he said. “But music isn’t designed to be tame. It’s meant to be a representation of human expression in all its full-bodied glory.”
Namoradze conceived his neurorecital as a concert lecture pairing live performance with brain visualizations generated from his laboratory sessions in Calgary. During the recital in San Francisco, he paused the video to describe what he believed he was seeing: gentle showers of activity in Debussy, complex coordination across different regions in Bach, bursts of motion between planning and execution in Beethoven. In the music of Scriabin, he pointed to a noticeable increase in activity in the occipital lobe where vision is processed. Could it be, he wondered, that the composer’s synesthesia, which made him associate sounds with colors, had somehow become encoded in the sonata?
Many of Namoradze’s thoughts remain conjecture pending further research. But the questions he asks arise from an intimate involvement with the music. For scientists, they are prompts. “He is quite literally generating hypotheses for us,” Zanto said after the recital.
“Do I think there was synesthesia involved?” asked Protzner. “No. But was it more associated with colors for him than the other pieces? Definitely.”
Distinguishing interpretation from evidence, researchers say, will require comparative studies — of other pianists playing the same repertoire, or of listeners’ brains measured alongside performers’. As ingenious as Namoradze’s Spirio hack is, it may not work for every player. Most amateurs would struggle to maintain his degree of accuracy, even finger-syncing to a player piano. And many concert pianists are far too physically animated to take part in EEG studies in which the data is contaminated by the slightest head movement. (“Everybody was shocked at how still he played,” Protzner said, recalling Namoradze’s hours performing in her lab.)
But as Zanto and his team prepare to sift through the data generated by the neurorecital and begin to write the study up for scientific publication, Namoradze is already setting his sights on the bigger picture. There’s talk at Neuroscape of building a “Glass Body” using measurements of some dozen physiological parameters — including heart rate, skin conductivity and digestion — to create animated models of the swirling activity inside a particular human across time. Parsing the vast data sets such modeling requires is still a bit of a dream, Zanto said.
But once that technology exists Namoradze will be ready to return to the lab. He hopes, he said, that a Glass Body would make visible the conversation among brain, hands and feet that turns musical intention into motion.
The post What Happens in a Performer’s Brain While Playing Music? appeared first on New York Times.