The Neural Basis of Human Creativity

UH Professor Jose Luis Contreras-Vidal to Speak on the Nexus of Art and Science at UN AI for Good Global Summit

University of Houston neuroscientist Jose Luis Contreras-Vidal, a pioneer of brain-machine interfaces, will take the stage in Geneva, Switzerland, on May 31 at the United Nations AI for Good Global Summit. Contreras-Vidal is known globally for developing brain-machine interfaces that control wearable exoskeletons for rehabilitation, and for mapping art-evoked brain activity.

He will not, however, simply give a speech.

Contreras-Vidal will bring with him his collaborators at Rice University and Sam Houston State University to present “Meeting of Minds,” a blend of artistic performance and scientific experiment that explores human connection through dance and neuroscience. The dancers wear EEG skull caps (brain caps) while performing, and the UH team records their brainwaves.

Portrait of Jose Luis Contreras-Vidal in gray suit, looking left of camera, smiling.

University of Houston Neuroscientist Jose Luis Contreras-Vidal

“We created an elegant, engaging, and aesthetic approach to observing the creative brain in a dynamic state,” said Contreras-Vidal, UH Hugh Roy and Lillie Cranz Cullen Distinguished Professor of electrical and computer engineering and director of both the UH BRAIN Center and the UH Noninvasive Brain-Machine Interface Systems Lab.

“We focus on reverse engineering the brain and developing new interfaces so that the brain can communicate directly with external devices like robots or exoskeletons or prosthetic devices,” said Contreras-Vidal.

“The arts tell us about pattern recognition, how we associate things, what we remember, what matters to us, what guides our attention,” said Anthony Brandt, professor of composition and theory at Rice’s Shepherd School of Music and artistic director of dance company Musiqa.

Contreras-Vidal received the invitation to speak on emergent brain-computer interfaces at the summit as the result of his pioneering research at the UH BRAIN Center (Building Reliable Advances and Innovations in Neurotechnology) funded by the National Science Foundation as an industry-university cooperative research center.

“The natural extension was to examine what goes on in the brain during expressive movement in a social context,” he said.

Dancer on stage in an artistic move, wearing a brain-machine interface on their head as brainwaves are tracked on a screen in the background.

Behind this dancer, the audience views a representation of live data, indicating the synchrony between the dancers’ brains: red indicates higher synchrony, blue lower. 

To do so, in 2022 Contreras-Vidal started working with Brandt and Sam Houston State University choreographers Andy and Dionne Noble of Noble Motion Dance, one of Texas’ premier dance companies, recognized for their intense physicality and unique collaborations.

Together they created Live Wire, a performance so successful in Houston that it was reprised at the 2022 International Workshop on the Neural and Social Bases of Creative Movement at Wolf Trap Foundation for the Performing Arts in Virginia.

The project’s second iteration, Diabelli 200, involved a conductor and pianist wearing brain caps while performing Brandt’s piece inspired by Beethoven’s “Diabelli Variations.” Brandt said the performance marked the first time a conductor wore full-scale neuroimaging equipment during a live performance. Maxine Annel Pacheco Ramírez and Aime Aguilar-Herrera, UH graduate students in electrical and computer engineering who will be in Geneva for the summit, recorded the mobile brain imaging data.

Then Shepherd School violinist Nanki Chugh, who performs in “Meeting of Minds,” analyzed the data collected during the Diabelli 200 performance and won first place for graduate oral presentation in neuroscience at the 2024 Emerging Researchers National Conference in STEM. Her investigation of the neural dynamics between conductor and pianist was sponsored by a training fellowship in the UH BRAIN Center’s Neuromotor Skill Advancement for Post-baccalaureates program, funded by the National Institutes of Health's Eunice Kennedy Shriver National Institute of Child Health and Human Development.

Multiple dancers move in tandem. The woman at the front of the dance line wears a brain-machine interface cap; the male dancers behind her make the same movement.

As the dancers perform, behind them an abstraction of brainwaves, turned on their side, is projected.

“She looked at cue points to see what was happening in the brains of the pianist and the conductor, and she could see exactly which parts of their brains were synchronized and which parts were working independently,” Brandt said.

“Meeting of Minds” tackles the timely issue of social division. The performance features dancers Lauren Serrano and Tyler Orcutt, who start in conflict and gradually move toward cooperation. The choreography incorporates elements known to trigger neural synchrony, such as eye contact, touch and synchronized movement.

Thanks to projections designed by Shepherd School doctoral candidate Badie Khaleghian that decipher live data from the EEG caps, the audience can watch how the dancers’ brains respond throughout the performance.

This performance, Brandt added, represents a significant step toward studying human behavior in natural settings. Traditional brain imaging techniques often restrict movement and require a sterile environment. Mobile brain imaging, as used in this project, paves the way for studying the brain in more dynamic and ecologically relevant contexts.

Dancer in a brain-machine interface cap, arm raised, on stage with an abstract graphic image in the backdrop.

“Eventually scientists are going to need to be in much more full-fledged humans-being-humans kind of situations,” Brandt said.

“Mobile brain imaging, brain computer interfaces and AI algorithms developed at the BRAIN Center, which allow the investigation of coupled brain activity, dance and music, are part of that pioneering,” Contreras-Vidal said. “The research could lead to the development of personalized art prescriptions based on music-based interventions that neuromodulate brain activity to improve health and wellbeing.”

The May 31 performance of “Meeting of Minds” will be streamed live on the AI for Good Global Summit website. You’ll have to get up early to catch it – it begins at 2 a.m. Central time.

Two dancers in brain-machine interface caps stand center stage with a bright abstract screen behind them. A string quartet plays stage left.

The brain synchrony meter is shown in “Meeting of Minds.”