Brainwaves Across Continents: Inclusive AI-Driven Telematic Music at ICMC 2025
- michaelculture

- Dec 19, 2025
- 2 min read

In May 2025, in preparation for its demo presentation at the International Computer Music Conference (ICMC) 2025 in Boston the following month, ShareMusic & Performing Arts reached a new milestone in its exploration of multisensory, participant-centred creativity. In an intercontinental workshop session linking musicians in Brazil, Sweden, and the UK, Somax – IRCAM’s environment for AI-driven co-creative musical performance – was connected to JackTrip, a tool developed at Stanford's Center for Computer Research in Music and Acoustics (CCRMA) for ultra-low-latency, high-quality audio streaming over the internet. A rough sketch of how such a connection is launched appears below.
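For the technically curious, a connection of this kind is typically made by running each site's JackTrip client against a shared hub server. The sketch below is purely illustrative: it launches a stock JackTrip hub client from Python, and the hostname is a placeholder, not the server actually used in the workshop.

```python
# Illustrative sketch: launching a JackTrip hub client from Python.
# "hub.example.org" is a placeholder hostname, not the workshop's server.
import subprocess

HUB_HOST = "hub.example.org"  # placeholder: shared hub server

# -C <host> = hub-client mode, -n 2 = stereo, -q 4 = jitter-buffer length
proc = subprocess.Popen(
    ["jacktrip", "-C", HUB_HOST, "-n", "2", "-q", "4"],
    stdout=subprocess.PIPE,
    stderr=subprocess.STDOUT,
)
```

Each remote musician runs an equivalent client, and the hub patches the audio streams between participants with latencies low enough for joint performance.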
Alongside this networked audio link, music was also generated directly from biosensors: in this case, an electroencephalogram (EEG), which detects the electrical activity of the brain.
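To make the idea concrete, here is a minimal sketch – not X-system's actual method – of how EEG band power can be turned into musical control data. The sampling rate, frequency bands, and pitch mapping are all illustrative assumptions.

```python
# Toy EEG-to-music mapping (illustrative; not X-system's actual algorithm).
import numpy as np
from scipy.signal import welch

FS = 256  # assumed EEG sampling rate in Hz

def band_power(window: np.ndarray, lo: float, hi: float) -> float:
    """Mean spectral power of one EEG window in the [lo, hi] Hz band."""
    freqs, psd = welch(window, fs=FS, nperseg=min(len(window), FS))
    mask = (freqs >= lo) & (freqs <= hi)
    return float(np.mean(psd[mask]))

def eeg_to_control(window: np.ndarray) -> tuple[int, int]:
    """Map relative alpha/beta power to a MIDI-style pitch and velocity."""
    alpha = band_power(window, 8.0, 12.0)   # alpha band, linked to relaxation
    beta = band_power(window, 13.0, 30.0)   # beta band, linked to engagement
    ratio = beta / (alpha + beta + 1e-12)   # 0..1; higher = more "active"
    pitch = int(48 + ratio * 24)            # arbitrary C3..C5 mapping
    velocity = int(40 + ratio * 80)
    return pitch, velocity

if __name__ == "__main__":
    window = np.random.randn(FS * 2)        # 2 s of synthetic "EEG"
    print(eeg_to_control(window))
```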
This performance was part of the MuseIT project, in which ShareMusic & Performing Arts is a partner institution, and it marked a significant step forward in integrating computational neuroscience into telematic music-making. By allowing physiological signals – in particular, real-time emotional responses – to shape AI-driven interaction, the workshop expanded the possibilities for inclusive collaboration. Musicians were able to perform not only through their instruments, including electronic interfaces, but also through their brain activity alone, treating it as a new form of musical expression.
At the centre of this session were three performers: frequent ShareMusic collaborator Ewe Larsson, who performed electronic sounds via an interface; flutist Cássia Carrascoza, whose acoustic performance added timbral richness; and John Turner, a researcher with X-system, who wore the EEG device and thus produced music via Somax entirely through his brain activity. Turner's brainwaves were first sonified using the X-system tools and then fed into Somax, which generated musical phrases from a pre-trained corpus of saxophone sounds; this hand-off is sketched below. Together, the three demonstrated how diverse performance modalities – playing an instrument, playing an interface, and participating simply through listening and thinking – could be brought together on the MuseIT platform to produce a responsive musical dialogue.
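The hand-off from sonified brain data to the AI improviser can be pictured as a stream of note events sent over OSC, the protocol Somax's server uses for communication. The sketch below is a hypothetical bridge: the port number and the "/influence" address pattern are placeholders, not Somax's documented API.

```python
# Hypothetical bridge: forwarding EEG-derived note events to a Somax-style
# server over OSC. Port and address pattern are placeholders.
from pythonosc.udp_client import SimpleUDPClient

SOMAX_HOST = "127.0.0.1"  # assumption: the server runs on the same machine
SOMAX_PORT = 9000         # placeholder port

client = SimpleUDPClient(SOMAX_HOST, SOMAX_PORT)

def send_influence(pitch: int, velocity: int) -> None:
    """Send one pitch/velocity event to steer the agent, which answers with
    phrases drawn from its pre-trained saxophone corpus."""
    client.send_message("/influence", [pitch, velocity])  # placeholder address

# e.g. feed it the output of the EEG mapping sketched earlier
send_influence(60, 90)
```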
Behind the scenes, the workshop benefited from close collaboration between performers, researchers, and technologists, informed by a series of participative workshops that ensured the responsiveness of the system and the smooth integration of biosensor data into networked performance. The result was an engaging, accessible performance that revealed the potential of multisensory co-creation across continents.
Looking ahead, the community of performers and researchers collaborating with ShareMusic aims to continue expanding this approach by incorporating additional forms of biosensing and by refining inclusive, multisensory, emotion-driven interaction, in particular through visual and haptic (tactile) representations of emotion. With the future integration of X-system's tools for EEG-based musical selection – i.e., choosing music from a database according to its emotional characteristics – performers will gain greater ability to shape performances through their emotions, broadening the expressive palette available to artists of all abilities.
The demo presented at ICMC 2025 points towards a future in which inclusive online music-making is enriched not only by network technologies and AI but also by the intimate, embodied signals of the performers themselves. As the MuseIT platform for online co-creativity continues to evolve, ShareMusic remains committed to employing these innovative tools to foster creativity, accessibility, and expressivity across diverse communities of practice, with Disabled musicians at the centre of these activities.