The JAM_BOT

Submitted By: Perry Naseck & Lancelot Blanchard

Elevator Pitch

The JAM_BOT

Product Description

The JAM_BOT is the result of a collaboration between the MIT Media Lab and GRAMMY-winning artist Jordan Rudess. It is a real-time, AI-powered system that enables collaborative musical improvisation with music language models. To demonstrate the technology, we organized a concert in September 2024 in which Rudess engaged in a series of improvised pieces with custom-trained models that mimic his unique improvisational style. The JAM_BOT is played through the familiar interface of a MIDI controller, and it generates MIDI data that can then be synthesized. By using MIDI as a symbolic representation of music, the JAM_BOT can accurately model the melodic, harmonic, and rhythmic intricacies of real-time musical input and generate convincing output on the fly. To integrate the JAM_BOT into the stage performance, we developed a visualization system that communicates the model's generations to a live audience: it takes MIDI input and creates live animations that anticipate the notes the JAM_BOT is about to play.
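As a rough illustration of what a symbolic MIDI representation can look like, the sketch below encodes note events into a token sequence of the kind a music language model could consume in real time. The token scheme, class names, and velocity bucketing here are our own illustrative assumptions, not the JAM_BOT's actual format.

```python
from dataclasses import dataclass

@dataclass
class NoteEvent:
    pitch: int        # MIDI note number, 0-127
    velocity: int     # 0-127
    delta_ticks: int  # time elapsed since the previous event

def tokenize(events):
    """Flatten note events into symbolic tokens: a time shift,
    a pitch, and a coarse velocity bucket per event.
    (Hypothetical scheme for illustration only.)"""
    tokens = []
    for e in events:
        tokens.append(f"TIME_{e.delta_ticks}")
        tokens.append(f"NOTE_{e.pitch}")
        tokens.append(f"VEL_{e.velocity // 16}")  # quantize 0-127 into 8 buckets
    return tokens

# A short C-major arpeggio as a token stream:
phrase = [NoteEvent(60, 96, 0), NoteEvent(64, 80, 120), NoteEvent(67, 90, 120)]
print(tokenize(phrase))
# → ['TIME_0', 'NOTE_60', 'VEL_6', 'TIME_120', 'NOTE_64', 'VEL_5',
#    'TIME_120', 'NOTE_67', 'VEL_5']
```

Representations along these lines let a model treat live playing as a text-like stream, which is what makes language-model-style generation over MIDI feasible.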


How It’s Innovative

To the best of our knowledge, no other real-time system has enabled the use of generative AI models in live music performances at the level at which the JAM_BOT operates. For the collaboration to succeed, Jordan Rudess needed the JAM_BOT to properly understand his musical input and to generate outputs that were both coherent and interesting, and that kept the improvisation going.

See MIDI Innovation In Action

Most Inspiring Use Cases

We believe that our technology can revolutionize live music performances by integrating generative AI in a way that is both engaging for artists and entertaining for audiences. We are especially interested in finding novel applications and filling the “negative space” that exists in the domain of live music. For example, future iterations of the JAM_BOT will enable a performer to improvise with a symphony orchestra that responds in real time to the performer’s musical gestures, an endeavor hardly possible when improvising with multiple human performers.

Expansion Plans

Jordan Rudess will be a Distinguished Visiting Artist at the MIT Media Lab beginning in September 2025. Through this renewed collaboration, we will explore how the JAM_BOT can generate expressive MIDI data (including velocity, pitch bends, and diverse continuous controllers) and offer finer stylistic control. We will also explore more dynamic and intelligent visualizations of the musical generations.
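To give a sense of the expressive data involved: a MIDI pitch-bend message carries a 14-bit value split across two 7-bit data bytes, LSB first, with 8192 as the no-bend center. The sketch below is a minimal encoder; the function name and interface are our own, but the wire format follows the MIDI 1.0 specification.

```python
def pitch_bend_bytes(channel: int, bend: int) -> bytes:
    """Encode a MIDI pitch-bend message for `channel` (0-15).
    `bend` ranges from -8192 to 8191, with 0 meaning no bend."""
    value = bend + 8192                  # shift to the 0..16383 wire range
    if not 0 <= value <= 16383:
        raise ValueError("bend out of range")
    lsb = value & 0x7F                   # low 7 bits
    msb = (value >> 7) & 0x7F            # high 7 bits
    return bytes([0xE0 | (channel & 0x0F), lsb, msb])

# No bend on channel 0: status 0xE0, then 0x00 0x40 (center = 8192)
print(pitch_bend_bytes(0, 0).hex())  # → e00040
```

Generating this kind of continuous control data, rather than note on/off events alone, is what makes model output sound expressive rather than mechanical.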

Commercialization

This project is ongoing academic research with no direct plans for commercialization. However, we believe the technology could be commercialized in various forms: as a solution for AI-augmented live music performances, as an AI plug-in integrated into keyboards for improvisation practice, as software for learning and practicing ear training and improvisation, and as plug-ins for composition. We have filed a provisional patent application for the technology.