The JAM_BOT is the result of a collaboration between the MIT Media Lab and GRAMMY-winning artist Jordan Rudess: a real-time, AI-powered system that enables collaborative musical improvisation with music language models. To demonstrate the technology, we organized a concert in September 2024 in which Rudess improvised several pieces with custom-trained models that mimic his distinctive improvisational style. Performers interact with the JAM_BOT through a MIDI controller, and it responds with MIDI data that is then synthesized. Because MIDI is a symbolic representation of music, the JAM_BOT can accurately model the melodic, harmonic, and rhythmic intricacies of real-time musical input and generate convincing output on the fly. To integrate the JAM_BOT into the stage performance, we developed a visualization system that communicates the model's generations to a live audience: it takes MIDI input and renders live animations that anticipate the notes the JAM_BOT is about to play.
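To make the interaction loop concrete, the sketch below shows one minimal way such a call-and-response MIDI pipeline could be structured in Python using the `mido` library. It is an illustration under stated assumptions, not the JAM_BOT's actual implementation: `generate_continuation` is a hypothetical placeholder for the custom-trained music language model, and the phrase length and timing are arbitrary.

```python
import time
import mido

def generate_continuation(history):
    # Hypothetical placeholder for the music language model:
    # given the recent input notes, return notes for the bot to play.
    # Here we simply echo the last phrase transposed up a fifth.
    return [(note + 7) % 128 for note in history[-4:]]

def jam_loop(in_port_name, out_port_name, phrase_len=8):
    history = []
    with mido.open_input(in_port_name) as inp, \
         mido.open_output(out_port_name) as out:
        for msg in inp:  # blocks until the performer plays
            if msg.type == 'note_on' and msg.velocity > 0:
                history.append(msg.note)
                # After each phrase, answer with a generated response.
                if len(history) % phrase_len == 0:
                    for note in generate_continuation(history):
                        out.send(mido.Message('note_on', note=note, velocity=80))
                        time.sleep(0.2)  # crude fixed rhythm for this sketch
                        out.send(mido.Message('note_off', note=note))
```

Port names for a connected controller can be listed with `mido.get_input_names()` and `mido.get_output_names()`. In the system described above, the generated MIDI would additionally be streamed to the synthesizer and to the visualization layer, both of which are omitted from this sketch.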