Accessible Music Making Takes Center Stage at ABLE Assembly 2026


At Berklee College of Music, educators, artists, researchers, and music technology innovators came together to explore how accessible design can make music participation possible for more people.

The 2026 ABLE Assembly: Arts Better the Lives of Everyone, hosted by the Berklee Institute for Accessible Arts Education, brought together a remarkable community of people working at the intersection of music, education, disability, technology, and creativity.

Gillian Keller and Kate Rounding stand indoors, smiling. One wears a gray blazer and skirt with a red lanyard and name badge; the other wears a black polo shirt with a TIME logo, holding a cup and also wearing a red lanyard.

For The MIDI Association, the event was an important reminder that accessibility is not a separate category of music technology. It is central to the future of music making. When instruments, controllers, software, and educational practices are designed to include more people from the beginning, the entire music ecosystem benefits.

A Weekend Focused on Access, Inclusion, and Creativity

The ABLE Assembly schedule included keynote presentations, workshops, breakout sessions, field trips, networking, synthesis sessions, and a public evening performance. Sessions explored topics including universal design for learning, adaptive music instruction, accessible ensembles, trauma-informed education, inclusive arts programming, adaptive piano education, accessible woodwind instruments, tactile representation, AI-powered learning tools, neurodiversity-affirming arts education, and the role of technology in inclusive music education.

Throughout the weekend, the emphasis was practical and human-centered: how can educators, makers, musicians, and institutions remove barriers so that more people can participate fully in music and the arts?

That question is deeply connected to the work of The MIDI Association’s Music Accessibility Special Interest Group, which is focused on how MIDI and related music technologies can support accessible instruments, adaptive controllers, inclusive learning environments, and new forms of creative expression.

The SPOKE Hackathon: Building Accessible Instruments Through Touch

One of the most hands-on parts of the weekend was the maker space and hackathon activity using SPOKE, a simplified touch-sensing platform for computer interaction and music creation.

SPOKE allows people to create touch-based controllers that can communicate with computers as USB-MIDI devices, keyboard and mouse inputs, or serial devices. The board includes multiple touch inputs and is designed to make it easier for educators, artists, musicians, and makers to build custom interactive projects without needing to start from scratch.

For musicians, SPOKE can be used to build custom MIDI instruments and controllers. Touch inputs can be connected to everyday conductive materials such as metal objects, fruit, conductive fabric, conductive ink, graphite pencil drawings, or other creative surfaces. That makes it a powerful platform for exploring accessible instrument design because the “instrument” can be shaped around the player, rather than forcing the player to adapt to a conventional instrument interface.
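To make the idea concrete, here is a minimal sketch of how a touch-pad controller like SPOKE might translate touch events into MIDI. SPOKE's actual firmware and API are not shown here; the pad numbering, the pentatonic note layout, and the `pad_to_message` helper are all illustrative assumptions, but the 3-byte note-on/note-off message format is standard MIDI.

```python
def note_on(note, velocity=100, channel=0):
    """Build a standard 3-byte MIDI note-on message."""
    return bytes([0x90 | (channel & 0x0F), note & 0x7F, velocity & 0x7F])

def note_off(note, channel=0):
    """Build a standard 3-byte MIDI note-off message (velocity 0)."""
    return bytes([0x80 | (channel & 0x0F), note & 0x7F, 0])

# Hypothetical layout: each touch pad plays a note from a C major
# pentatonic scale, so any conductive object wired to a pad sounds
# consonant with the others -- a common choice for accessible designs.
SCALE = [60, 62, 64, 67, 69]

def pad_to_message(pad_index, touched):
    """Map a pad touch/release event to a MIDI message (hypothetical helper)."""
    note = SCALE[pad_index % len(SCALE)]
    return note_on(note) if touched else note_off(note)
```

Because the mapping lives in a few lines of code rather than in the physical instrument, the note layout, velocity, and even the scale can be reshaped around an individual player.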

A person with long curly hair and a full beard smiles gently at the camera, wearing a mustard yellow sweater. The background is plain and light-colored.

Tim Yates did a brilliant job of leading the Hackathon.

Tim is Research and Innovation Executive at Drake Music, responsible for accessible music technology and instrument development. He is committed to breaking down the barriers to music making faced by Disabled musicians by ensuring that everyone has access to the instruments and technology they need. He is also undertaking a PhD in immersive, interactive sound-art installations using spatial audio at the University of Greenwich. He is a musician, sound-artist, and technologist, and co-founder and co-organiser of Hackoustic.

Why This Matters for MIDI

MIDI has always been about connection. It connects instruments to computers, controllers to software, performers to sound engines, and creators to tools. In accessible music technology, that same idea becomes even more important: MIDI can help connect a person’s unique movement, touch, gesture, or communication style to musical expression.

Touch-based controllers like SPOKE show how MIDI can support flexible and personalized instrument design. A banana, a metal spoon, a hand-drawn graphite pad, a fabric surface, or a custom 3D-printed object can become part of a musical interface. For some musicians, that flexibility is not simply a novelty. It can be the difference between exclusion and participation.

As MIDI 2.0, MIDI-CI, Profiles, Property Exchange, and future accessibility-focused tools continue to develop, events like ABLE Assembly provide essential real-world context. The most important questions are not only technical. They are also educational, artistic, and social:

  • Can a music controller adapt to the person using it?
  • Can educational tools support many different bodies, learning styles, and communication needs?
  • Can accessible instruments be easier to discover, configure, and share?
  • Can standards help reduce barriers rather than create new ones?
  • Can music technology make participation more joyful, expressive, and inclusive?
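The first of those questions, whether a controller can adapt to the person using it, can be sketched in code. The `PlayerProfile` class and `adapt_velocity` helper below are hypothetical, not part of any existing standard or product; they simply illustrate how a per-player profile might rescale touch sensitivity so that a light touch still produces a full musical result.

```python
from dataclasses import dataclass

@dataclass
class PlayerProfile:
    """Hypothetical per-player settings for an adaptive controller."""
    min_velocity: int = 64            # floor so light touches still sound
    notes: tuple = (60, 62, 64, 67, 69)  # this player's note layout

def adapt_velocity(raw, profile):
    """Rescale a raw 0-127 touch reading into the player's velocity range."""
    span = 127 - profile.min_velocity
    return profile.min_velocity + raw * span // 127
```

A profile like this could be stored per player and swapped at setup time, which is the same kind of device-level personalization that MIDI-CI Profiles and Property Exchange are designed to negotiate automatically.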

ABLE Assembly Highlights

Kate, Athan Billias and Gillian stand together smiling, wearing red lanyards and event badges. The woman on the left is in a black shirt, the man in the center is in a black t-shirt and jacket, and the woman on the right is in a gray suit.

The 2026 program reflected the breadth of the accessible arts education community. Saturday sessions included workshops and discussions on culturally responsive music centers, inclusive school ensembles, assistive technology, adaptive music students, accessible programming, allyship, neurodiverse learners, and the role of technology in music education.

Sunday continued with sessions on universal design in rhythm instruction, accessible ukulele instruction, adaptive piano education, accessible music practice, accessible drum lessons, adapted woodwind instruments, inclusive choirs, Asian-led inclusive music ensembles, and AI-powered tools for accessible wind instrument learning.

The event also included dedicated networking time, synthesis sessions, and opportunities for attendees to share ideas emerging from the maker space. That combination of research, teaching practice, technology, performance, and community-building is what makes ABLE Assembly especially valuable.

From Accessible Design to Musical Possibility

The best accessible music technology does more than solve a technical problem. It opens creative possibilities. It allows a student to participate in an ensemble, a performer to develop a personal instrument, a teacher to reach more learners, and a community to rethink what music making can look like.

That is why The MIDI Association is committed to supporting conversations around music accessibility. The future of MIDI is not only about higher resolution, better timing, richer data, or more powerful devices. It is also about making sure that the tools we build can serve a wider and more diverse community of music makers.

Gillian Keller and Andrew Landsley wearing event lanyards smile and pose together in a room with tables and people. A large screen behind them displays the Berklee logo. The atmosphere is lively, with stage lights and decorations visible.

ABLE Assembly 2026 showed that accessibility is a powerful source of innovation. When educators, artists, researchers, developers, and musicians work together, new kinds of instruments and new kinds of musical experiences become possible.

That spirit is at the heart of MIDI: connection, creativity, interoperability, and the belief that music technology should help more people make music.

Learn More