
Android Audio Plugin: MidiDeviceService

Elevator Pitch

Import existing synth audio plugins to the Android platform and use them as virtual MIDI devices.

Product Description

“Android Audio Plugin” (AAP) is a framework project that brings various existing desktop audio plugins to Android. “AAP MidiDeviceService” is a new applied usage of the framework that exposes those instrument plugins as virtual MIDI devices, designed with MIDI 2.0 in mind from the ground up.

Android lacks a commonly used audio plugin framework. On Windows and other desktops, VSTs are popular; on macOS and iOS there is AudioUnit; on Linux, LV2 is used (and, more recently, VST3). There is no equivalent on Android, and AAP aims to fill this gap.

As a software prototype, I have a handful of LV2 and JUCE audio plugins ported to Android, and they have successfully played simple MIDI notes as Android MidiDeviceServices (via the Android MIDI API), driven by another virtual MIDI keyboard application. (It is still not at a product-ready stage, as there is no way to control audio plugin parameters unless they are internally controlled by MIDI messages.)

Since I am designing the entire framework from scratch, I can decide what kind of content goes in and out. Support for MIDI 2.0 UMPs is also on the table (I once had it working, but I still need to figure out whether it will be activated via MIDI-CI interactions, by assigning MIDI 2.0 plugin buffer channels, etc.).
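As a rough sketch of what the MIDI 2.0 UMP support mentioned above deals with on the wire, the helper below packs a 64-bit MIDI 2.0 channel-voice note-on into two 32-bit words per the UMP format. The class and method names are my own illustration, not part of the AAP API:

```java
// Sketch: packing a MIDI 2.0 UMP note-on (message type 0x4, 64-bit message).
// Word layout per the MIDI 2.0 UMP format:
//   word0: [type:4][group:4][status:4][channel:4][note:8][attributeType:8]
//   word1: [velocity:16][attributeData:16]
public class UmpNoteOn {
    public static int[] pack(int group, int channel, int note,
                             int velocity16, int attrType, int attrData) {
        int word0 = (0x4 << 28) | ((group & 0xF) << 24)
                  | (0x9 << 20) | ((channel & 0xF) << 16)
                  | ((note & 0x7F) << 8) | (attrType & 0xFF);
        int word1 = ((velocity16 & 0xFFFF) << 16) | (attrData & 0xFFFF);
        return new int[] { word0, word1 };
    }

    public static void main(String[] args) {
        // Middle C on group 0, channel 0, full 16-bit velocity, no attribute.
        int[] ump = pack(0, 0, 60, 0xFFFF, 0, 0);
        System.out.printf("%08X %08X%n", ump[0], ump[1]); // 40903C00 FFFF0000
    }
}
```

Note how the note-on status nibble (0x9) and note number survive from MIDI 1.0, while velocity widens from 7 to 16 bits; this is the kind of mapping a plugin buffer channel or MIDI-CI negotiation would have to select between.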


How It’s Innovative

The Android platform simply lacks a commonly accepted audio plugin framework (mine is not one yet either), so designing and implementing one comes with a handful of challenges that most existing frameworks have not faced (though iOS AudioUnit and another plugin framework called AudioRoute SDK have). On mobile platforms like iOS and Android, apps run in separate processes, which makes it simply impossible for host DAWs to load plugins from other vendors in-process. So far AAP makes use of the Android Binder framework and shared-memory audio/MIDI buffers to achieve this (it has to be non-realtime for now). If you are a DAW vendor, you would most likely give up and provide closed in-app plugins instead.

One thing that I believe no other framework attempts (and thus what is really “innovative” in AAP) is how we achieve a plugin extension API like VST-MA, or those in LV2 and CLAP. Such extensibility is as fundamental to AAP as it is to those frameworks, because it is what keeps a framework stable over time. AAP is especially tricky here because we cannot simply provide in-process extensibility; e.g. we cannot invoke callback function pointers across the plugin and host processes.

On the business side, adopting a new audio plugin framework is almost impossible for DAW and plugin developers, especially when the framework is still a moving target. AAP comes with the idea that production stability can be achieved through stable development tools and existing APIs, such as JUCE and LV2 (I believe the CLAP audio plugin format takes a similar approach). Beyond that, exposing the sound generators as virtual MIDI devices is most likely stable: even if a future AAP becomes incompatible at the binary messaging level, the existing MidiDeviceServices would still work. That is why I went ahead and implemented MIDI device service support. (The idea of using VST plugins as virtual MIDI devices has appeared on Windows before, but here I am using the idea to get an audio plugin framework adopted.)
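The process-separation constraint described above can be pictured, very loosely and in a single process, as a host and a plugin agreeing on a shared buffer rather than calling each other's functions. On Android the buffer would be shared memory negotiated over a Binder connection, and the "plugin" would live in another app's process; all names below are illustrative, not AAP API:

```java
import java.nio.ByteBuffer;
import java.nio.ByteOrder;

// Single-process analogy of AAP's shared-buffer exchange: the "host" writes
// input samples into a buffer both sides can see, hands control to the
// "plugin", and reads the processed result back. In the real framework the
// buffer is shared memory and the handoff is a Binder transaction, because
// host and plugin cannot share function pointers across processes.
public class SharedBufferDemo {
    static final int FRAMES = 4;

    // Stand-in for the remote plugin process: it only ever touches the
    // shared buffer it was handed at connection time.
    static void pluginProcess(ByteBuffer audio) {
        for (int i = 0; i < FRAMES; i++) {
            float in = audio.getFloat(i * 4);
            audio.putFloat(i * 4, in * 0.5f); // a trivial "gain" plugin
        }
    }

    public static void main(String[] args) {
        ByteBuffer audio = ByteBuffer.allocateDirect(FRAMES * 4)
                                     .order(ByteOrder.nativeOrder());
        // Host side: fill the buffer, then signal the plugin to run.
        for (int i = 0; i < FRAMES; i++) audio.putFloat(i * 4, 1.0f);
        pluginProcess(audio);
        for (int i = 0; i < FRAMES; i++)
            System.out.println(audio.getFloat(i * 4)); // prints 0.5 four times
    }
}
```

The design consequence is exactly the extension-API problem noted above: anything the two sides exchange must be expressible as data in the shared buffer (or as Binder messages), never as a direct callback.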

See MIDI Innovation In Action

Expansion Plans

Commercialization