
Audio Developers Conference 2024

Announcing Audio Developers Conference 2024

The 10th Audio Developer Conference (ADC) returns in 2024, both in person and online, in a new city: Bristol, UK.

The in-person and online hybrid conference will take place 11-13 November 2024.

Talks at ADC range from audio research, to professional practices, to standards in audio development, as well as talks about application areas and career development. Experimental projects are welcome.


Topics include, but are not limited to:

  • Digital Signal Processing
  • Audio synthesis and analysis
  • Music technology, DAWs, audio plug-ins
  • Game audio
  • 3D and VR/AR audio
  • Creative coding
  • Other applications of audio programming (e.g. telecommunications, multimedia, medicine, biology)
  • Design and evaluation of audio software and hardware systems
  • Programming languages used for audio development (e.g. C++, Rust, Python)
  • Software development tools, techniques, and processes
  • Performance, optimisation, and parallelisation
  • Audio development on mobile platforms
  • Embedded, Linux, and bare metal audio programming
  • Low-latency and real-time programming
  • Best practices in audio programming
  • Testing and QA
  • Educational approaches and tools for audio and DSP programming
  • Planning and navigating a career as an audio developer
  • Other relevant topics likely to be of interest to the ADC audience

The MIDI Association and ADC

The MIDI Association is a community sponsor of Audio Developers Conference and several MIDI Association companies are paid sponsors of the event.

ADC is run by PACE Anti-Piracy Inc., an industry leader in the development of robust protection products, and flexible licensing management solutions. PACE acquired JUCE a few years ago.

JUCE is the most widely used framework for audio application and plug-in development. It is an open source C++ codebase that can be used to create standalone software on Windows, macOS, Linux, iOS and Android, as well as VST, VST3, AU, AUv3, AAX and LV2 plug-ins.

Several MIDI Association companies are sponsors of Audio Developers Conference, including Gold Sponsors JUCE and Focusrite, Silver Sponsor Avid, and Bronze Sponsor Steinberg.

Franz Detro from Native Instruments will give a talk introducing ni-midi2, a modern C++ library implementing MIDI 2.0 UMP 1.1 and MIDI-CI 1.2, and will also cover the open source software available on MIDI2.dev.


Franz is on The MIDI Association Technical Standards Board and is an active participant in the MIDI 2.0 Working Group and the DAW Working Group, which is working on open source software for supporting plug-in formats including Apple Audio Units, Avid Audio eXtension, Bitwig CLever Audio Plug-in, and Steinberg VST3.

He will be available at ADC to answer questions about MIDI 2.0 and The MIDI Association. The Interactive Audio Special Interest Group (Tom Poole from JUCE is on the IASIG steering committee) will hold an online meeting during ADC.

The MIDI Association will have both in-person and online representation at ADC and ADCx Gather.

The ADCx Gather one-day online event is free and open to everyone in the audio developer community (registration required). ADCx Gather is hosted in Gather.Town, a browser-based virtual online platform that allows attendees to interact and collaborate in real-time.

ADCx Gather takes place on the 1st of November starting at Noon UTC.

Registration for this event has not yet started. Sign up for the ADC newsletter to be the first to know when ADCx Gather attendee registrations begin.

Tickets for ADC

Several ticket types are available for ADC: in-person tickets for corporate, individual, and academic attendees, plus the same categories for online participation.

Building a USB MIDI 2.0 Device – Part 3

This is the third part in this series on how developers go about building a USB MIDI 2.0 Device. Please read Part 1 and Part 2 before reading Part 3.

In Part 2 we looked at handling Function Blocks, Group Terminal Blocks, and UMP Discovery.

In this third guide we will cover:

  • Using a USB Descriptor Builder to avoid mistakes
  • Interfaces and Interface Association Descriptors in USB
  • Why having separate MIDI 1.0 mode may be useful
  • OS Caching and Firmware updates that affect your USB Descriptors

Using a USB Descriptor Builder to avoid mistakes

USB Descriptors can be difficult to get right. Common issues include:

  • Incorrect descriptor lengths
  • Difficulties when creating a USB Endpoint with multiple MIDI 1.0 Ports
  • Confusion in mapping USB Endpoint In/Out to Embedded In/Out to External In/Out Descriptors
  • IAD creation

To make this much easier, members of the MIDI Association and MIDI2.dev have created a MIDI USB Descriptor Builder. It allows you to create the Descriptors needed for a MIDI 1.0/2.0 USB Device. Although the Descriptor Builder is tailored to output descriptors for use with the tinyUSB tusb_ump driver (https://github.com/midi2-dev/tusb_ump) or the Linux Gadget interface, the generated code should be easily adaptable to other USB stack implementations.
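As one illustration of the kind of mistake a builder prevents, here is a hypothetical helper (not part of the Descriptor Builder's output) that checks a class-specific MS interface header's wTotalLength against the descriptors that follow it:

```cpp
#include <cstddef>
#include <cstdint>
#include <vector>

// Hypothetical helper: walk a descriptor set by each bLength field and
// check that the class-specific MS interface header's wTotalLength
// (little-endian, bytes 5..6 of the 7-byte header) matches the combined
// length of the header and the class-specific descriptors that follow it.
inline bool msTotalLengthMatches(const std::vector<uint8_t>& set,
                                 std::size_t headerOffset) {
    if (headerOffset + 7 > set.size()) return false;
    const uint16_t declared = static_cast<uint16_t>(
        set[headerOffset + 5] | (set[headerOffset + 6] << 8));
    std::size_t actual = 0;
    for (std::size_t pos = headerOffset; pos < set.size();) {
        const uint8_t bLength = set[pos];
        if (bLength == 0) return false;  // malformed: would loop forever
        actual += bLength;
        pos += bLength;
    }
    return actual == declared;
}
```

The builder performs these sums for you, which is exactly where hand-written descriptors tend to go wrong.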

The USB Descriptor Builder is located at https://midi2-dev.github.io/usbMIDI2DescriptorBuilder/ 

The GitHub repository is located here

Parts 1 and 2 of this article include several examples of descriptors; those examples are based on the default settings of the Descriptor Builder. The USB Descriptors set many parameters and properties used by the device. It is also recommended that you consider how these parameters relate to other UMP features, and select values that are consistently formatted. In developing your USB device, be sure to reference the Property Names and Mapping article (https://midi.org/midi-2-0-device-design-property-names-and-mapping).

Interfaces and Interface Association Descriptors (IAD)

If a device supports both USB MIDI and USB Audio, the device should be configured as a Composite device with Audio Interfaces separated from MIDI Interfaces. This mechanism is particularly useful in devices which combine an Audio Class version 2 or 3 function with a MIDI Class function. This is because the MIDI Class function, defined as a subclass of Audio Class, has an Audio Control Interface of its own.

An IAD is used to show the relationship between the Audio Interfaces, particularly to associate an Audio Control Interface with one or more Audio Streaming Interfaces.

A separate IAD is used to show the relationship between the Interfaces for the MIDI function. This IAD associates a second Audio Control Interface with a MIDI Streaming Interface.

USB Hosts may use the IADs to better determine which drivers will be used to support the device.

Two Alternate Settings: To Enable Backward Compatibility

The USB Device Class Definition for MIDI Devices, Version 2.0 defines how the use of 2 Alternate Settings on the Class Specific MIDI Streaming Interface provides the widest compatibility of Devices and Hosts.

Using this mechanism, a Device can operate as a MIDI 1.0 Device when necessary, but can switch to operating as a MIDI 2.0 Device on any Host which can support MIDI 2.0.

Alternate Setting 0 of the MIDI Streaming Interface is compliant with USB Device Class Specification for MIDI Devices Version 1.0.

Alternate Setting 1 of the MIDI Streaming Interface has descriptors for the MIDI 2.0 function.

The bcdMSC field in Alternate Setting 0 is set to 0x0100. (MIDI 1.0)

The bcdMSC field in Alternate Setting 1 is set to 0x0200. (MIDI 2.0)

USB MIDI Hosts which do not support USB MIDI 2.0 use the descriptors found on Alternate Setting 0 and install the Device as a MIDI 1.0 Device.

USB MIDI Hosts capable of supporting a USB MIDI 2.0 Device instead start by looking for an Alternate Setting 1 with bcdMSC set to 0x0200. If such an Alternate Setting is found in the device, the Host installs the Device as a MIDI 2.0 Device using those descriptors. If the Host does not find the Alternate Setting 1, then the Host will use the descriptors found on Alternate Setting 0 and install the Device as a MIDI 1.0 Device.
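The host behavior described above can be sketched as follows. The struct and function are hypothetical simplifications for illustration; real descriptors are packed byte arrays, not structs:

```cpp
#include <cstdint>
#include <vector>

// Simplified view of a class-specific MIDI Streaming interface header
// (hypothetical struct for illustration only).
struct MsInterfaceHeader {
    uint8_t  bAltSetting;  // which Alternate Setting this header belongs to
    uint16_t bcdMSC;       // 0x0100 = MIDI 1.0, 0x0200 = MIDI 2.0
};

// Host-side selection logic described above: a MIDI 2.0 capable Host looks
// for Alternate Setting 1 with bcdMSC == 0x0200; otherwise it falls back to
// Alternate Setting 0 and installs the Device as a MIDI 1.0 Device.
inline uint8_t selectAltSetting(const std::vector<MsInterfaceHeader>& alts,
                                bool hostSupportsMidi2) {
    if (hostSupportsMidi2) {
        for (const auto& a : alts)
            if (a.bAltSetting == 1 && a.bcdMSC == 0x0200)
                return 1;  // install as MIDI 2.0 Device
    }
    return 0;  // MIDI 1.0 descriptors on Alternate Setting 0
}
```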

Why a separate MIDI 1.0-only USB mode may be useful in some cases at this time

This may be done to improve compatibility with, and avoid unexpected behavior from, some early beta versions of Host USB MIDI 2.0 drivers released in 2023 and 2024.

Initial USB MIDI 2.0 Devices, such as the Native Instruments Komplete Kontrol MK3 and the Roland A-88 MK2, store 2 sets of USB Descriptors:

  • Legacy MIDI 1.0 only
  • USB MIDI 2.0 (with alt 0 = MIDI 1.0, alt 1 = MIDI 2.0)

The user decides which to enable on the device. On the A-88 MK2 this is a button combination at start-up; on the Komplete Kontrol it is a menu option (which causes a reboot).

It is expected that within a short timeframe this will no longer be a concern and this suggestion will change. Ultimately, all Devices should simply have a single mode of operation with an Alternate Setting 0 for MIDI 1.0 and an Alternate Setting 1 for MIDI 2.0.

OS Caching and Firmware updates that affect your USB Descriptors

When Devices connect to a Host, the OS may cache the USB Descriptors. For example, CoreMIDI uses caching to build a database of MIDI Devices and to handle reconnects.

A Host might not be able to recognize when a device makes these kinds of changes:

  • USB Descriptors including string name changes
  • Group Terminal Block updates
  • Switching between a USB MIDI 1.0 only mode and USB MIDI 2.0 mode

If the Host OS does not recognize that a change has occurred, the user may experience errors and unknown behaviors. 

One way to avoid errors is for the user to clear the cache and remove the prior device entry. However, this is not always possible, and explaining the steps to users can be a challenge; it can be confusing and may still result in further technical support issues.

If the device can change the iSerial string value or the PID every time these notable configuration changes occur, then the OS will see a new device with a new set of features.
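One hedged sketch of this idea: derive a serial-number suffix from the configuration options that affect the descriptors, so each configuration appears to the OS as a distinct device. All names and the hash choice (FNV-1a) are illustrative assumptions, not from any specification:

```cpp
#include <cstdint>
#include <cstdio>
#include <string>

// Hypothetical sketch: fold the configuration options that affect the USB
// Descriptors into the serial string, so each configuration looks like a
// distinct device to the OS cache. FNV-1a is used purely as an example.
inline uint32_t configHash(bool midi2Mode, uint8_t numBlocks,
                           const std::string& endpointName) {
    uint32_t h = 2166136261u;  // FNV offset basis
    auto mix = [&h](uint8_t b) { h = (h ^ b) * 16777619u; };
    mix(midi2Mode ? 1 : 0);
    mix(numBlocks);
    for (char c : endpointName) mix(static_cast<uint8_t>(c));
    return h;
}

inline std::string makeSerial(const std::string& baseSerial, bool midi2Mode,
                              uint8_t numBlocks,
                              const std::string& endpointName) {
    char suffix[16];
    std::snprintf(suffix, sizeof(suffix), "-%08X",
                  static_cast<unsigned>(
                      configHash(midi2Mode, numBlocks, endpointName)));
    return baseSerial + suffix;  // e.g. "SN0001-xxxxxxxx"
}
```

Note that changing the serial means the OS forgets any user settings attached to the old entry, which is exactly the intent here: a reconfigured device should be treated as new.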

The End…

This set of articles would not have been possible without the input and support of the MIDI Association OSAPI Working Group members who provided much of the content.

Writing this series has helped me to clarify and organize my own understanding of USB MIDI. The MIDI Association intends to continue publishing these kinds of articles to support developers.

Thank you to everyone else who provided information, asked probing questions, and read this series.

MIDI 2.0 Device Design: Property Names and Mapping

MIDI Association OS API Working Group

MIDI 2.0 defines a number of device properties which are declared across 4 layers:

  1. Standard MIDI 2.0 Messages
  2. MIDI-CI Transactions
  3. Property Exchange Resources
  4. USB MIDI 2.0

A single device might declare values for any of these device properties in all 4 layers. This article and its tables show where there are properties in each layer which should have the same value as properties in other layers. Keeping these values the same across the 4 layers will help other devices, operating systems, and applications to recognize and display the device to the user in a predictable and consistent manner.
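A minimal sketch of this consistency principle (struct and function names are illustrative, not from any specification): declare each identity string once, derive every layer's field from that single source, and verify the layers agree:

```cpp
#include <string>

// Hypothetical sketch: one value feeds all four layers so that the device
// is recognized and displayed consistently everywhere.
struct LayerValues {
    std::string usbString;    // USB string descriptor (e.g. iManufacturer)
    std::string umpField;     // UMP stream notification field
    std::string midiCiField;  // MIDI-CI reply field
    std::string peProperty;   // Property Exchange DeviceInfo property
};

inline LayerValues fromSingleSource(const std::string& value) {
    return {value, value, value, value};  // one source of truth, four layers
}

inline bool layersConsistent(const LayerValues& v) {
    return v.usbString == v.umpField && v.umpField == v.midiCiField &&
           v.midiCiField == v.peProperty;
}
```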

The properties are shown in 3 tables below. The tables are also available as 3 tabs in a Google Sheets spreadsheet document accessible and downloadable here: https://docs.google.com/spreadsheets/d/1tVJN835Gk_9CVlF6nRLe5uu962vvl82GCzGtzsZbuwI/

Table 1: Basic Device Properties

The properties in this table are generally used for product identification.

Property Name | UMP (Message: Field) | MIDI-CI (Message: Field) | PE (Resource: Property) | USB (Descriptor: Field)
Vendor ID | - | - | - | Device Descriptor: idVendor
Product ID | - | - | - | Device Descriptor: idProduct
Interface Name | - | - | - | Standard MIDI Streaming Interface Descriptor: iInterface + Separate String Descriptor (Note 1)
UMP Endpoint Name (Note 2) | Endpoint Name Notification: UMP Endpoint Name | - | - | Standard MIDI Streaming Interface Descriptor: iInterface + Separate String Descriptor
Manufacturer Name | - | - | DeviceInfo: manufacturer | Device Descriptor: iManufacturer + Separate String Descriptor
Product Name | - | - | DeviceInfo: family + model | Device Descriptor: iProduct + Separate String Descriptor
Unique Identifier (Note 3) | Product Instance Id Notification: Product Instance Id | Reply to Endpoint Message: Product Instance Id | DeviceInfo: serial | Device Descriptor: iSerialNumber + Separate String Descriptor
Manufacturer SysEx Id | Device Identity Notification: Manufacturer | Discovery and Reply to Discovery: Manufacturer | DeviceInfo: manufacturerId | -
Model Family Value | Device Identity Notification: Model Family | Discovery and Reply to Discovery: Model Family | DeviceInfo: familyId | -
Model Number Value | Device Identity Notification: Model Number | Discovery and Reply to Discovery: Model Number | DeviceInfo: modelId | -
Version Value | Device Identity Notification: Version | Discovery and Reply to Discovery: Version | DeviceInfo: versionId | -

Note 1: The text string for iInterface should be very similar to or the same as UMP Endpoint Name if possible. But iInterface Name might need to include “MIDI” for identification purposes.

Note 2: If the device does not declare its UMP Endpoint Name, then use the text string for iInterface from USB as a substitute. If the text string for iInterface has a "MIDI" suffix, the suffix should be removed when being presented as the UMP Endpoint Name.

If a USB MIDI 1.0 device is represented as a Virtual UMP device, then the text string for iInterface can be used as the virtual UMP Endpoint Name.

Note 3: MIDI 2.0 Product Instance Id Notification and USB iSerialNumber:

This property has high value for Operating Systems to track a device between power cycles or when connected to a different USB port. It also differentiates between multiple units of the same make/model device. It is sometimes difficult to use the actual serial number found on the product or box; this property is not required to match that serial number. For example, the value could be a random number generated at first power-up and then written to memory. The value may be up to 255 ASCII characters (fewer than 36 is recommended).

See this excellent article about using iSerial on Windows. The fundamental ideas presented are also generally good practice for other operating systems: https://devblogs.microsoft.com/windows-music-dev/the-importance-of-including-a-unique-iserialnumber-in-your-usb-midi-devices/

Table 2: Further Properties for Devices which have Dynamic Function Blocks

The properties in this table are generally used to describe device topology for devices which implement Dynamic Function Blocks. 

See the M2-104-UM specification for the 2 ways to design the relationship between USB MIDI 2.0 Group Terminal Blocks and MIDI 2.0 Function Blocks (Dynamic Function Blocks or Static Function Blocks).

Property Name | UMP (Message: Field) | MIDI-CI (Message: Field) | PE (Resource: Property) | USB (Descriptor: Field)

UMP Endpoint Block Configuration:
Number of Group Terminal Blocks | - | - | - | Class-specific MS Data Endpoint Descriptor: bNumGrpTrmBlock (Note 1)
Number of Function Blocks | Endpoint Info Notification: Number of Function Blocks | - | - | -
Protocols Supported | Endpoint Info Notification: M2 and M1 bits | - | - | -
Group Terminal Block Protocol | - | - | - | Group Terminal Block Descriptor: bMIDIProtocol (Note 2)
Current UMP Endpoint Protocol | Stream Configuration Notification: Protocol | - | - | -
Group Terminal Block Name | - | - | - | String Descriptor: iBlockItem + Separate String Descriptor (Note 3)

For Each of One or More Function Blocks:
Block Name | Function Block Name Notification: Function Block Name | - | - | -
Block ID | Function Blocks Info Notification: Function Block Number | Reply to Discovery: Function Block Number | - | -
Streaming Direction | Function Blocks Info Notification: Direction | - | - | -
Block Active | Function Blocks Info Notification: Function Block Active | - | - | -
Block Is MIDI 1.0 | Function Blocks Info Notification: MIDI 1.0 | - | - | -
First Group | Function Blocks Info Notification: First Group | - | - | -
Number of Groups Spanned | Function Blocks Info Notification: Number of Groups Spanned | - | - | -

Note 1: bNumGrpTrmBlock must declare a single, bidirectional (bGrpTrmBlkType = 0x00) Group Terminal Block starting at Group 1 (nGroupTrm = 0x00) and containing 16 Groups (nNumGroupTrm = 0x10).

Note 2: bMIDIProtocol should be set to 0x00 (unknown). Protocol is set by UMP Endpoint and Function Blocks.

Note 3: String for iBlockItem should be set to the same as UMP Endpoint Name and the string for iInterface.

Table 3: Further Properties for Devices which have Static Function Blocks

The properties in this table are generally used to describe device topology for devices which implement Static Function Blocks. 

See the M2-104-UM specification for the 2 ways to design the relationship between USB MIDI 2.0 Group Terminal Blocks and MIDI 2.0 Function Blocks (Dynamic Function Blocks or Static Function Blocks).

Property Name | UMP (Message: Field) | MIDI-CI (Message: Field) | PE (Resource: Property) | USB (Descriptor: Field)

UMP Endpoint Block Configuration:
Number of Blocks | Endpoint Info Notification: Number of Function Blocks | - | - | Class-specific MS Data Endpoint Descriptor: bNumGrpTrmBlock
Protocols Supported | Endpoint Info Notification: M2 and M1 bits | - | - | -
Current UMP Endpoint Protocol | Stream Configuration Notification: Protocol | - | - | Group Terminal Block Descriptor: bMIDIProtocol

For Each of One or More Blocks:
Block Name | Function Block Name Notification: Function Block Name | - | - | Group Terminal Block Descriptor: iBlockItem + Separate String Descriptor
Block ID | Function Blocks Info Notification: Function Block Number | Reply to Discovery: Function Block Number | - | Group Terminal Block Descriptor: bGrpTrmBlkID
Streaming Direction | Function Blocks Info Notification: Direction | - | - | Group Terminal Block Descriptor: bGrpTrmBlkType
Block Active | Function Blocks Info Notification: Function Block Active | - | - | -
Block Is MIDI 1.0 | Function Blocks Info Notification: MIDI 1.0 | - | - | Group Terminal Block Descriptor: bMIDIProtocol
First Group | Function Blocks Info Notification: First Group | - | - | Group Terminal Block Descriptor: nGroupTrm
Number of Groups Spanned | Function Blocks Info Notification: Number of Groups Spanned | - | - | Group Terminal Block Descriptor: nNumGroupTrm

Also see these developer device design articles for building USB MIDI 2.0 devices:

BUILDING A USB MIDI 2.0 DEVICE – PART 1

BUILDING A USB MIDI 2.0 DEVICE – PART 2

BUILDING A USB MIDI 2.0 DEVICE – PART 3

6 New Profile Specifications Adopted

The MIDI Association and AMEI continue to push MIDI 2.0 forward

Profiles are one of the most powerful mechanisms in MIDI 2.0, and in the past few months the MIDI Association has adopted 6 new Profile specifications, bringing the total number of adopted Profiles to 7.

We covered some of the basics of profiles in this article.

https://midi.org/what-musicians-artists-need-to-know-about-midi-2-0

Click on the link above or image below to view that article.

The MIDI Association voted to adopt the first Profile in 2021 and then followed up with 6 Profile specifications that were adopted around NAMM 2024:

  • M2-113-UM 1-0 Default Control Change Mapping Profile
  • M2-118-UM GM Function Block Profile
  • M2-119-UM GM Single Channel Profile
  • M2-120-UM MPE Profile
  • M2-121-UM Drawbar Organ Profile
  • M2-122-UM Rotary Speaker Profile
  • M2-123-UM MIDI-CI Profile for Note On Selection of Orchestral Articulation

In this article, we will summarize each of these Profiles, explain their benefits to people who use MIDI, and explain why you should encourage companies to support these Profiles in their products in the future.


Default Control Change Mapping Profile

Many MIDI devices are very flexible in configuration to allow a wide variety of interaction between devices in various applications. However, when 2 devices are configured differently, there can be a mismatch that reduces interoperability.

This Default Control Change Mapping Profile defines how devices can be set to a default state, aligned with core definitions of MIDI 1.0 and MIDI 2.0. In particular, devices with this Profile enabled have the assignment of Control Change message destinations/functions set to common, default definitions.

One of the core challenges with MIDI 1.0 was that there were fewer than 128 usable Control Change messages (because the MIDI 1.0 Control Change list also included Mode Change messages, Program and Bank Change, and some other very fundamental mechanisms tied to physical controllers like foot pedals).

On top of that, the Reset All Controllers message in MIDI 1.0 is interesting because it does NOT reset all controllers (only some of them), and it only resets the value of the controller, not its destination.

In other words, a Reset All Controllers message will reset the value of CC74 to a default value, but not what CC74 is mapped to.

For historical reasons, many products come with default MIDI mappings setting often used parameters to the CC messages used to control General MIDI modules.

One of the first and best examples of a hardware controller that used these typical assignments was the Phat Boy by Keyfax.

If you look at the front panel of the Phat Boy and the list of controllers in the Default Control Change Mapping Profile, you can see exactly how these things work together. Channel Volume (Control Change 7) and Pan (Control Change 10) are included, as well as mappings for the physical controllers listed in the MIDI 1.0 specification: Mod Wheel, Breath Controller, Foot Controller, Damper, Sostenuto, Soft Pedal, and Legato Footswitch.
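For reference, here is a sketch of a subset of these default assignments, using the standard MIDI 1.0 controller numbers (the authoritative list is in the Profile specification itself):

```cpp
#include <cstdint>
#include <map>
#include <string>

// Illustrative subset of the default assignments mentioned above; these
// controller numbers are the standard MIDI 1.0 values.
inline const std::map<uint8_t, std::string>& defaultCcMap() {
    static const std::map<uint8_t, std::string> m = {
        {1,  "Mod Wheel"},
        {2,  "Breath Controller"},
        {4,  "Foot Controller"},
        {7,  "Channel Volume"},
        {10, "Pan"},
        {64, "Damper (Sustain) Pedal"},
        {66, "Sostenuto"},
        {67, "Soft Pedal"},
        {68, "Legato Footswitch"},
    };
    return m;
}
```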

You can download the Default Control Mapping Profile HERE.


The GM Function Block Profile

In the article https://midi.org/what-musicians-artists-need-to-know-about-midi-2-0 , we included this explanation of GM and Profiles.

“Actually, General MIDI was an example of what a Profile could do. GM was a defined set of responses to a set of MIDI messages. But GM was done before the advent of the bi-directional communication enabled by MIDI-CI.

So in the MIDI 1.0 world, you sent out a GM On message, but you never knew if the device on the other side could actually respond to the message. There was no dialog to establish a connection and negotiate capabilities.

But bi-directional communication allows for much better negotiation of capabilities (MIDI-CI stands for Capabilities Inquiry, after all).”

Here is a quote from the Executive Summary of the GM Function Block Profile.

The General MIDI specifications were written many years before MIDI 2.0 and the concept of MIDI Profiles enabled by MIDI-Capability Inquiry (MIDI-CI). General MIDI describes a minimum number of voices, sound locations, drum note mapping, octave registration, pitch bend range, and controller usage, thereby defining a given set of capabilities to expect in a given synthesizer which claims General MIDI compatibility. This document defines how to use General MIDI 2 as a MIDI-CI Profile.

This new definition allows all the capabilities of General MIDI 2 devices to be enabled or disabled using MIDI-CI Profile Configuration messages. The MIDI-CI Profile for General MIDI 2 defines bidirectional mechanisms for devices to discover whether General MIDI 2 functionality is available on a Receiver, enabling a more reliable and predictable result from the connection between two devices.

The GM Function Block Profile can set an entire set of 16 Channels to respond to GM2 messages. But because MIDI 2.0 has 16 Groups that expand the functionality of MIDI beyond just 16 Channels, a GM Function Block Profile could be used to set 32 or 48 Channels to GM. Here is a quote from the spec:

“The MIDI-CI Profile for General MIDI 2 is a Function Block Profile (see the Common Rules for Profiles
[MA04]). The Device ID: Source or Destination field in all MIDI-CI Profile Configuration messages for this GM2 Profile shall be set to 0x7F (Function Block).

The GM2 Profile supports 16 Channels on a Device connected by a MIDI 1.0 transport.

On a MIDI 2.0 transport using UMP Format, the Profile supports any multiple of 16 up to 256 Channels. When using the UMP Format, the number of Channels which will be used when the GM2 Profile is enabled is determined by the Function Block design of the Receiver. The Sender may discover the number of Channels in the Receiver’s Function Block(s) by mechanisms defined in the Universal MIDI Packet (UMP) Format and MIDI 2.0 Protocol specification [MA06].

When the Function Block spans 2 or more Groups, the rules of Channel usage from General MIDI 2 shall apply individually to each Group. For example, if the Profile is operating on Groups 7 and 8, Channel 10 of both Groups defaults to a Rhythm Channel and all other Channels default to a Melody Channel. See the General MIDI 2 specification [MA07] for rules of implementation for each set of 16 Channels (each Group).

As defined in the Common Rules for Profiles, if no Function Blocks are declared by the Receiver, then the Profile will operate on a single UMP Group (with 16 Channels) or on 16 Channels of a MIDI 1.0 connection.”
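The per-Group Channel rule quoted above can be sketched as a small predicate. This is a hypothetical helper, not from the specification; Groups and Channels are numbered 1-16 here for readability:

```cpp
#include <cstdint>

// Hypothetical helper: when the GM2 Profile's Function Block spans several
// Groups, the GM2 Channel rules apply per Group, so Channel 10 of every
// spanned Group defaults to a Rhythm Channel.
inline bool isGm2RhythmChannel(uint8_t group, uint8_t channel,
                               uint8_t firstGroup, uint8_t numGroups) {
    if (group < firstGroup || group >= firstGroup + numGroups)
        return false;      // outside the Profile's Function Block
    return channel == 10;  // Channel 10 of each spanned Group
}
```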

You can download the GM Function Block Profile HERE.


The GM Single Channel Profile

In addition to the Function Block Profile, we have added the ability to turn on GM on a single Channel.

General MIDI System Level 1 and General MIDI 2 specifications were written many years before the concept of MIDI Profiles enabled by MIDI-CI. The original specifications require support on all 16 MIDI channels. This document defines how to use MIDI-CI Profile Configuration Messages to implement the functions of a General MIDI 2 Melody Channel on a single channel.

This Single Channel GM Profile only supports the Melody Channel of GM2, as we have some other plans in the works for Drum and Percussion Profiles.


MPE Profile

YouTube video of the MPE Working Group presentation at NAMM 2024

MPE has been called "the bridge between MIDI 1.0 and MIDI 2.0" by Geert Bevin, one of the major proponents of both the original MPE spec and the new MPE Profile. MPE uses channel rotation so that each Note can have its own MIDI Polyphonic Expression.

There are many MPE devices on the market, and MIDI Association member ROLI has done a great job of pulling together information on all the products available.

Click on the link below to see a list of products that currently support MPE.

So what are the advantages that a Profile brings to the table? Just like GM, MPE had no bi-directional communication, so two products couldn't negotiate and share information about their capabilities.

Letting two products exchange information can greatly expand the interoperability between them. We tried to add as much benefit as possible without fundamentally changing how MPE currently works. Here is an overview taken directly from the new MPE Profile specification.

MPE Profile Functional Overview

The MPE Profile is a MIDI-CI Profile which conforms to the definition of a Multi-Channel Profile as defined in the M2-102-U Common Rules for MIDI-CI Profiles [MA04], "Turning On and Enabling a Profile".

This overview summarizes the main elements of the MPE Profile specification; additional important details can be found in later sections.

MPE Profile is switched on and configured using the following messages:

  • MIDI-CI Profile Configuration mechanisms to setup the devices and enable the MPE Profile
  • Registered Controller/RPN [0x00, 0x00] to change the Pitch Bend Sensitivity from the default MPE
    Profile value of 48 HCUs, if desired or needed.

MPE offers per-note expressive control using the following messages:

  • Note On/Off
  • Pitch Bend
  • Channel Pressure or alternatively, bipolar Registered Controller/RPN [0x20, 0x20]
  • Third Dimension of Control using Control Change #74 or, alternatively, bipolar Registered
    Controller/RPN [0x20, 0x21]

MPE uses the following mechanisms to coordinate per-note control:

  • The range of Channels over which notes are sent and received can be set by enabling this Profile. The MIDI Channel space can be divided into multiple Zones by enabling multiple MPE Profiles with non-overlapping Channels.

  • Each MPE Profile has a number of Member Channels for notes plus a dedicated extra Channel, called
    the Manager Channel, which conveys information common to all Active Notes in that Zone. The
    Manager Channel is always the lowest Channel in the Profile and is not used for notes.
  • Wherever possible, every note is assigned its own Channel for the lifetime of that note. This allows
    MPE messages to be addressed uniquely to that Active Note.
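The Manager/Member Channel mechanism above can be sketched as a simple per-note channel allocator. This is a hypothetical class for illustration, not from the Profile specification:

```cpp
#include <array>
#include <cstdint>
#include <optional>

// Minimal sketch of MPE-style per-note channel assignment. The Manager
// Channel is the lowest Channel in the Zone; each active note gets its own
// Member Channel so per-note messages can be addressed to it uniquely.
class MpeZone {
public:
    MpeZone(uint8_t managerChannel, uint8_t numMemberChannels)
        : manager_(managerChannel), numMembers_(numMemberChannels) {}

    uint8_t managerChannel() const { return manager_; }

    // On Note On: assign the lowest free Member Channel, if any.
    std::optional<uint8_t> allocateChannel() {
        for (uint8_t i = 0; i < numMembers_; ++i) {
            if (!inUse_[i]) {
                inUse_[i] = true;
                return static_cast<uint8_t>(manager_ + 1 + i);
            }
        }
        return std::nullopt;  // all Member Channels carry active notes
    }

    // On Note Off: return the note's Member Channel to the pool.
    void releaseChannel(uint8_t ch) {
        const uint8_t i = static_cast<uint8_t>(ch - manager_ - 1);
        if (i < numMembers_) inUse_[i] = false;
    }

private:
    uint8_t manager_;
    uint8_t numMembers_;
    std::array<bool, 15> inUse_{};  // at most 15 Member Channels per Zone
};
```

A real sender would also need a policy for when more notes are active than Member Channels exist; that policy is left to the implementation here.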

2.2 Differences Between MPE (v1.1) and this MIDI-CI Profile for MPE

There are a number of differences between MPE 1.1 and MPE Profile. Specifically:

  • The Profile is receiver centric. The receiver will report back the range of Channels that it can support, and the Sender will adapt.
  • The MPE 1.1 configuration parameter formerly set by RPN 6 (MCM) is now handled by enabling an MPE Profile using MIDI-CI. See M2-102-U Common Rules for MIDI-CI Profiles [MA04].
  • Multiple Zones are realized by enabling multiple MPE Profiles with non-overlapping Channels.
  • The Manager Channel is always the lowest Channel in the Profile.
  • Notes sent on the Manager Channel are not defined by this Profile.
  • Because the Manager Channel is always the lowest Channel, the MPE 1.1 Upper Zone is not possible
    with an MPE Profile.
  • Profile Details Inquiry mechanism is used to determine the number of MIDI Channels which will be
    used by MPE and to discover the Receiver’s properties which are addressed by the MPE controls.
  • There is no need to send Pitch Bend Sensitivity to every individual Member Channel, it is sent to the
    Manager Channel only.

So here are a couple of points about the new MPE Profile.

  • The MPE Profile can be extended over more than 16 Channels
  • The MPE Profile can have multiple “Zones” because you can enable multiple Profiles as long as there are no overlapping channels. MPE is no longer limited to just 16 channels and 2 Zones.
  • Manager Channels and Member Channels clarify how Messages are used
  • The Profiles Detail Inquiry Message can discover optional features increasing interoperability between devices.
    • The following are defined as optional features:
      • MPE devices are not required to use all three core MPE Expression Controllers.
      • MPE devices may optionally (and mutually exclusively) use high-resolution versions of Channel Pressure or Third Dimension of Control (Section 4.5).
      • MPE devices may optionally respond to the Channel Response Type notification (Section 3.6.3).

You can download the MPE Profile Specification HERE.


Drawbar Organ Profile

About MIDI 2.0 Profiles

Several years ago, we showed this demo of several prototypes of the Drawbar Organ Profile, and some people may wonder what has taken so long to pass the Drawbar Organ specification. Actually, several very key mechanisms had to be developed before developers would be ready to implement Profiles.

First, we had to design and prototype the Profile Details Inquiry message. This mechanism allows two products to share which optional features are supported without the user having to configure anything. We also needed operating systems to implement new MIDI APIs to pass these messages.

Below is an image taken from the Drawbar Organ Single Channel Profile. You can see the level of detail that can be included in a Profile Details Inquiry. Once again, it is important to point out that these mechanisms let two MIDI products talk to each other and auto-configure themselves.

You can download the Drawbar Organ Profile HERE.


Rotary Speaker Profile

The Rotary Speaker Profile specification defines a common set of basic features and MIDI messages to control a rotary speaker. The typical model is a rotating speaker cabinet with a horn and woofer that rotate at different speeds. The Device that implements the specification might be a rotating speaker or an effects unit that emulates the sound of a rotating speaker. The goal of the specification is to encourage implementation of a chosen set of MIDI messages for control of the parameters that are most common to all such Devices.

Once again, with the right mechanisms in place, the Rotary Speaker Profile turns out to be a very simple Profile.

You can download the Rotary Speaker Profile HERE.


Profile for Note On Selection of Orchestral Articulation

We are working on many Profiles, but we believe that the Note On Articulation Profile could have a significant impact on the way movies and games are scored in the future. Here are two YouTube videos.

The first is an overview of the Orchestral Articulation Profile and the second is the presentation from NAMM 2024 at The MIDI Association booth.

The Orchestral Articulation Profile Presentation at NAMM 2024

Executive Summary

There are many orchestral sample libraries in the market, and they are essential for film scoring, game audio, studio, and live MIDI applications. These orchestral libraries have many kinds of articulations.

For example, a string library might have a different set of samples for every articulation including marcato, staccato, pizzicato, etc.

However, there is no industry-standard method: the method for selecting these articulations has been different for each developer. Many developers use notes at the lower end of the MIDI note range for “key switching”, but the actual keys used differ between developers. Some developers use CC messages to switch between articulations, but again there is no industry-wide consistency. Some plug-in formats now support per-note selection of articulations, but again the method for inputting that data differs between applications.

It is the goal of the MIDI-CI Profile for Note On Selection of Orchestral Articulation to provide a consistent way to encode articulation information directly in the MIDI 2.0 Note On message, using the Attribute Type and Attribute Data fields.
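To make that encoding concrete, here is a minimal Python sketch that packs a MIDI 2.0 Note On UMP with its Attribute Type and Attribute Data fields, per the published UMP word layout. The attribute values used below are placeholders for illustration only; the actual Attribute Type and articulation codes will be defined by the Profile itself.

```python
def note_on_with_attribute(group, channel, note, velocity16, attr_type, attr_data):
    """Pack a MIDI 2.0 Note On UMP (Message Type 0x4) carrying an attribute.

    Word layout per the UMP specification:
      word 0: [MT=0x4][group][opcode=0x9|channel][note][attribute type]
      word 1: [16-bit velocity][16-bit attribute data]
    """
    word0 = (0x4 << 28) | (group << 24) | (0x9 << 20) | (channel << 16) | (note << 8) | attr_type
    word1 = (velocity16 << 16) | attr_data
    return word0, word1

# Hypothetical example: middle C, full velocity, with a placeholder
# attribute type (0x02) and articulation code (0x0005) standing in for
# whatever values the Profile assigns.
w0, w1 = note_on_with_attribute(0, 0, 60, 0xFFFF, 0x02, 0x0005)
```

A DAW implementing the Profile would set the attribute fields per note, so the articulation travels with the note itself rather than via key switches or CC messages.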

In arriving at this Profile, a study was made of orchestral instrument families, choir, big band instruments, guitar, keyboard instruments, and various non-western instruments to evaluate the degree to which they share common performance attributes and sound production techniques. Notation symbols and performance indications were also considered to determine, for example, how successfully a violin note marked with a trill might result in a musically meaningful or analogous articulation when the part is copied to an instrument as far afield as timpani— all without the composer having to re-articulate the timpani part, at least initially.

The Profile provides a comprehensive yet concise system of articulation mapping that includes a wide palette of articulation types and supports articulation equivalence across eight instrument categories.

The Profile was designed to offer articulation equivalence — a system of articulation mapping that allows a passage articulated for one instrument to be copied to another track and played back with an equivalent or analogous articulation, regardless of the target instrument type.

When implemented by sample library developers, the Profile will greatly aid composers in two significant ways.

First, it will simplify the process of substituting or layering sounds from the same or different sample libraries;

Second, it will allow composers to quickly audition and orchestrate unison passages by copying an articulated part to other tracks and hearing them play back with equivalent or analogous articulations.

Here is a downloadable PDF of the Orchestral Articulation Profile Overview.


We are working on several important Profiles that we hope to finish in the near future.

The Orchestral Profile will be available for download soon.


The Camera Control Profile

Camera Control Profile Demo

Video has become the premier driver of many aspects of creative production and is now often an integral part of the music production process. Video Cameras are ubiquitous and have well-defined feature sets.

They can be roughly categorized into two groups: simple cameras with basic on/off, zoom, and focus; and Pan, Tilt, Zoom (PTZ) cameras with either mechanical motors for moving the camera lens to focus on different objects, or digital mechanisms for the same type of PTZ control.

The feature sets for these cameras are well defined, but there is no standardized control protocol; instead, there are many different manufacturer-specific protocols that all do very similar things. The Camera Control Working Group analyzed the various manufacturer-specific protocols, determined which features they had in common, and then defined the MIDI messages to control those features.

MIDI allows people to use off-the-shelf MIDI controllers to control their cameras, making these products easier to integrate and use.

Adoption of the Profile might happen in one of two ways.

  • Software could be developed that translates MIDI messages into the various manufacturer-specific protocols, which would allow Profile support for cameras already in the market.
  • In the future, new cameras could be shipped with MIDI support for the Profile built-in.

The Piano Profile

Just last week, we had a joint meeting of The MIDI Association and AMEI working groups on the Piano Profile. The AMEI companies participating are Yamaha, Roland, Korg, and Kawai, and The MIDI Association companies include Synthogy, Steinway, Piano Disc, and Kurzweil.

The two videos that follow detail the work we have been doing on the Piano Profile, along with a demo of the Profile featuring Scott Tibbs.

Goal of the Piano Profile

Piano Profile Demo at NAMM 2024 featuring Scott Tibbs

We’ve Been Busy

As Luther said when he handed the parking garage attendant a three-year-old parking ticket, “I’ve been busy.” Yes, we have been very busy, but we are starting to see the results of our work.

Our MIDI Association volunteers are continuing to work on new Profiles specifications including a DAW control profile and multiple effects profiles.

We just started a DAW working group with the support of Apple (Audio Units), Avid (AAX), Bitwig (CLAP), and Steinberg (VST), so we have all the plug-in format companies, multiple DAW and notation companies, and plug-in developers working together to ensure that MIDI 2.0 is handled correctly now that the operating systems have completed much of their work. We are planning a face-to-face meeting in Berlin before SuperBooth.

The Windows open source driver and API are available to developers, and Apple, Google, and Linux continue to improve their support for MIDI 2.0.

So if you are a developer, we hope you will join us by becoming a MIDI Association member or posting on the new public forum, which has an area dedicated to developers.

If you are someone who uses MIDI, there are more and more MIDI 2.0 products in the market and we are confident that 2024 will prove to be a significant year in MIDI 2.0’s history.

MIDI2.dev: Open-Source Resources for MIDI 2.0 Developers

We recently sent out a survey about our NAMM plans and one of the interesting comments came from a developer.

I’ve been following up with the development of MIDI 2.0 for a long time as an open source developer. Not having open tooling to validate the development is really sad and delays adoption by the community. Why didn’t you open the MIDI workbench or some other tooling to help developers simulate devices, inspect communications and validate our software/hardware yet?

from the recent MIDI Association survey

The MIDI Association and our members agree. After much internal discussion, we decided the best way to accomplish this goal was to have a number of our members create a website specifically for open source resources.

Key contributors to the MIDI 2.0 specifications have opened a new website and GitHub organization with open-source resources for developing MIDI 2.0 products.

Mike Kent, Chair of the MIDI 2 Working Group; his partner Michael Loh, Chair of the OS API Working Group; Andrew Mee, recently elected Technical Standards Board member and Chair of the Developer Support Working Group; and Franz Detro of Native Instruments formed a consortium of individuals whose aim is to promote the adoption of MIDI 2.0 by providing tools and libraries useful to developers. Resources include C++ libraries, USB helpers, testing applications, and more from leading contributors to the MIDI 2.0 specifications.

Here is an overview of the resources available at https://midi2.dev/. These are available under permissive open-source licenses. 


MIDI WORKBENCH MIDI 2.0 TESTING AND COMPLIANCE TOOL

The MIDI Workbench is a standalone Electron application providing a complete MIDI 2.0 environment. The workbench uses UMP internally and translates MIDI 1.0 (including MPE) to and from MIDI 2.0.



GitHub – midi2-dev/MIDI2.0Workbench: MIDI 2.0 Testing Tool


This Project aims to follow the current public MIDI specifications. This project is supported and copyrighted by Yamaha Corporation 2020 and provided under an MIT license. 


AmeNote ProtoZOA

AmeNote ProtoZOA is a flexible prototyping tool for MIDI 2.0. Open source firmware provides MIDI 2.0 interfaces and functions for developers to use in their own hardware and software products.

  • MIDI 2.0 with MIDI-CI Discovery
  • USB MIDI 2.0 Class Compliant Device
  • Universal MIDI Packet Data Format
  • MIDI 1.0 <-> MIDI 2.0 Translation
  • MIDI 1.0 Input / Output Ports
  • Expansions for MIDI 2.0 UMP Network, A2B Network, CME BLE MIDI, SPI, UART
  • Open-Source Code on Raspberry Pico
  • PicoProbe for integrated debugging

AmeNote developed and provides a USB MIDI 2.0 Class Compliant Device on ProtoZOA, designed specifically to jump-start prototyping and validation of UMP functions and fuel the MIDI 2.0 revolution. ProtoZOA encourages speedier adoption of MIDI 2.0 by:

  • Providing an affordable, flexible prototyping platform to enable software and hardware developers to start testing and prototyping MIDI 2.0.
  • Providing a testing platform that connects via the USB MIDI 2.0 drivers recently released by Apple and Google, and a test tool for Microsoft as AmeNote develops Windows host drivers.
  • Providing USB MIDI 2.0 source code that other hardware developers can use under a no-charge permissive license.

The MIDI Association helped to fund the development of ProtoZOA and made it available to dues-paying members of The MIDI Association and the Association of Musical Electronics Industry. Some 60+ member companies have been using ProtoZOA for over a year. Now AmeNote is making ProtoZOA available to all non-member developers. ProtoZOA is available for purchase on the AmeNote website in a variety of configurations.


AM_MIDI2.0Lib

This is a general-purpose library for building MIDI 2.0 Devices and Applications, targeted to work on everything from Arduinos through to large-scale applications. It provides the foundational building blocks, processing, and translations needed for most MIDI 2.0 Devices and Applications.


ni-midi2

The library provides the basic functionality of UMP 1.1 and MIDI-CI 1.2, with base classes for all UMP 1.1 packet types, (Universal) System Exclusive messages, and MIDI-CI messages. There are concrete types for controllers, velocity, and pitch, plus type aliases for common message field types. Mathematical operators allow integer / fixed-point math on pitches and controllers, and type constructors allow initialization with values of different resolutions. Concrete instances of packets or messages are created using factory functions, and incoming packets and messages are inspected using data views. The library is completed by a number of helper functionalities dealing with conversion to and from the MIDI 1 byte stream data format, collecting SysEx messages, and more.


AmeNote tusb_ump Device Driver for tinyUSB

AmeNote Universal MIDI Packet (UMP) USB MIDI 2.0 / USB MIDI 1.0 Class Driver for tinyUSB Device

As part of the midi2.dev community, and with the intention of contributing back to tinyUSB, AmeNote is providing a class driver for USB MIDI (either 1.0 or 2.0) for tinyUSB device integrations. With the correct descriptors and configuration, the driver will enable USB MIDI on embedded devices built with the tinyUSB library.

The driver is compatible with hosts selecting either USB MIDI 2.0 or USB MIDI 1.0 standard. If USB MIDI 1.0, the driver will translate between USB MIDI 1.0 and UMP packets. All embedded applications will interface using UMP packet data structure as a series of 32-bit words.
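As a sketch of what "UMP packet data structure as a series of 32-bit words" means in practice, here is how a MIDI 1.0 Channel Voice message maps onto a single word of Message Type 0x2 (the MIDI 1.0-in-UMP message type). This illustrates, in Python, the kind of translation the driver performs; it is not the driver's actual C code.

```python
def midi1_to_ump(group, status, data1=0, data2=0):
    """Wrap a MIDI 1.0 Channel Voice message in one 32-bit UMP word
    (Message Type 0x2, 'MIDI 1.0 Channel Voice Messages')."""
    return (0x2 << 28) | (group << 24) | (status << 16) | (data1 << 8) | data2

# A MIDI 1.0 Note On (status 0x90, note 60, velocity 100) on Group 1 (index 0):
word = midi1_to_ump(0, 0x90, 60, 100)
```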

TODO: Example code and integration guide will be provided


usbMIDI2DescriptorBuilder

Andrew Mee has created a tool to help correctly build USB descriptors. USB descriptors can be quite difficult for developers to get right, and this tool will create USB MIDI 2.0 descriptors for tinyUSB and the Linux MIDI2 Gadget driver.


These MIDI 2.0 developer resource efforts highlight what is truly unique about the MIDI Association.

Since MIDI’s genesis, companies that are traditionally fierce competitors have chosen cooperation as the best way to solve difficult problems—and to contribute those solutions at no charge back to the global community, for the benefit of musicians, artists, and engineers everywhere.

by MIDI Association president, Athan Billias.


Microsoft’s Presentation at ADC 2023


Technical architecture diagram.

If you are a developer of MIDI products, you will definitely want to check out this in-depth look at the significant upgrades happening to MIDI in an upcoming version of Windows, presented by Microsoft MIDI evangelist and Chair of The MIDI Association Executive Board, Pete Brown.





GitHub – microsoft/MIDI: Windows MIDI Services


New MIDI 2.0 Products Released: November 2023


In the past few weeks, a number of MIDI Association member companies have released MIDI 2.0 products.  

In this article, we’ll give a brief overview of those new MIDI 2.0 products, describe what their MIDI 2.0 features are and try to give an accurate picture of the MIDI 2.0 market. 

The MIDI Association is currently undertaking a complete redesign of the MIDI.org website.  By Winter NAMM 2024, our plan is to have a searchable database of MIDI 2.0 products. Companies will be able to upload information on new MIDI 2.0 products and MIDI users will be able to search for MIDI 2.0 products that work together.  

MIDI 2.0 products can have different feature sets. Some that support MIDI-CI Profiles and Property Exchange will work on MIDI 1.0 transports, including 5-pin DIN. Some require an operating system that supports MIDI 2.0 high-resolution Channel Voice Messages, and some even need other products that support the same feature set to take full advantage of MIDI 2.0 features (for example, the demo at The MIDI Association booth in April 2023 included a Roland A-88MKII, an Apple computer running Logic and AU, and Synthogy's Ivory piano softsynth).


Native Instruments Kontrol S-Series MK3

Native Instruments has released the Kontrol S-Series MK3, which supports a number of MIDI 2.0 features: USB MIDI 2.0, the new Universal MIDI Packet (UMP), MIDI 1.0 Channel Voice Messages in UMP format, and MIDI-CI Property Exchange including Channel List. It also supports Function Blocks. What is important to note is that even though it supports USB MIDI 2.0, it also has 5-pin DIN MIDI In and Out connectors.

If you have a Mac running macOS Monterey or later, the operating system's internal native format for MIDI is MIDI 2.0 high-resolution Channel Voice Messages.

When you plug in your MIDI 1.0 keyboard or interface, the OS is smart enough to know it is a MIDI 1.0 device and translates the internal MIDI 2.0 messages into MIDI 1.0 messages that your keyboard can understand.
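The downscaling half of that translation can be sketched as follows. This illustrates the MIDI 2.0 default translation rules as we understand them, not Apple's actual implementation: a 16-bit velocity keeps its most significant 7 bits, and a result of 0 is bumped to 1 because a MIDI 1.0 Note On with velocity 0 means Note Off.

```python
def m2_note_on_velocity_to_m1(v16):
    """Downscale a 16-bit MIDI 2.0 Note On velocity to a 7-bit MIDI 1.0
    velocity by keeping the most significant 7 bits; a downscaled 0 is
    bumped to 1 so the message is not mistaken for a Note Off."""
    return max(v16 >> 9, 1)
```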

People often worry that MIDI 2.0 will force them to buy new cables, new software, or new products. But for the last two years, people who owned Macs and updated their operating system have been using “MIDI 2.0”; they just didn’t know it or notice it, because we spent so much time and effort making sure that things were backwards compatible.

So go ahead, connect a Sequential Circuits Prophet 600 or a Roland Jupiter 6 (the first two products ever connected by MIDI back in 1983) to your NI Kontrol S Series MK3 and they will still work just fine!

The S-Series MK3 features a large full color display, polyphonic aftertouch, and light guides for Smart Play. It also features software integration via NI’s next generation of NKS technology. 

Features

  • Unparalleled software integration with next-generation NKS technology for a direct and immersive connection to NI and partner instruments, effects, and sounds
  • Choose between 49 or 61 semi-weighted keys, or 88 fully weighted hammer-action keys, all in an industry-leading Fatar keybed
  • All three models come with polyphonic aftertouch as standard, plus Kontrol S88 is the first widely available hammer-action polyphonic aftertouch keyboard controller
  • High-resolution full-color display screen for intuitive browsing, tweaking, and mixing
  • Light Guide: RGB lights above each key highlight drum cells, key switches, chords, scales, and more
  • Smart Play: See scales and modes on the Light Guide, play chord progressions and arpeggios with single keys, or map any scale to white keys only
  • Anodized aluminum knobs and pitch and modulation wheels

by Native Instruments


Korg Keystage

Korg has released the Keystage controller, which supports MIDI 2.0 via manufacturer-specific MIDI-CI Property Exchange, including Channel List.

In designing MIDI 2.0, we wanted to allow companies to innovate without needing to get something standardized by the entire industry. So both Profiles and Property Exchange allow a company to use its own SysEx ID to define a manufacturer-specific Profile or Property Exchange.

That is exactly what Korg did with the Keystage. They use MIDI-CI Property Exchange to send JSON data between the computer and the Keystage. They can subscribe to parameters, so the communication is bi-directional and always stays updated. Because JSON is a human-readable format, it can be easily displayed on the screen of the controller.

What’s important is that once a company has developed the core software for doing MIDI-CI, that MIDI 2.0 software can be re-used in future products.

Keystage is at the forefront of keyboard innovation, being the first to truly tap into the potential of MIDI 2.0. Simply connect the keyboard to compatible software, and Keystage automatically assigns parameters and even displays their names on its crystal-clear OLED screens. Each screen is paired with a dedicated knob for real-time adjustments, meaning you always see the parameters assigned and their exact values, empowering you to make informed decisions and act on your musical instincts without breaking your creative stride.

by Korg


Roland A88 MKII Firmware Update

Roland announced the A-88MKII in 2020, saying that it was MIDI 2.0 ready: the keyboard was capable of high-resolution velocity, and the knobs were capable of sending 32-bit controllers. They have now made good on that promise by releasing a firmware update.

Roland is pleased to announce the immediate availability of a free MIDI 2.0 update for the A-88MKII MIDI Keyboard Controller. When Roland first introduced the A-88MKII at the Consumer Electronics Show (CES) in January 2020, the hardware was designed to be “ready for MIDI 2.0.” Now in 2023, MIDI 2.0 support is included with or coming very soon for all major computer and device operating systems.

With the free MIDI 2.0 update installed, A-88MKII owners will be ready to take advantage of dramatically increased musical expressivity as compatible instruments and apps become available. Pianos will perfectly reflect the most subtle keyboard playing techniques, orchestral sounds will be more acoustic and lifelike, and synthesizer textures will respond to performance gestures like never before.

Speaking to the A-88MKII MIDI 2.0 update, Roland Vice President of Customer Experience (CX) Paul McCabe notes, “Roland was a pioneering contributor to the original MIDI Standard over 40 years ago and has been deeply involved in the development of MIDI 2.0 since day one. Bringing MIDI 2.0 to life for musicians worldwide has long been a high priority for us, and we are pleased to make this happen with the free update for our A-88MKII MIDI Keyboard Controller.”

by Roland

Once again, we should point out that the A-88MKII, which now supports MIDI 2.0, still includes good old 5-pin DIN plugs. Now that companies are actually releasing products, people should feel very comfortable with the fact that MIDI 2.0 and MIDI 1.0 products will live side by side in their studios with no issues.


DAWs and Plugins Support for MIDI 2.0

There are now a number of DAWs on the market that have support for MIDI 2.0. 

MultitrackStudio supports MIDI 2.0 Channel Voice Messages, MIDI-CI, Property Exchange Program List, and MIDI 2.0 Channel Voice Messages with Articulations.

Logic supports MIDI 2.0 Channel Voice Messages, and those can be passed to AU plug-ins.

Steinberg’s Cubase 13 and Nuendo 13 translate some MIDI 2.0 messages into VST3 format and support high-resolution velocity, CC, aftertouch, pitch bend, and poly pressure data. According to the Steinberg website, both Nuendo 13 and Cubase 13 are ready for the widespread adoption of MIDI 2.0.

Ivory 3 supports MIDI 2.0 Channel Voice Messages with high-resolution velocity.

So does the Supreme Drums Taiko. 


What’s next?

 It’s clear that there will soon be many, many MIDI 2.0 products in the market.  

As we mentioned before, we are working on a redesign of the MIDI.org website that will allow companies to enter their MIDI 2.0 products into a searchable database. If you want a preview of what that might look like, just pop over to the MIDI Innovation Awards pages. Those entries were all created by entrants filling out a web form with images, YouTube links, and defined text fields that we can then use as dynamic content that automatically builds web pages after each entry is approved.

Next week, we will do another article that focuses on the tools and open source software that The MIDI Association and MIDI Association members are making available to support MIDI 2.0 developers. 


Building a USB MIDI 2.0 Device – Part 2

By Andrew Mee in collaboration with the OS API Working Group.

This is the second part of a series demonstrating how a developer may go about building a USB MIDI 2.0 device. You should read Part 1 before this Part 2. Part 1 is here: https://www.midi.org/midi-articles/building-a-usb-midi-2-0-device-part-1

In the last part we created a simple set of USB MIDI 2.0 descriptors for a synthesizer. The “ACMESynth” at this stage has one function, the “Monosynth”, and a USB MIDI 2.0 Group Terminal Block was declared for this function.

In this second guide we will cover:

  • UMP Discovery handling
  • Function Blocks and how to expand them

UMP Discovery Handling

 Our hypothetical ACMESynth is designed to be used primarily as a tone generator connected to a DAW. The goals of discovery are:

  • We want the DAW to know which UMP Group is in use and information about this Group.
  • We want the DAW or Operating System (OS) to use MIDI 2.0 Channel Voice messages. (Another article will cover handling of MIDI 1.0 Protocol, MIDI 2.0 Protocol, and translation.)
  • We want the USB device to provide names and identifiers so that the DAW can store local information about the ACMESynth to help the user when they reload a DAW session.

The UMP and MIDI 2.0 Protocol version 1.1 specification defines discovery mechanisms so that devices can interrogate each other and present the user with options for the best way to connect to a device to achieve the goals listed above.

Note: MIDI-CI is used additionally to discover more information about a device which we will be discussing in another article.

Our device needs to be able to respond to the following Groupless messages:

  • Endpoint Discovery Message
  • Stream Configuration Request
  • Function Block Discovery Message

Let’s look again at the data for our ACMESynth and add the Device Identity data we need:

Detail                      Value
Manufacturer Name           “ACME Enterprises”
Product Name                “ACMESynth”
Product Instance Id         “ABCD12345”
Protocol                    MIDI 2.0 Protocol (with a fallback to MIDI 1.0 Protocol)
Manufacturer SysEx Id       0x7E 0x00 0x00 *
Model Family Id             0x01 0x00
Model Id                    0x11 0x22 **
Version                     0x01 0x00 0x00 0x00
Function 1
Function 1 Name             “Monosynth”
Function 1 Channels Needed  1 Channel at a time, used bidirectionally

* This example uses the Research Manufacturer System Exclusive Id set by The MIDI Association. This System Exclusive Id is often shown as a single byte; however, in many MIDI 2.0 messages single-byte Ids are transmitted using 3 bytes. Manufacturers should contact The MIDI Association to get their own Id.

** Family Id and Model Id are defined by the Manufacturer, and messages often define these as LSB first. Please review the MIDI specifications for a detailed explanation.
Note: This guide shows UMP messages similar to those in the UMP and MIDI 2.0 Protocol specification. It is recommended that developers use a library such as my AM_MIDI2.0Lib C++ library or the ni-midi2 library to help with parsing and creating UMP messages correctly.

Responding to the Endpoint Discovery Message

One of the first UMP messages that your device may receive is the Endpoint Discovery Message. This example is asking for all information available about the UMP Endpoint.
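Assuming the UMP 1.1 stream message layout (Message Type 0xF in the top nibble, then a 2-bit Form field and a 10-bit Status field), a request for everything might be packed like the following sketch. The bit positions are our reading of the specification; check the published message diagrams before relying on them.

```python
def endpoint_discovery(ump_major=1, ump_minor=1, filter_bitmap=0b11111):
    """Build the four 32-bit words of a UMP Endpoint Discovery message
    (Message Type 0xF, Form 0x0, Status 0x00).  The low byte of word 1 is
    the filter bitmap; its bits (0b1, 0b10, 0b100, ...) request the
    notifications discussed in the sections below."""
    word0 = (0xF << 28) | (0x0 << 26) | (0x000 << 16) | (ump_major << 8) | ump_minor
    return (word0, filter_bitmap & 0xFF, 0x0, 0x0)

msg = endpoint_discovery()  # ask for all information about the UMP Endpoint
```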



Endpoint Info Notification Message (filter & 0b1 == true)

In response to the above message, the ACMESynth formulates an Endpoint Info Notification Message:



The Endpoint Info Notification is declaring that the ACMESynth supports both MIDI 1.0 and MIDI 2.0 Protocols and that it has one static Function Block. This is reflected by the Group Terminal Block discussion in part 1.

Device Identity Notification Message (filter & 0b10 == true)

In response to the above message, the ACMESynth formulates a Device Identity Notification Message:



Endpoint Name Notification Message (filter & 0b100 == true)

In response to the above message, the ACMESynth formulates an Endpoint Name Notification Message:
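Name strings do not fit in a single packet, so the specification spreads them across stream messages, using the Form field to mark a complete (0x0), starting (0x1), continuing (0x2), or ending (0x3) chunk, with up to 14 bytes of UTF-8 text per 128-bit message. A sketch of that chunking follows; the byte positions are our reading of the specification and worth verifying against it.

```python
def endpoint_name_messages(name):
    """Split a UTF-8 endpoint name across Endpoint Name Notification
    messages (Message Type 0xF, Status 0x03), 14 text bytes per message,
    with the Form field marking complete/start/continue/end."""
    data = name.encode("utf-8")
    chunks = [data[i:i + 14] for i in range(0, len(data), 14)] or [b""]
    messages = []
    for i, chunk in enumerate(chunks):
        if len(chunks) == 1:
            form = 0x0          # complete in one message
        elif i == 0:
            form = 0x1          # start
        elif i == len(chunks) - 1:
            form = 0x3          # end
        else:
            form = 0x2          # continue
        payload = chunk.ljust(14, b"\x00")
        word0 = (0xF << 28) | (form << 26) | (0x03 << 16) | (payload[0] << 8) | payload[1]
        words = [word0] + [int.from_bytes(payload[2 + j * 4: 6 + j * 4], "big")
                           for j in range(3)]
        messages.append(tuple(words))
    return messages

msgs = endpoint_name_messages("ACMESynth")  # fits in a single message
```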



Product Instance Id Notification Message (filter & 0b1000 == true)

In response to the above message the ACMESynth formulates a Product Instance Id Notification Message:



Stream Configuration Notification Message:

While the Endpoint Info Notification messages inform the other UMP Endpoint of its support for different Protocols and JR Timestamps, the Stream Configuration Notification informs the other UMP Endpoint of its current Protocol and JR Timestamp configuration.

In the startup state of the ACMESynth, it will be using MIDI 1.0 Protocol. JR Timestamps are not supported, so it will set this to off.




Responding to the Stream Configuration Request

The device connected to the ACMESynth may wish to change from MIDI 1.0 Protocol to MIDI 2.0 Protocol. It achieves this by sending a Stream Configuration Request:



If ACMESynth agrees with this change, it will change its Protocol in use and send a Stream Configuration Notification Message in response:



If ACMESynth is unable to make the Protocol change, it should send a response with the Protocol set to 0x01.
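Assuming the same stream message layout as before, with the Protocol byte and the JR Timestamp bits in the lower half of word 0, the request and the agreeing notification might be packed like this sketch (Status 0x05 for the request, 0x06 for the notification); verify the bit positions against the specification.

```python
def stream_config(status, protocol, rx_jr=False, tx_jr=False):
    """Build word 0 of a Stream Configuration Request (Status 0x05) or
    Notification (Status 0x06); protocol 0x01 = MIDI 1.0, 0x02 = MIDI 2.0.
    Words 1-3 of the message are reserved (zero)."""
    return (0xF << 28) | (status << 16) | (protocol << 8) | (int(rx_jr) << 1) | int(tx_jr)

request      = stream_config(0x05, 0x02)  # ask to switch to MIDI 2.0 Protocol
notification = stream_config(0x06, 0x02)  # the agreeing reply from ACMESynth
```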


Responding to the Function Block Discovery

The Endpoint Info Notification message declares how many Function Blocks ACMESynth has. A separate message is received that asks us to respond with the Function Block information:



The Function Block Number (FB#) could also have been 0x00 to represent the first (and only) Function Block. This request is also asking to return both the Info and the Name of the Function Block. The request results in two replies:

Function Block Info Notification



This Function Block Info Notification provides the direction (0b11 – bidirectional), an indication of whether the Function Block represents a MIDI 1.0 connection (0b00 – not MIDI 1.0), and the Groups in use.

As Function Blocks provide more information than USB Group Terminal Blocks, the notification also provides:

  • a UI Hint (0b01 – primarily a Receiver or destination for MIDI messages) – While our Monosynth supports bidirectional MIDI Messages, it mainly acts as a Tone Generator. A DAW can look at this field and have a better understanding and representation of the Devices connected.
  • the MIDI-CI Version (0x00 – none or unknown) – currently set to none. Later articles will look at MIDI-CI support and how this value is affected.
  • Max number of SysEx8 Streams – Our Monosynth does not support SysEx8 so this is set to zero.
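Under our reading of the UMP 1.1 bit layout, the fields above might be packed into the two meaningful words of a Function Block Info Notification like this sketch; treat the exact bit positions as assumptions to verify against the specification.

```python
def function_block_info(fb_num, active=True, ui_hint=0b01, midi1=0b00,
                        direction=0b11, first_group=0, num_groups=1,
                        midi_ci_version=0x00, max_sysex8=0):
    """Build words 0 and 1 of a Function Block Info Notification
    (Message Type 0xF, Status 0x11) from the fields described above."""
    word0 = ((0xF << 28) | (0x11 << 16)
             | (int(active) << 15) | (fb_num << 8)       # active flag + FB#
             | (ui_hint << 4) | (midi1 << 2) | direction)
    word1 = ((first_group << 24) | (num_groups << 16)
             | (midi_ci_version << 8) | max_sysex8)
    return word0, word1

# The Monosynth block (FB# 0): bidirectional, UI hint "receiver",
# not MIDI 1.0, one Group, no MIDI-CI, no SysEx8 streams.
w0, w1 = function_block_info(0)
```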
 

Function Block Name Notification




Function Blocks and How to Expand Them

So far in the ACMESynth we have only declared a single Monosynth function. However, many devices may require more than one function. Let’s take our device and add two more functions, a “MIDI IN” DIN Port and a “MIDI OUT” DIN Port:



Let’s assume that the reason to add these functions is that we want our MIDI Application to use these DIN ports as a USB MIDI Adaptor for external MIDI 1.0 gear.

Our table of functions for the ACMESynth now looks like:

Detail                      Value
Function 1
Function 1 Name             “Monosynth”
Function 1 Channels Needed  1 Channel at a time, used bidirectionally
Function 1 Groups Used      Only one Group is used on Function 1
Function 2
Function 2 Name             “MIDI IN”
Function 2 Channels Needed  16 Channels at a time may be used in one direction (inwards)
Function 2 Groups Used      Only one Group is used on Function 2
Function 3
Function 3 Name             “MIDI OUT”
Function 3 Channels Needed  16 Channels at a time may be used in one direction (outwards)
Function 3 Groups Used      Only one Group is used on Function 3

By making this change our Endpoint Info Notification Message should declare three Function Blocks:



And our Function Block Info Notification and Function Block Name messages also need to handle the two new Function Blocks:

Function Block 2: Function Block Info Notification



Pay attention to the settings of the MIDI 1.0, UI Hint, and Direction fields. For this Function Block we have set the Direction as Input and the UI Hint also as Input.



Function Block 3: Function Block Info Notification



Function Block 3: Function Block Name Notification




What about the USB Group Terminal Blocks?

To make the USB Group Terminal Blocks reflect our new set of static Function Blocks we should extend the Group Terminal Block descriptor from part 1 (which already contained a bidirectional Group Terminal Block for the Monosynth) with the following:

Group Terminal Block 2 (“MIDI IN”):

Detail          Meaning                     Value
bGrpTrmBlkID    Block Id                    2
bGrpTrmBlkType  Block Type                  0x01 – IN Group Terminals Only
nGroupTrm       Group Terminal Block Start  0x01 – Group 2
nNumGroupTrm    Number of Group Terminals   1
iBlockItem      Function 2 Name             Id of String Descriptor Referenced Value – “MIDI IN”
bMIDIProtocol   Block Protocol              0x01 (MIDI 1.0 Protocol)

Group Terminal Block 3 (“MIDI OUT”):

Detail          Meaning                     Value
bGrpTrmBlkID    Block Id                    3
bGrpTrmBlkType  Block Type                  0x02 – OUT Group Terminals Only
nGroupTrm       Group Terminal Block Start  0x01 – Group 2
nNumGroupTrm    Number of Group Terminals   1
iBlockItem      Function 3 Name             Id of String Descriptor Referenced Value – “MIDI OUT”
bMIDIProtocol   Block Protocol              0x01 (MIDI 1.0 Protocol)
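The descriptor rows above can be packed into the 13-byte Group Terminal Block descriptor defined by the USB Device Class Definition for MIDI Devices v2.0. In this sketch the descriptor type (0x26) and subtype (0x02) constants, the string-descriptor index, and the zeroed bandwidth fields are assumptions to check against the USB specification.

```python
import struct

def group_terminal_block_descriptor(block_id, block_type, first_group,
                                    num_groups, i_block_item, protocol):
    """Pack a 13-byte USB Group Terminal Block descriptor.
    block_type: 0x00 bidirectional, 0x01 IN only, 0x02 OUT only.
    protocol:   0x01 MIDI 1.0, 0x11 MIDI 2.0 (the values in the tables above)."""
    return struct.pack('<BBBBBBBBBHH',
                       13,        # bLength
                       0x26,      # bDescriptorType: CS_GR_TRM_BLOCK (assumed)
                       0x02,      # bDescriptorSubtype: GR_TRM_BLOCK (assumed)
                       block_id, block_type, first_group, num_groups,
                       i_block_item, protocol,
                       0, 0)      # wMaxInput/OutputBandwidth: 0 = unknown (assumed)

# Block 2 from the table: "MIDI IN", IN-only, starting at Group 2 (index 0x01);
# the string descriptor index (5) is hypothetical.
desc = group_terminal_block_descriptor(2, 0x01, 0x01, 1, 5, 0x01)
```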

USB Endpoints under the Interface will also need the following updates to their values:

IN Endpoint:

Detail Meaning Value
bNumGrpTrmBlock   2
baAssoGrpTrmBlkID[0] Block Id 1
baAssoGrpTrmBlkID[1] Block Id 3 (MIDI OUT)

OUT Endpoint:

Detail Meaning Value
bNumGrpTrmBlock   2
baAssoGrpTrmBlkID[0] Block Id 1
baAssoGrpTrmBlkID[1] Block Id 2 (MIDI IN)

Note: you may also wish to update the USB MIDI 1 Descriptors to have these new functions on separate MIDI 1.0 Ports.


Static vs Non-Static Function Blocks

Up to this point we have been using static Function Blocks when declaring the functions of ACMESynth. When connecting to a DAW that acts as the central manager, handling the routing of MIDI Messages, this is unlikely to present any problems.



This is because a DAW is likely to adapt to the devices connected to it. In this case the ACMESynth declares it has the Monosynth on Group 1 and the DAW will send MIDI Messages on Group 1.

However static Function Blocks have limitations when connecting between other devices. Much like when two devices that connect to each other must use the same channels – two UMP enabled devices must use the same Groups (and Channels) to communicate effectively.

Function Blocks also have the ability (unlike USB Group Terminal Blocks) to overlap. For example, we may want a setup where the Monosynth, MIDI IN, and MIDI OUT functions all use the same Group. By using non-static Function Blocks the user can move these functions to the same Group.

This ability to reconfigure Function Blocks may become more important with other (upcoming) UMP transports.


Group Terminal Blocks for Non-Static Function Blocks

When using static Function Blocks it is easy to see that having matching USB Group Terminal Blocks makes sense. When Function Blocks are non-static it is best to have one bidirectional Group Terminal Block that covers all 16 Groups.

Detail Meaning Value
bGrpTrmBlkID Block Id 1
bGrpTrmBlkType Block Type 0x00 – Bidirectional
nGroupTrm Starting Group 0x00 – Group 1
nNumGroupTrm Number of Groups Spanned 16
iBlockItem Product Name Id of String Descriptor Referenced Value – “ACMESynth”
bMIDIProtocol Block Protocol 0x11 (MIDI 2.0 Protocol)

The Endpoint Info Notification Message should declare three non-static Function Blocks:



As macOS only provides MIDI 1.0 compatibility to Groups declared by a Group Terminal Block, this allows Function Blocks to move around freely to different Groups while still allowing MIDI 1.0 compatibility.

The downside of this is that all 16 Groups are presented as MIDI 1.0 ports even though only a handful may be connected to an internal function like the Monosynth.

Linux works somewhat differently in that all UMP connections automatically set up all 16 Groups as ALSA ports for MIDI 1.0 compatibility. These ALSA ports are then displayed to the user only if they are active. When a USB MIDI 2.0 device is first connected, it will activate the ALSA Ports based on Group Terminal Block information. It will then attempt to retrieve the current Function Blocks and then update the list of active ALSA Ports. These ALSA ports are then updated immediately based on any Function Block changes.


What to look at next…

In part 3 of this series we are going to look at how to handle advanced USB set-ups, and also look at other gotchas that a developer needs to be aware of.


Microsoft Adds MIDI 2.0, Researches AI Text-to-MIDI in 2023

The MIDI Association has enjoyed an ongoing partnership with Microsoft, collaborating to ensure that MIDI software and hardware play nicely with the Windows operating system. All of the major operating systems companies are represented equally in the MIDI Association, and participate in standards development, best practices, and more to help ensure the user experience is great for everyone.

As an AI music generator enthusiast, I’ve taken a keen interest in Microsoft Research (MSR) and their machine learning music branch, where experiments about music understanding and generation have been ongoing.

It’s important to note that this Microsoft Research team is based in Asia and enjoys the freedom to experiment without being bound to the product roadmaps of other divisions of Microsoft. That’s something unique to MSR, and gives them incredible flexibility to try almost anything. This means that their MIDI generation experiments are not necessarily an indication of Microsoft’s intention to compete in that space commercially.

That being said, Microsoft has integrated work from their research team in the past, adding derived features to Office, Windows, and more, so it’s not out of the question that these AI MIDI generation efforts might some day find their way into a Windows application, or they may simply remain a fun and interesting diversion for others to experiment with and learn from.

The Microsoft AI Music research team, operating under the name Muzic, started publishing papers in 2020 and has shared over fourteen projects since then. You can find their GitHub repository here.

The majority of Muzic’s machine learning efforts have been based on understanding and generating MIDI music, setting them apart from text-to-music audio generation services like Google’s MusicLM, Meta’s MusicGen, and OpenAI’s Jukebox.

On May 31st, Muzic published a research paper on their first ever text-to-MIDI application, MuseCoco. Trained on a reported 947,659 Standard MIDI Files (a file format which includes MIDI performance information) across six open source datasets, the developers found that it significantly outperformed the music generation capabilities of GPT-4 (source).

It makes sense that MuseCoco would outperform GPT-4, having trained specifically on musical attributes in a large MIDI training dataset. Details of the GPT-4 prompt techniques are included in figure 4 of the MuseCoco article, shown below. The developers requested output in ABC notation, a shorthand form of musical notation for computers.

Text to MIDI prompting with GPT-4

I have published my own experiments with GPT-4 music generation, including code snippets that produce MIDI compositions and save the MIDI files locally using Node.js with the MidiWriter library. I also shared some thoughts about AutoGPT music generation, to explore how AI agents might self-correct and expand upon the short duration of GPT-4 MIDI output.

Readers who don’t have experience with programming can still explore MIDI generation with GPT-4 through a browser DAW called WavTool. The application includes a chatbot that understands basic instructions about MIDI and can translate text commands into MIDI data within the DAW. I speak regularly with their founder Sam Watkinson, and we anticipate some big improvements in the coming months.

Unlike WavTool, there is currently no user interface for MuseCoco. As is common with research projects, users clone the repository locally and then use bash commands in the terminal to generate MIDI data. This can be done either on a dedicated Linux install, or on Windows through the Windows Subsystem for Linux (WSL). There are no publicly available videos of the service in action and no repository of MIDI output to review.

You can explore a non-technical summary of the full collection of Muzic research papers to learn more about their efforts to train machine learning models on MIDI data.

Although non-musicians often associate MIDI with .mid files, MIDI is much larger than just the Standard MIDI File format. It was originally designed as a way to communicate between two synthesizers from different manufacturers, with no computer involved. Musicians use MIDI extensively for controlling and synchronizing everything from synthesizers and sequencers to lighting and even drones. It is one of the few standards which has stood the test of time.

Today, there are different toolkits and APIs, USB, Bluetooth, and Networking transports, and the new MIDI 2.0 standard which expands upon what MIDI 1.0 has evolved to do since its introduction in 1983.

MIDI 2.0 updates for Windows in 2023

While conducting research for this article, I discovered the Windows music dev blog where it just so happens that the Chair of the Executive Board of the MIDI Association, Pete Brown, shares ongoing updates about Microsoft’s MIDI and music efforts. He is a Principal Software Engineer in Windows at Microsoft and is also the lead of the MIDI 2.0-focused Windows MIDI Services project. 

I reached out to Pete directly and was able to glean the following insights.

Q: I understand Microsoft is working on MIDI updates for Windows. Can you share more information?

A: Thanks. Yes, we’re completely revamping the MIDI stack in Windows to support MIDI 2.0, but also add needed features to MIDI 1.0. It will ship with Windows, but we’ve taken a different approach this time, and it is all open source so other developers can watch the progress, submit pull requests, feature requests, and more. We’ve partnered with AMEI (the Japan equivalent of the MIDI Association) and AmeNote on the USB driver work. Our milestones and major features are all visible on our GitHub repo and the related GitHub project.

Q: What is exciting about MIDI 2.0?

A: There is a lot in MIDI 2.0 including new messages, profiles and properties, better discovery, etc., but let me zero in on one thing: MIDI 2.0 builds on the work many have done to extend MIDI for greater articulation over the past 40 years, extends it, and cleans it up, making it more easily used by applications, and with higher resolution and fidelity. Notes can have individual articulation and absolute pitch, control changes are no longer limited to 128 values (0-127), speed is no longer capped at the 1983 serial 31,250bps, and we’re no longer working with a stream of bytes, but instead with a packet format (the Universal MIDI Packet or UMP) that translates much better to other transports like network and BLE. It does all this while also making it easy for developers to migrate their MIDI 1.0 code, because the same MIDI 1.0 messages are still supported in the new UMP format.
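The packet format Pete describes can be illustrated with a small sketch. Assuming the MIDI 2.0 Channel Voice layout from the UMP specification (message type 0x4, two 32-bit words, with a 16-bit velocity field), a Note On might be packed like this; the helper name is hypothetical:

```c
#include <stdint.h>

/* A 64-bit Universal MIDI Packet: two 32-bit words. */
typedef struct { uint32_t word[2]; } ump64_t;

/* Pack a MIDI 2.0 Note On (message type 0x4) into a UMP.
 * Sketch only, based on the UMP Channel Voice field layout:
 *   word0: [31-28]=type, [27-24]=group, [23-16]=status|channel,
 *          [15-8]=note, [7-0]=attribute type
 *   word1: [31-16]=16-bit velocity, [15-0]=attribute data        */
static ump64_t ump_note_on(uint8_t group, uint8_t channel,
                           uint8_t note, uint16_t velocity)
{
    ump64_t p;
    p.word[0] = ((uint32_t)0x4 << 28)                       /* MIDI 2.0 CVM */
              | ((uint32_t)(group & 0x0F) << 24)            /* Group 0-15   */
              | ((uint32_t)(0x90 | (channel & 0x0F)) << 16) /* Note On      */
              | ((uint32_t)(note & 0x7F) << 8)              /* note number  */
              | 0x00;                                       /* no attribute */
    p.word[1] = ((uint32_t)velocity << 16)                  /* 16-bit vel.  */
              | 0x0000;                                     /* attr. data   */
    return p;
}
```

Compared with a MIDI 1.0 byte stream, the same note now carries 65,536 velocity steps instead of 128.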

At NAMM, the MIDI Association showcased a piano with the plugin software running in Logic under macOS. Musicians who came by and tried it out (the first public demonstration of MIDI 2.0, I should add) were amazed by how much finer the articulation was, and how enjoyable it was to play.

Q: When will this be out for customers?

A: At NAMM 2023, we (Microsoft) had a very early version of the USB MIDI 2.0 driver out on the show floor in the MIDI Association booth, demonstrating connectivity to MIDI 2.0 devices. We have hardware and software developers previewing bits today, with some official developer releases coming later this summer and fall. The first version of Windows MIDI Services for musicians will be out at the end of the year. That release will focus on the basics of MIDI 2.0. We’ll follow on with updates throughout 2024.

Q: What happens to all the MIDI 1.0 devices?

A: Microsoft, Apple, Linux (ALSA Project), and Google are all working together in the MIDI Association to ensure that the adoption of MIDI 2.0 is as easy as possible for application and hardware developers, and musicians on our respective operating systems. Part of that is ensuring that MIDI 1.0 devices work seamlessly in this new MIDI 2.0 world.

On Windows, for the first release, class-compliant MIDI 1.0 devices will be visible to users of the new API and seamlessly integrated into that flow. After the first release is out and we’re satisfied with performance and stability, we’ll repoint the WinMM and WinRT MIDI 1.0 APIs (the APIs most apps use today) to the new service so they have access to the MIDI 2.0 devices in a MIDI 1.0 capacity, and also benefit from the multi-client features, virtual transports, and more. They won’t get MIDI 2.0 features like the additional resolution, but they will be up-leveled a bit, without breaking compatibility. When the MIDI Association members defined the MIDI 2.0 specification, we included rules for translating MIDI 2.0 protocol messages to and from MIDI 1.0 protocol messages, to ensure this works cleanly and preserves compatibility.
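As one concrete example of those translation rules: to my understanding, the default down-translation keeps the 7 most significant bits of a 16-bit MIDI 2.0 Note On velocity, with a special case because a velocity of 0 is still a note-on in MIDI 2.0 but means Note Off in MIDI 1.0. A hedged sketch (the function name is hypothetical, not an API from the specification):

```c
#include <stdint.h>

/* Translate a MIDI 2.0 Note On velocity (16-bit) to MIDI 1.0 (7-bit).
 * Keep the top 7 bits; map 0 to 1 so the note-on is not turned into a
 * MIDI 1.0 Note Off. Sketch of the default translation rules only.   */
static uint8_t velocity_2to1(uint16_t v16)
{
    uint8_t v7 = (uint8_t)(v16 >> 9); /* 16-bit -> 7-bit */
    return (v7 == 0) ? 1 : v7;        /* velocity 0 would mean Note Off */
}
```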

Over time, we’d expect new application development to use the new APIs to take advantage of all the new features in MIDI 2.0.

Q: How can I learn more?

A: Visit https://aka.ms/midirepo for the Windows MIDI Services GitHub repo, Discord link, GitHub project backlog, and more. You can also follow along on my MIDI and Music Developer blog at https://devblogs.microsoft.com/windows-music-dev/ . To learn more about MIDI 2.0, visit https://midi.org .

If you enjoyed this article and want to explore similar content on music production, check out AudioCipher’s reports on AI Bandmates, Sonic Branding, Sample managers and the latest AI Drum VSTs

Building a USB MIDI 2.0 Device – Part 1

By Andrew Mee in collaboration with the OS API Working Group

 

USB MIDI 2.0 was released by the USB-IF in June 2020; Apple added support within CoreMIDI in October 2021, and Google added support in Android in August 2022. At the time of writing, Microsoft has announced upcoming support for MIDI 2.0, with development now on a public GitHub, and the ALSA project has submitted patches for inclusion in Linux Kernel 6.5. An update to the MIDI 2.0 UMP specification was approved in the first half of 2023.
For a more complete timeline see https://www.midi.org/midi-articles/detailed-timeline-of-midi-2-0-developments-since-january-2020

This technical guide to building a USB MIDI 2.0 device is the first in a series of articles targeted specifically to device developers.
For musicians, please see this article: https://www.midi.org/midi-articles/what-musicians-and-artists-need-to-know-about-midi-2-0

This series (based on the work of the OS API Working Group of the MIDI Association) focuses on configuring the USB descriptors to provide the best experience for your users. It shows you how to handle Group Terminal Blocks and Function Blocks, Multiple UMP Endpoints, and compatibility with USB Hosts that can only handle USB MIDI 1.0, as well as MIDI 1.0 Applications.

This guide assumes that the reader is familiar with the following specifications:

  • Universal MIDI Packet (UMP) Format and MIDI 2.0 Protocol v1.1
  • MIDI Capability Inquiry (MIDI-CI) v1.2
  • Universal Serial Bus Device Class Definition for MIDI Devices v2.0 (USB MIDI 2.0)

 


 

Planning your MIDI 2.0 Device

 

In MIDI 1.0 most Devices present an IN port and an OUT port and that is all that is required.
In MIDI 2.0 there are several factors to consider:

  • What are the details of your Device?
    This includes the product name and other similar details.
  • How many functions does your Device have?
    Initially think of functions as destinations and/or sources in your Device. For example a simple single channel mono synthesizer has a tone generator – this is one function. However this hardware Device may also have external MIDI IN/OUT DIN ports – these could be classed as further functions. A Workstation may have many more functions.
    Note: Ultimately, these functions are represented by Function Blocks, which should drive your Group Terminal Block design. But for the purposes of this article, we’ll cover only the USB descriptors and the Group Terminal Blocks.
  • How many channels are needed for each function?
    With the ability to utilize more than 16 Channels a multitimbral tone generator may have 32 Channels (or indeed up to 256 channels!)
  • Do you want/need the user to access all 256 channels or just a subset?
    For example maybe the tone generator can be accessed on any of the 256 Channels
  • How do you want these functions accessed when using MIDI 1.0?
    This is explained in greater detail below.
  • What MIDI 2.0 features are used for this function?
    MIDI-CI, MIDI 2.0 Protocol, JR Timestamps etc
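The 256-channel figure above comes from UMP’s addressing scheme of 16 Groups × 16 Channels. A hypothetical helper (not from any specification) that splits a 0-based absolute channel index into the Group and Channel fields carried in a UMP message:

```c
#include <stdint.h>

/* Split an absolute channel index (0-255) into UMP Group (0-15)
 * and Channel (0-15). Sketch for illustration only.            */
static void split_channel(uint16_t absolute, uint8_t *group, uint8_t *channel)
{
    *group   = (uint8_t)((absolute >> 4) & 0x0F); /* absolute / 16 */
    *channel = (uint8_t)(absolute & 0x0F);        /* absolute % 16 */
}
```

So a 32-channel multitimbral tone generator would typically span Groups 1 and 2 (indexes 0 and 1).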

 


 

Design of a simple Desktop Monosynth

 

Let’s imagine we have a simple single channel desktop monosynth. We want to use MIDI 2.0 Protocol where we can because the parameters (e.g. filter cutoff frequency) benefit from having more than 128 steps. While the MIDI 2.0 Protocol boasts a massive improvement in resolution and capabilities and allows us to future-proof the product, we also need to have MIDI 1.0 compatibility for both older OS’s and MIDI 1.0 Applications.

First, we start gathering the details of the synth. These values will be repeated in several different fields, so keep track of where each comes from.

 

Detail Value String Rules
Manufacturer Name “ACME Enterprises” UTF-16, Max Length: 254 bytes
Product Name “ACMESynth” UTF-8, Max Length: 98 bytes
Product Instance Id “ABCD12345” ASCII, Max Length: 42 bytes

The Product Instance Id of a device is any unique identifier the Device has. This may be a Microcontroller Id or a built-in MAC address. Please read Pete Brown’s excellent article on why this is critical. Don’t include characters that are:

  • Less than or equal to 0x20
  • Greater than 0x7F
  • Equal to 0x2C (‘,’)

Protocol MIDI 2.0 Protocol (with a fallback to MIDI 1.0 Protocol)
Function 1
Function 1 Name “Monosynth” UTF-8, Max Length: 98 bytes
Function 1 Channels Needed 1 Channel at a time, used bidirectionally

 


 

String Values

 

The string values in this table will be used in USB descriptors, UMP messages, and MIDI-CI messages. Each of these systems has limitations that should be adhered to. The table above provides a set of rules that best suits all strings used in a new device.

 

  • USB String Descriptors use UNICODE UTF-16LE encoding, are not NULL-terminated, and may be up to 254 bytes
  • UMP Endpoint Name Notification and Function Block Name Notification Messages are UTF-8, up to 98 bytes
  • UMP Product Instance Id Notification Messages are ASCII, up to 42 bytes

It is recommended that Product Instance Id is used as the USB iSerial value. For compatibility with Windows it is suggested iSerial numbers don’t contain characters that are:

  • Less than or equal to 0x20 (‘ ‘)
  • Greater than 0x7F
  • Equal to 0x2C (‘,’)
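Firmware could enforce these character rules at build or test time. A minimal sketch, assuming the rules listed above (the function name and signature are hypothetical, not part of any specification):

```c
#include <stddef.h>

/* Check a candidate Product Instance Id / iSerial string against the
 * rules above: printable ASCII only, rejecting characters <= 0x20,
 * > 0x7F, or equal to 0x2C (','). Also enforce a maximum length
 * (42 bytes per the table above). Returns 1 if valid, 0 otherwise.  */
static int valid_instance_id(const char *s, size_t max_len)
{
    size_t n = 0;
    for (; s[n] != '\0'; n++) {
        unsigned char ch = (unsigned char)s[n];
        if (ch <= 0x20 || ch > 0x7F || ch == 0x2C)
            return 0; /* forbidden character */
    }
    return n > 0 && n <= max_len; /* non-empty and within limit */
}
```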

 


 

USB Descriptors

 

In USB MIDI 1.0, devices can present a USB IN Data Endpoint and/or a USB OUT Data Endpoint with up to 16 virtual MIDI cables each. Note that USB IN and USB OUT refer to data direction from the point of view of the USB Host. Each virtual MIDI cable can be considered equivalent to a MIDI DIN jack with streaming MIDI 1.0 for up to 16 channels each. In USB MIDI 2.0, the device can present a single USB IN Data Endpoint and/or a single USB OUT Data Endpoint that represents a single Universal MIDI Packet (UMP) data stream. It is strongly recommended that an IN/OUT Data Endpoint pair is presented to create a bi-directional UMP Endpoint to fully take advantage of MIDI 2.0.

 

At this point we start building the USB Descriptors. Developers should ensure that the following fields are filled out with the information above.

 

When defining the USB descriptors we can see that this Device only needs to set-up a single Interface.

 

Note: some USB details in this document use an id reference to a separate string descriptor. The strings themselves are shown here for brevity. If the value is omitted, the id should be set to 0, not to an entry with a blank string.

 

Detail Product Detail Value Id of String Descriptor Referenced Value
iManufacturer Manufacturer Name “ACME Enterprises”
iProduct Product Name “ACMESynth”
iSerialNumber Product Instance Id “ABCD12345”
iInterface Model Name* “ACMESynth”

 

*More complicated setups with multiple interfaces (and multiple UMP endpoints) will be discussed in a followup article. For a simple Device the iInterface and the iProduct can be the same.


MIDI 1.0 Class Specific Descriptors (on Alternate Setting 0)

 

A USB MIDI 2.0 Device should include a set of USB MIDI 1.0 Class Descriptors so that when it is plugged into a Host which does not understand MIDI 2.0, it can operate as a MIDI 1.0 Device.

When declaring MIDI 1.0 Class Specific Descriptors, you should provide a text string name for all Embedded MIDI Jacks.

 

Detail Product Detail Value Id of String Descriptor Referenced Value
iJack Function 1 Name “Monosynth”

 

Most Host MIDI 1.0 Class drivers do not collect information about the topology inside the MIDI Function, such as External MIDI Jacks or Elements.

 


 

MIDI 2.0 Descriptors (on Alternate Setting 1)

 

For the best compatibility on OS’s, each UMP Endpoint is represented by a single Interface. The Interface has an In and an Out USB Endpoint that represents a bidirectional UMP Endpoint.

Devices expose their functions and topology using Function Blocks regardless of transport. USB MIDI 2.0 has the additional requirement of Group Terminal Blocks which are declared in the Device descriptors and designed in consideration of the device’s Function Blocks.

Each USB Endpoint declares the Group Terminal Block Id’s used. The USB Device has a list of Group Terminal Block descriptors that match these Id’s.

In our example, the Monosynth function only uses one channel, so we only need to declare the use of one UMP Group. While there are different ways of declaring Group Terminal Blocks, we will look at just one approach first and revisit other configurations, with the pros and cons of each, later in the series.

Option 1: Declare a Single Group Terminal Block on Group 1 with a length of 1 Group

For simple Devices like our Monosynth that only connect over USB this may be the most straightforward way of connecting to a computer and provides the best backwards compatibility to MIDI 1.0 Applications.

The Group Terminal Block should have the following settings:

 

Detail Meaning Value
bGrpTrmBlkID Block Id 1
bGrpTrmBlkType Block Type 0x00 – Bidirectional
nGroupTrm Starting Group 0x00 – Group 1
nNumGroupTrm Number of Groups Spanned 1
iBlockItem Function 1 Name Id of String Descriptor Referenced Value – “Monosynth”
bMIDIProtocol Block Protocol 0x11 (MIDI 2.0 Protocol)

 

 USB Endpoints under the Interface should have the following values:

 

Detail Meaning Value
bNumGrpTrmBlock Number of GTB’s 1
baAssoGrpTrmBlkID[0] Block Id 1
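Putting the table values together, the Monosynth Group Terminal Block could be sketched as descriptor bytes. This assumes the 13-byte GR_TRM_BLOCK layout from the USB MIDI 2.0 class specification; the iBlockItem string index (0x04) and the zeroed bandwidth fields are placeholder assumptions.

```c
#include <stdint.h>

/* Group Terminal Block 1: "Monosynth", bidirectional, Group 1 only. */
static const uint8_t gtb_monosynth[13] = {
    13,   /* bLength */
    0x26, /* bDescriptorType: CS_GR_TRM_BLOCK (assumed constant) */
    0x01, /* bDescriptorSubtype: GR_TRM_BLOCK */
    1,    /* bGrpTrmBlkID: Block Id 1 */
    0x00, /* bGrpTrmBlkType: bidirectional */
    0x00, /* nGroupTrm: starting Group 1 (0-based field) */
    1,    /* nNumGroupTrm: spans 1 Group */
    0x04, /* iBlockItem: "Monosynth" (hypothetical string index) */
    0x11, /* bMIDIProtocol: MIDI 2.0 Protocol */
    0x00, 0x00, /* wMaxInputBandwidth: 0 = unknown/not fixed (assumption) */
    0x00, 0x00  /* wMaxOutputBandwidth */
};
```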

 

With the descriptors now defined, let’s observe the interaction with an OS that supports MIDI 2.0.

While MIDI 2.0 Protocol is declared in the Monosynth Group Terminal Block, a host Application may send MIDI 1.0 Channel Voice Messages either intentionally or accidentally. In UMP 1.1, Stream Configuration messages may also be used to switch Protocols. To ensure the best compatibility with incoming messages, a MIDI 2.0 Device supporting MIDI 2.0 Protocol should also handle and process MIDI 1.0 Channel Voice messages. We will discuss handling this in a follow-up article.
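One robust way to accept both families of messages is to dispatch on the UMP message type in the top four bits of the first word, since MIDI 1.0 and MIDI 2.0 Channel Voice messages use distinct message types (0x2 and 0x4 respectively, per the UMP specification). A sketch; the helper names are hypothetical:

```c
#include <stdint.h>

enum ump_mt {
    UMP_MT_MIDI1_CV = 0x2, /* MIDI 1.0 Channel Voice, one 32-bit word  */
    UMP_MT_MIDI2_CV = 0x4  /* MIDI 2.0 Channel Voice, two 32-bit words */
};

/* Extract the UMP message type from bits 31-28 of the first word. */
static int ump_message_type(uint32_t word0)
{
    return (int)(word0 >> 28);
}

/* Packet size in 32-bit words for the two Channel Voice families;
 * 0 means "not a Channel Voice message, handle elsewhere".        */
static int ump_cv_words(uint32_t word0)
{
    switch (ump_message_type(word0)) {
    case UMP_MT_MIDI1_CV: return 1;
    case UMP_MT_MIDI2_CV: return 2;
    default:              return 0;
    }
}
```

A receive loop can then route type-0x2 packets through a MIDI 1.0 handler (or up-convert them) while treating type-0x4 packets natively.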

 


 

OS’s That Support MIDI 2.0

 

Accessing MIDI 2.0 Devices in Software

 

MIDI 2.0 applications should, where possible, connect to the connection labelled “MIDI 2.0”. MIDI 1.0 applications are generally only able to connect to the “Monosynth” connection and talk using MIDI 1.0 Protocol. The OS converts these MIDI 1.0 byte stream messages to UMP format between the Device and the application.

 


 

macOS 14+

In macOS 14+ (developer release) this looks like the following:

Hint: More detailed information can be seen in macOS MIDI Studio by selecting List View.

Note: macOS has supported USB MIDI 2.0 since macOS 11. Prior to macOS 14 (developer release) only the “Monosynth” entity is shown.

 


 

Linux (6.5+)

 

In Linux (upcoming Kernel 6.5, ALSA 1.2.x+) this shows up as:


Android 13+

 

Android 13+ currently connects to the USB MIDI Interface and can use either the USB MIDI 1.0 function on Alternate Setting #0 or the USB MIDI 2.0 function on Alternate Setting #1 on a per-application basis. When apps call midiManager.getDevicesForTransport(MidiManager.TRANSPORT_UNIVERSAL_MIDI_PACKETS), they see the USB MIDI 2.0 device as a MIDI 2.0 device. If they call midiManager.getDevicesForTransport(MidiManager.TRANSPORT_MIDI_BYTE_STREAM), they see the device as a MIDI 1.0 device. A device can be opened in only one of the two modes at a time.

 

See https://developer.android.com/reference/android/media/midi/package-summary for more information. https://github.com/android/midi-samples contains some sample applications developers can test with.

 


 

OS’s That Don’t (Currently) Support MIDI 2.0

 

In OS’s that don’t yet support MIDI 2.0, the USB MIDI 1.0 function may be loaded to expose MIDI 1.0 Ports as declared in the MIDI 1.0 Class Specific Descriptors (on Alternate Setting 0). Currently in Windows this looks like:


While in current versions of Linux this looks like:


Where to next…?

 

These USB settings form the beginnings of a USB MIDI 2.0 Device.
In the next part of this series we look at recommended UMP Endpoint messages and how Function Blocks interact with Group Terminal Blocks to extend the usability of MIDI 2.0. We will look at other options for having more functions in your Device and how best to support them.

 


 

Detailed timeline of MIDI 2.0 developments since January, 2020


The core specifications for MIDI 2.0 were adopted by the AMEI and The MIDI Association in January of 2020. Since that time the MIDI Association and its members have been developing tools and prototyping the first MIDI 2.0 products. As a result of our prototyping, we have improved and enhanced the specifications. 

This article is simply a list of the steps that have led up to the release of a major update to the core MIDI 2.0 specifications in June 2023. 



...

02/20/2020 
The MMA adopts 8 new MIDI 2.0 specifications –  

The MIDI Manufacturers Association adopted 5 core MIDI 2.0 specifications on February 20, 2020.

MIDI 2.0 Specification Overview – This document defines the specific collection of MMA/AMEI specifications that collectively comprise the core MIDI 2.0…



...

MIDI 2.0 Progress Continues with Updated USB Specification –  

As computers have become central components in many MIDI systems, USB has become the most widely used protocol for transporting MIDI data. With the introduction of MIDI 2.0, the USB Implementers Forum’s USB MIDI 2.0 working group, headed by members of…



...

10/25/2021
MIDI Messages | Apple Developer Documentation

Apple releases the first version of their operating system that supports MIDI 2.0.



...

06/01/2022
The MIDI Association Announces MIDI 2.0 Over A2B™ –  

Los Angeles, CA – June 1, 2022.The MIDI Association announced today the addition of MIDI 2.0 capabilities to the Automotive Audio Bus (A2B®) from Analog Devices, Inc. The technology was showcased at the MIDI Association booth #10300 at the NAMM show




...

08/17/2022
The MIDI Association Announces MIDI 2.0 Development Tools –  

MIDI Association announces global initiative by over 50 MIDI Association companies to prototype MIDI 2.0. The MIDI Association has released details of the ongoing, industry-wide initiative by over 50 MIDI Association companies to develop…



...

10/01/2022
AMEI to Fund Open Source MIDI 2.0 Driver for Windows –  

November 1, 2022 – The Association of Musical Electronics Industries (AMEI), the organization that oversees the MIDI specification in Japan, has committed to funding the development of an open-source USB MIDI 2.0 Host Driver for Windows Operating Systems…



...

11/16/2022
The MIDI Association at ADC 2022 –  

Apple, Google and Microsoft present Implementations of MIDI 2.0 at Audio Developers Conference 2022.



MIDI is about collaboration, not competition

All kinds of companies, all kinds of devices. One of the things that has always made MIDI unique in the world of standards is that no one owns MIDI, and the MIDI Associations (AMEI in Japan and The MIDI Association in the rest of the world) don’t sell anything. We (AMEI and The MIDI Association) get companies to volunteer their s…


https://www.midi.org/midi-articles/midi-is-about-collaboration-not-competition




New Windows MIDI services Spring 2023 update

Check out Pete Brown’s post on the Microsoft Dev blog for important updates on MIDI 2.0  Pete is not just the designated representative to the MIDI Association for Microsoft, he is also the current chair of our Executive Board and an incredible evangelist for MIDI and particularly for MIDI 2.0.  Here are some of the topics …


https://www.midi.org/midi-articles/new-windows-midi-services-spring-2023-update



...

Details about MIDI 2.0, MIDI-CI, Profiles and Property Exchange (Updated June, 2023) –  

This article is for companies looking to develop MIDI 2.0 products, both software developers and hardware manufacturers. If you are a MIDI user looking for the benefits of MIDI 2.0, go to this article, which is a more general overview of MIDI 2.0 features…



What Musicians & Artists need to know about MIDI 2.0

This article is to explain the benefits of MIDI 2.0 to people who use MIDI.  If you are a MIDI developer looking for the technical details about MIDI 2.0, go to this article updated to reflect the major updates published to the core MIDI 2.0 specs in June 2023.  MIDI 2.0 Overview Back in 1983, musical instrument companies that c…


https://www.midi.org/midi-articles/what-musicians-and-artists-need-to-know-about-midi-2-0


What Musicians & Artists need to know about MIDI 2.0


This article is to explain the benefits of MIDI 2.0 to people who use MIDI.

If you are a MIDI developer looking for the technical details about MIDI 2.0, go to this article updated to reflect the major updates published to the core MIDI 2.0 specs in June 2023. 

The following movie explains the basics of MIDI 2.0 in simple language.  



MIDI 2.0 Overview

Music is the universal language of human beings and MIDI is the universal digital language of music

Back in 1983, musical instrument companies that competed fiercely against one another nonetheless banded together to create a visionary specification—MIDI 1.0, the first universal Musical Instrument Digital Interface.

Nearly four decades on, it’s clear that MIDI was crafted so well that it has remained viable and relevant. Its ability to join computers, music, and the arts has become an essential part of live performance, recording, smartphones, and even stage lighting.

Now, MIDI 2.0 takes the specification even further, while retaining backward compatibility with the MIDI 1.0 gear and software already in use. MIDI 2.0 is the biggest advance in music technology in 4 decades. It offers many new features and improvements over MIDI 1.0, such as higher resolution, bidirectional communication, dynamic configuration, and enhanced expressiveness. 


MIDI 2.0 Means Two-way MIDI Conversations

MIDI 1.0 messages went in one direction: from a transmitter to a receiver. MIDI 2.0 is bi-directional and changes MIDI from a monologue to a dialog. With the new MIDI-CI (Capability Inquiry) messages and UMP Endpoint Discovery Messages, MIDI 2.0 devices can talk to each other, and auto-configure themselves to work together.

They can also exchange information on functionality, which is key to backward compatibility—MIDI 2.0 gear can find out if a device doesn’t support MIDI 2.0, and then simply communicate using MIDI 1.0. 


MIDI 2.0 Specs are mostly for MIDI developers, not MIDI users

If you are a MIDI user trying to read and make sense of many of the new MIDI 2.0 specs, MIDI 2.0 may seem really complicated. 

Yes, it actually is more complicated because we have given hardware and software MIDI developers and operating system companies the ability to create bi-directional MIDI communications between devices and products. 

MIDI 2.0 is much more like an API (application programming interface, a set of functions and procedures allowing the creation of applications that access the features or data of an operating system, application, or other service) than a simple one directional set of data messages like MIDI 1.0. 

Just connect your MIDI gear exactly like you always have and then the operating systems, DAWs and MIDI applications take over and try to auto-configure themselves using MIDI 2.0. 

If they can’t then they will work exactly like they do currently with MIDI 1.0. 

If they do have mutual MIDI 2.0 features, then these auto-configuration mechanisms will work and set up your MIDI devices for you. 


MIDI 2.0 works harder so you don’t have to. 

Just Use MIDI

The only step that MIDI users really have to think about is Step 7 – Use MIDI.

MIDI 2.0 expands MIDI to 256 Channels in 16 Groups, so you will start to see applications and products that display Groups, but these are not so different from the 16 Ports in USB MIDI 1.0.

We have tried very hard to make it simple for MIDI users, but as any good developer will tell you – making it easy for users often makes more work for developers. 


MIDI-CI Profile Configuration

At Music China 2023, there were a number of public presentations of recent MIDI specifications that the MIDI Association has been working on.  

Joe Shang from Medeli who is on the MIDI Association Technical Standards board put it very well at the International MIDI Forum at Music China. 

He said that with the recent updates published in June 2023, MIDI 2.0 had a strong skeleton, but now we need to put muscles on the bones. He also said that Profiles are the muscles we need to add. 

He is right. This will be “The Year Of Profiles” for The MIDI Association. 

We have now adopted 7 Profiles. 

  • MIDI-CI Profile for General MIDI 2 (GM2 Function Block Profile)
  • MIDI-CI Profile for General MIDI 2 Single Channel (GM2 Melody Channel)
  • MIDI-CI Profile for Drawbar Organ Single Channel
  • MIDI-CI Profile for Rotary Speaker Single Channel
  • MIDI-CI Profile for MPE (Multi Channel)
  • MIDI-CI Profile for Orchestral Articulation Single Channel

We also have completed the basic design of three more Profiles.

  • MIDI-CI Profile for Orchestral Articulation Single Channel
  • MIDI-CI Profile for Piano Single Channel
  • MIDI-CI Profile for Camera Control Single Channel

At Music China, and at a meeting held at the same time at the Microsoft office in Redmond, MIDI Association and AMEI members discussed the UDP Network transport specification that we are working on, and the need for Profiles for all sorts of Effects (Chorus, Reverb, Phaser, Distortion, etc.), Electronic Drums, Wind Controllers, and DAW control. 

A Profile defines a set of rules for how a MIDI device sends or responds to a specific set of MIDI messages to achieve a specific purpose or suit a specific application.

Advanced MIDI users might be familiar with manually “mapping” all the controllers from one device to another device to make them talk to each other. Most MIDI users are familiar with MIDI Learn.

If 2 devices agree to use a common Profile, MIDI-CI Profile Configuration can auto-configure the mappings. Two devices learn what their common capabilities are and then can auto-configure themselves to respond correctly to a whole set of MIDI messages.

MIDI gear can now have Profiles that can dynamically configure a device for a particular use case. If a control surface queries a device with a “mixer” Profile, then the controls will map to faders, panpots, and other mixer parameters. But with a “drawbar organ” Profile, that same control surface can map its controls automatically to virtual drawbars and other keyboard parameters—or map to dimmers if the profile is a lighting controller. This saves setup time, improves workflow, and eliminates tedious manual programming.

Actually General MIDI was an example of what a Profile could do. 

GM was a defined set of responses to a set of MIDI messages. But GM was done before the advent of the bi-directional communication enabled by MIDI-CI.  

So in the MIDI 1.0 world,  you sent out a GM On message, but you never knew if the device on the other side could actually respond to the message.  There was no dialog to establish a connection and negotiate capabilities. 

But bi-directional communication allows for much better negotiation of capabilities (MIDI-CI stands for Capabilities Inquiry, after all).

One of the important things about Profiles is that they can negotiate a set of features like the number of Channels a Profile wants to use. Some Profiles like the Piano Profile are Single Channel Profiles and get turned on and used on any single channel you want. 

Let’s use the MPE Profile as an example.   MPE works great,  but it has no bi-directional communication for negotiation. 

With MIDI 2.0, using a mechanism called the Profile Details Inquiry message, two products can agree that they want to be in MPE Mode, agree on the number of channels that both devices can support and the number of dimensions of control that both devices support (Pitch Bend, Channel Pressure and a third dimension of control), and even establish whether both devices support high resolution bi-polar controllers. Bi-directional negotiation just makes things work better automatically. 

Let’s consider MIDI pianos. Pianos have a lot of characteristics in common and we can control those characteristics by a common set of MIDI messages. MIDI messages used by all pianos include Note On/Off and Sustain Pedal.

But when we brought all the companies that made different kinds of piano products together (digital piano makers like Kawai, Korg and Roland,  companies like Yamaha and Steinway that make MIDI controlled acoustic pianos and softsynth companies like Synthogy that makes Ivory), we realized that each company had different velocity and sustain pedal response curves.  

We decided that if we all agreed on a Piano Profile with an industry standard velocity and pedal curve, it would greatly enhance interoperability. 

Orchestral Articulation is another great example. There are plenty of great orchestral libraries, but each company uses different MIDI messages to switch articulations. Some companies use notes at the bottom of the keyboard and some use CC messages. So we came up with a way to put the actual articulation messages right into the expanded fields of the MIDI 2.0 Note On message. 

The following video has a demonstration of how Profile Configuration works.



The MIDI Association adopted the first Profile in 2022, the Default Control Change Mapping Profile.

Many MIDI devices are very flexible in configuration to allow a wide variety of interaction between devices in various applications. However, when 2 devices are configured differently, there can be a mismatch that reduces interoperability.

This Default Control Change Mapping Profile defines how devices can be set to a default state, aligned with core definitions of MIDI 1.0 and MIDI 2.0. In particular, devices with this Profile enabled have the assignment of Control Change message destinations/functions set to common, default definitions.

Because MIDI 1.0 has fewer than 128 usable controllers, even the most commonly used could be reassigned to other functions.

Turning on this Profile sets commonly used controllers such as Volume (CC7), Pan (CC10), Sustain (CC64), Cutoff (CC74), Attack (CC73), Decay (CC75), Release (CC72), and Reverb Depth (CC91) to their intended assignments. 

The video above included a very early prototype of the Drawbar Organ Profile and Rotary Speaker Profile.

We have just finished short videos for Music China.


MIDI-CI Property Exchange

Property Exchange is a set of System Exclusive messages that devices can use to discover, get, and set many properties of MIDI devices. The properties that can be exchanged include device configuration settings, a list of patches with names and other metadata, a list of controllers and their destinations, and much more.

Property Exchange can allow devices to auto-map controllers, choose programs by name, change state, and also provide visual editors to DAWs without any prior knowledge of the device or specially crafted software. This means that devices could work on Windows, macOS, Linux, iOS and web browsers, and may provide tighter integration with DAWs and hardware controllers.

Property Exchange uses JSON inside of the System Exclusive messages. JSON (JavaScript Object Notation) is a human readable format for exchanging data sets. The use of JSON expands MIDI with a whole new area of potential capabilities.
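As a rough illustration of the idea, here is a hypothetical sketch of the kind of JSON a Property Exchange reply might carry inside a System Exclusive message. The field names below ("programs", "name", "bankPC") are illustrative assumptions, not taken from the published Resource specifications.

```python
import json

# Hypothetical patch-list data of the kind a Property Exchange reply
# might carry. The field names here are illustrative assumptions.
reply_body = {
    "programs": [
        {"name": "Warm Grand", "bankPC": [0, 0, 0]},
        {"name": "Drawbar B3", "bankPC": [0, 0, 16]},
    ]
}

# JSON is human-readable text, so it can travel as bytes inside a
# System Exclusive payload.
payload = json.dumps(reply_body).encode("ascii")

# A receiving DAW could parse it back and present patches by name,
# with no prior knowledge of the device.
names = [p["name"] for p in json.loads(payload)["programs"]]
print(names)
```

The point is not the specific fields, but that a structured, human-readable data format lets DAWs and devices exchange rich information that raw MIDI messages never carried.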



The MIDI Association has completed and published the following Property Exchange Resources.

  • Property_Exchange_Foundational_Resources
  • Property_Exchange_Mode_Resources
  • Property_Exchange_ProgramList_Resource
  • Property_Exchange_Channel_Resources
  • Property_Exchange_LocalOn_Resource
  • Property_Exchange_MaxSysex8Streams_Resource
  • Property_Exchange_Get_and_Set_Device_State
  • Property_Exchange_StateList
  • Property_Exchange_ExternalSync_Resource
  • Property_Exchange_Controller_Resources

One of the most interesting of these PE specifications is Get and Set Device State which allows for an Initiator to send or receive Device State, or in other words, to capture a snapshot which might be sent back to the Device at a later time. 

The primary goal of this application of Property Exchange is to GET the current memory of a MIDI Device. This allows a Digital Audio Workstation (DAW) or other Initiator to store the State of a Responder Device between closing and opening of a project. Before a DAW closes a project, it performs the GET inquiry and the target Device sends a REPLY with all data necessary to restore the current State at a later time. When the DAW reopens a project, the target Device can be restored to its prior State by sending an Inquiry: Set Property Data Message.

Data included in each State is decided by the manufacturer but typically might include the following properties (not an exhaustive list):

  • Current Program
  • All Program Parameters
  • Mode: Single Patch, Multi, etc.
  • Current Active MIDI Channel(s)
  • Controller Mappings
  • Samples and other binary data
  • Effects
  • Output Assignments

Essentially this will allow hardware devices to have the same level of recall as soft synths when using a DAW. 

There are a number of MIDI Association companies who are actively working on implementing this MIDI 2.0 Property Exchange Resource.


MIDI-CI Process Inquiry

Version 1.2 of MIDI-CI introduces a new category of MIDI-CI, Process Inquiry, which allows one device to discover the current values of supported MIDI Messages in another device, including System Messages, Channel Controller Messages, and Note Data Messages.

Here are some use cases:

  • Query the current values of parameters which are settable by MIDI Controller messages.
  • Query to find out which Program is currently active
  • Query to find out the current song position of a sequence.

For Those Who Want To Go Deeper

In the previous version of this article, we provided some more technical details. We will retain them here for those who want to know more, but if you are satisfied with knowing what MIDI 2.0 can do for you, you can stop reading here. 

MIDI Capability Inquiry (MIDI-CI) and UMP Discovery

To protect backwards compatibility in a MIDI environment with expanded features, devices need to confirm the capabilities of other connected devices. When 2 devices are connected to each other, they confirm each other’s capabilities before using expanded features. If both devices share support for the same expanded MIDI features they can agree to use those expanded MIDI features. 

The additional capabilities that MIDI 2.0 brings to devices are enabled by MIDI-CI and by new UMP Device Discovery mechanisms. 

New MIDI products that support MIDI-CI and UMP Discovery can be configured by devices communicating directly with each other. Users won't have to spend as much time configuring the way products work together.

Both MIDI-CI and UMP Discovery share certain common features: 

  • They separate older MIDI products from newer products with new capabilities and provide a mechanism for two MIDI devices to understand which new capabilities are supported.
  • They assume and require bidirectional communication. Once a bi-directional connection is established between devices, query and response messages define what capabilities each device has, then negotiate or auto-configure to use those features that are common between the devices. 

MIDI DATA FORMATS AND ADDRESSING


MIDI 1.0 BYTE STREAM DATA FORMAT 

MIDI 1.0 originally defined a byte stream data format and a dedicated 5 pin DIN cable as the transport. When computers became part of the MIDI environment, various other transports were needed to carry the byte stream, including software connections between applications. What remained common at the heart of MIDI 1.0 was the byte stream data format.

The MIDI 1.0 Data Format defines the byte stream as a Status Byte followed by data bytes. Status Bytes have the most significant bit set high. The number of data bytes is determined by the Status Byte. 





Addressing in MIDI 1.0 DATA FORMAT

The original MIDI 1.0 design had 16 channels. Back then synthesizers were analog synths with limited polyphony (4 to 6 Voices) that were only just starting to be controlled by microprocessors. 

In MIDI 1.0 byte stream format, the value of the Status Byte of the message determines whether the message is a System Message or a Channel Voice Message. System Messages are addressed to the whole connection. Channel Voice Messages are addressed to any of 16 Channels.
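The byte stream format and addressing described above can be sketched in a few lines. This is a minimal illustration, not a complete parser: Running Status, System Exclusive, and the data bytes of System Common messages are omitted for brevity.

```python
# Data byte counts for Channel Voice Messages, keyed by the status
# high nibble: Program Change (0xC) and Channel Pressure (0xD) take
# one data byte; the rest take two.
DATA_BYTES = {0x8: 2, 0x9: 2, 0xA: 2, 0xB: 2, 0xC: 1, 0xD: 1, 0xE: 2}

def parse(stream):
    messages = []
    i = 0
    while i < len(stream):
        status = stream[i]
        assert status & 0x80, "expected a Status Byte (MSB set)"
        if status >= 0xF0:
            # System Message: addressed to the whole connection.
            # (System Common data bytes omitted for brevity.)
            messages.append(("system", status))
            i += 1
        else:
            # Channel Voice Message: the low nibble of the Status Byte
            # addresses one of 16 Channels.
            kind, channel = status >> 4, status & 0x0F
            n = DATA_BYTES[kind]
            messages.append((kind, channel, list(stream[i + 1:i + 1 + n])))
            i += 1 + n
    return messages

# Note On (0x90) on Channel 0, note 60, velocity 100, then Timing Clock (0xF8).
print(parse(bytes([0x90, 60, 100, 0xF8])))
```

Note how the same byte carries both the message type and its Channel address, which is exactly the constraint that limits MIDI 1.0 to 16 Channels per connection.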




Addressing in USB MIDI 1.0  DATA FORMAT 

In 1999, when the USB MIDI 1.0 specification was adopted, USB added the concept of multiple MIDI ports. You could have 16 ports, each with its own 16 channels, on a single USB connection.




The Universal MIDI Packet (UMP) Format

The Universal MIDI Packet (UMP) Format, introduced as part of MIDI 2.0, uses a packet-based data format instead of a byte stream. Packets can be 32 bits, 64 bits, 96 bits, or 128 bits in size.

This format, based on 32 bit words, is more friendly to modern processors and systems than the byte stream format of MIDI 1.0. It is well suited to transports and processing capabilities that are faster and more powerful than those when MIDI 1.0 was introduced in 1983. 

More importantly, UMP can carry both MIDI 1.0 protocol and MIDI 2.0 protocol.  It is called a Universal MIDI Packet because it handles both MIDI 1.0 and MIDI 2.0 and is planned to be used for all new transports defined by the MIDI Association including the already updated USB MIDI 2.0 specification and the Network Transport specification that we are currently working on. 
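The packet sizes above are determined by the Message Type, the top 4 bits of the first 32-bit word. The sketch below follows my reading of the UMP specification's Message Type table (including types reserved for future use); treat the exact assignments as something to verify against the published spec.

```python
# Packet size in 32-bit words, keyed by the 4-bit Message Type.
WORDS_PER_PACKET = {
    0x0: 1, 0x1: 1, 0x2: 1,          # Utility, System, MIDI 1.0 Channel Voice
    0x3: 2, 0x4: 2,                  # SysEx7 Data, MIDI 2.0 Channel Voice
    0x5: 4,                          # SysEx8 / Mixed Data Set
    0x6: 1, 0x7: 1,                  # reserved (32 bit)
    0x8: 2, 0x9: 2, 0xA: 2,          # reserved (64 bit)
    0xB: 3, 0xC: 3,                  # reserved (96 bit)
    0xD: 4, 0xE: 4, 0xF: 4,          # Flex Data, reserved, UMP Stream
}

def packet_bits(first_word: int) -> int:
    message_type = (first_word >> 28) & 0xF
    return WORDS_PER_PACKET[message_type] * 32

# A MIDI 2.0 Channel Voice message (Message Type 0x4) is a 64-bit packet.
print(packet_bits(0x40903C00))
```

Because the size is always derivable from the first word, a receiver can stay in sync on a stream of packets without any framing bytes, which is part of what makes UMP friendly to modern processors.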




Addressing in UMP FORMAT 

The Universal MIDI Packet introduces an optional Group field for messages. Each Message Type is defined to be addressed with a Group or without a Group field (“Groupless”). 




Channels, Groups and Groupless Messages in UMP

These mechanisms expand the addressing space beyond that of MIDI 1.0. 

Groupless Messages are addressed to the whole connection. Other messages are addressed to a specific Group, either as a System message for that whole Group or to a specific Channel within that Group. 



UMP continues this step by step expansion of MIDI capabilities while maintaining the ability to map back to MIDI products from 1983. 

UMP carries 16 Groups of MIDI Messages, each Group containing an independent set of System Messages and 16 MIDI Channels. Therefore, a single connection using the Universal MIDI Packet carries up to 16 sets of System Messages and up to 256 Channels.

Each of the 16 Groups can carry either MIDI 1.0 Protocol or MIDI 2.0 Protocol. Therefore, a single connection can carry both protocols simultaneously. MIDI 1.0 Protocol and MIDI 2.0 Protocol messages cannot be mixed together within 1 Group. 

Groups are slightly different from Ports, but for compatibility with legacy 5 Pin DIN, a single 16-channel Group in UMP can easily be mapped back to a 5 Pin DIN Port or to a Port in USB MIDI. 
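The Group-and-Channel addressing, and its mapping back to legacy Ports, can be sketched as simple arithmetic (the helper names here are my own, for illustration):

```python
# 16 Groups x 16 Channels = 256 Channels on one UMP connection.

def absolute_channel(group: int, channel: int) -> int:
    # group and channel are both 0-15.
    return group * 16 + channel

def to_legacy_port(group: int, channel: int) -> dict:
    # One Group maps directly onto a legacy 16-channel Port,
    # whether that is a 5 Pin DIN connection or a USB MIDI 1.0 Port.
    return {"port": group, "channel": channel}

print(absolute_channel(15, 15))   # the highest of the 256 Channels
print(to_legacy_port(2, 9))       # Group 2 behaves like legacy Port 2
```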

You will soon start to see applications which offer selection for Groups and Channels. 

The newest specifications in June 2023 add the concept of Groupless Messages and Function Blocks. 

Groupless Messages are used to discover details about a UMP Endpoint and its Function Blocks. 

Some Groupless Messages are passed to operating systems and applications which use them to provide you with details of what functions exist in the MIDI products you have. 

Now a MIDI Device can declare that Groups 1, 2, 3, and 4 are all used for a single function spanning 64 Channels (for example a mixer or a sequencer).  

All of these decisions had to be made very carefully to ensure that everything would map back and work seamlessly with MIDI 1.0 products from 1983. 


UMP Discovery

The UMP Format defines mechanisms for Devices to discover fundamental properties of other Devices to connect, communicate and address messages. Discoverable properties include:

1. Device Identifiers: Name, Manufacturer, Model, Version, and Product Instance Id (e.g. unique identifier).

2. Data Formats Supported: Version of UMP Format (necessary for expansion in the future), MIDI Protocols, and whether Jitter Reduction Timestamps can be used.

3. Device Topology: including which Groups are currently valid for transmitting and receiving messages and which Groups are available for MIDI-CI transactions.

These properties can be used for Devices to auto-configure through bidirectional transactions, thereby enabling the best connectivity between the Devices. These properties can also provide useful information to users for manual configuration. 


UMP handles both MIDI 1.0 and MIDI 2.0 Protocols



 
A MIDI Protocol is the language of MIDI, or the set of messages that MIDI uses. Architectural concepts and semantics from MIDI 1.0 are the same in the MIDI 2.0 Protocol. Compatibility for translation to/from MIDI 1.0 Protocol is given high priority in the design of MIDI 2.0 Protocol.
 
In fact, Apple has used MIDI 2.0 as the core data format for Core MIDI, with high resolution 16-bit velocity and 32-bit controllers, since macOS Monterey was released in 2021. So if you have an Apple computer or iOS device, you probably already have MIDI 2.0 in your operating system. Apple has taken care of the details: when you plug in a MIDI 1.0 device, the operating system translates MIDI 2.0 messages into MIDI 1.0 messages so you can just keep making music.
 
This seamless integration of MIDI 1.0 and MIDI 2.0 is the goal of the numerous implementations that have been released or are under development. Google has added the MIDI 2.0 protocol to Android in Android 13, and Analog Devices has added it to their A2B network. Open source ALSA implementations for Linux and Microsoft Windows drivers/APIs are expected to be released later this year. 
 
One of our main goals in the MIDI Association is to bring added possibilities to MIDI without breaking anything that already works and making sure that MIDI 1.0 devices work smoothly in a MIDI 2.0 environment. 
 
The MIDI 1.0 Protocol and the MIDI 2.0 Protocol have many messages in common and messages that are identical in both protocols. 
 
The MIDI 2.0 Protocol extends some MIDI 1.0 messages with higher resolution and new features. There are newly defined messages. Some can be used in both protocols and some are exclusive to the MIDI 2.0 Protocol. 
 
New UMP messages allow one device to query what MIDI protocols another device supports and they can mutually agree to use a new protocol.  
 
In some cases (the Apple example above is a good one),  an operating system or an API might have additional means for discovering or selecting Protocols and JR Timestamps to fit the needs of a particular MIDI system.

MIDI 2.0 Protocol- Higher Resolution, More Controllers and Better Timing

The MIDI 2.0 Protocol uses the architecture of MIDI 1.0 Protocol to maintain backward compatibility and easy translation while offering expanded features.

  • Extends the data resolution for all Channel Voice Messages.
  • Makes some messages easier to use by aggregating combination messages into one atomic message.
  • Adds new properties for several Channel Voice Messages.
  • Adds several new Channel Voice Messages to provide increased Per-Note control and musical expression.
  • Adds new data messages, including System Exclusive 8 and Mixed Data Set. The System Exclusive 8 message is very similar to MIDI 1.0 System Exclusive but with an 8-bit data format. The Mixed Data Set message is used to transfer large data sets, including non-MIDI data.
  • Keeps all System messages the same as in MIDI 1.0.

Expanded Resolution and Expanded Capabilities

This example of a MIDI 2.0 Protocol Note message shows the expansions beyond the MIDI 1.0 Protocol equivalent. The MIDI 2.0 Protocol Note On has higher resolution Velocity. Two new fields, Attribute Type and Attribute Data, provide space for additional data such as articulation or tuning details. 
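The expanded Note On can be sketched as bit-packing into a 64-bit Universal MIDI Packet. This follows my reading of the UMP field layout (Message Type 0x4, Group, status 0x9, Channel, Note Number, and Attribute Type in the first word; 16-bit Velocity and 16-bit Attribute Data in the second), so verify against the published spec before relying on it.

```python
def note_on_ump(group, channel, note, velocity16, attr_type=0, attr_data=0):
    # First word: Message Type 0x4, Group, status 0x9, Channel,
    # Note Number, Attribute Type.
    word1 = (0x4 << 28) | (group << 24) | (0x9 << 20) | (channel << 16) \
            | (note << 8) | attr_type
    # Second word: 16-bit Velocity, 16-bit Attribute Data.
    word2 = (velocity16 << 16) | attr_data
    return word1, word2

# Middle C on Group 0 / Channel 0 at full 16-bit velocity.
w1, w2 = note_on_ump(group=0, channel=0, note=60, velocity16=0xFFFF)
print(f"{w1:08X} {w2:08X}")
```

Compare with MIDI 1.0, where the same Note On fits in three bytes but velocity tops out at 127 and there is no room for per-note attributes.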




Easier to Use: Registered Controllers (RPN) and Assignable Controllers (NRPN)

Creating and editing RPNs and NRPNs with MIDI 1.0 Protocol requires the use of compound messages. These can be confusing or difficult for both developers and users. MIDI 2.0 Protocol replaces RPN and NRPN compound messages with single messages. The new Registered Controllers and Assignable Controllers are much easier to use.

The MIDI 2.0 Protocol replaces RPN and NRPN with 16,384 Registered Controllers and 16,384 Assignable Controllers that are as easy to use as Control Change messages.

Managing so many controllers might be cumbersome. Therefore, Registered Controllers are organized in 128 Banks, each Bank having 128 controllers. Assignable Controllers are also organized in 128 Banks, each Bank having 128 controllers.

Registered Controllers and Assignable Controllers support data values with up to 32 bits of resolution.
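A Registered Controller message can be sketched as a single 64-bit packet, in contrast to the RPN compound sequence it replaces. The layout below (status nibble 0x2, 7-bit Bank and Index in the first word, 32-bit data in the second) reflects my reading of the UMP spec and should be checked against it.

```python
def registered_controller_ump(group, channel, bank, index, value32):
    # One message replaces the MIDI 1.0 sequence of CC101 (RPN MSB),
    # CC100 (RPN LSB), CC6 and CC38 (Data Entry).
    word1 = (0x4 << 28) | (group << 24) | (0x2 << 20) | (channel << 16) \
            | (bank << 8) | index
    return word1, value32

# 128 Banks x 128 controllers per Bank = 16,384 Registered Controllers.
assert 128 * 128 == 16384

# Bank 0, Index 0 (Pitch Bend Sensitivity in MIDI 1.0 RPN terms),
# set here with a full 32-bit value.
w1, w2 = registered_controller_ump(group=0, channel=0, bank=0, index=0,
                                   value32=0x20000000)
print(f"{w1:08X} {w2:08X}")
```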




MIDI 2.0 Program Change Message

MIDI 2.0 Protocol combines the Program Change and Bank Select mechanisms of MIDI 1.0 Protocol into one message. The MIDI 1.0 mechanism for selecting Banks and Programs requires sending three MIDI messages; MIDI 2.0 replicates Bank Select and Program Change in a single new MIDI 2.0 Program Change message. Banks and Programs in MIDI 2.0 translate directly to Banks and Programs in MIDI 1.0.
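The combined message can be sketched as one 64-bit packet replacing the three-message MIDI 1.0 sequence. The layout (status nibble 0xC, a bank-valid option flag, then Program and Bank MSB/LSB in the second word) is my reading of the UMP spec, so treat the exact bit positions as an assumption to verify.

```python
def program_change_ump(group, channel, program, bank=None):
    # Bit 0 of the option flags indicates whether the Bank fields are valid.
    bank_valid = 1 if bank is not None else 0
    word1 = (0x4 << 28) | (group << 24) | (0xC << 20) | (channel << 16) \
            | bank_valid
    # The 14-bit bank splits into the same MSB/LSB pair that MIDI 1.0
    # Bank Select uses, so translation back to MIDI 1.0 is direct.
    msb, lsb = (bank >> 7, bank & 0x7F) if bank is not None else (0, 0)
    word2 = (program << 24) | (msb << 8) | lsb
    return word1, word2

# Program 5 in Bank 2 on Group 0 / Channel 0: one packet instead of
# Bank Select MSB + Bank Select LSB + Program Change.
w1, w2 = program_change_ump(group=0, channel=0, program=5, bank=2)
print(f"{w1:08X} {w2:08X}")
```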




Built for the Future

MIDI 1.0 is not being replaced. Rather it is being extended and is expected to continue, well integrated with the new MIDI 2.0 environment. It is part of the Universal MIDI Packet, the fundamental MIDI data format. 

In the meantime, MIDI 1.0 works really well. In fact, MIDI 2.0 is just more MIDI. As new features arrive on new instruments, they will work with existing devices and systems. The same is true for the long list of other additions made to MIDI since 1983. MIDI 2.0 is just part of the evolution of MIDI that has gone on for decades. The step by step evolution continues.

Many MIDI devices will not need any of the new features of MIDI 2.0 in order to perform all their functions. Some devices will continue to use the MIDI 1.0 Protocol while using other extensions of MIDI 2.0, such as Profile Configuration, Property Exchange or Process Inquiry.  

MIDI 2.0 is the result of a global, decade-long development effort. 

Unlike MIDI 1.0, which was initially tied to a specific hardware implementation, the new Universal MIDI Packet format makes it easy to implement MIDI 2.0 on any digital transport. MIDI 2.0 already runs on USB and the Analog Devices A2B bus, and we are working on a network transport spec. 

To enable future applications that we can’t envision today, there’s ample space reserved for brand-new MIDI messages.

Further development of the MIDI specification, as well as safeguards to ensure future compatibility and growth, will continue to be managed by the MIDI Manufacturers Association working in close cooperation with the Association of Musical Electronics Industry (AMEI), the Japanese trade association that oversees the MIDI specification in Japan.

MIDI will continue to serve musicians, DJs, producers, educators, artists, and hobbyists—anyone who creates, performs, learns, and shares music and artistic works—in the decades to come.


MIDI 2.0 FAQs


We have been monitoring the comments on a number of websites and wanted to provide some FAQs about MIDI 2.0 as well as videos of some requested MIDI 2.0 features.

Will MIDI 2.0 devices need to use a new connector or cable?

No, MIDI 2.0 is a transport agnostic protocol.

  • Transport – to transfer or convey from one place to another
  • Agnostic – designed to be compatible with different devices
  • Protocol – a set of conventions governing the treatment and especially the formatting of data in an electronic communications system

That’s engineering speak for: MIDI 2.0 is a set of messages, and those messages are not tied to any particular cable or connector.

When MIDI first started it could only run over the classic 5 Pin DIN cable and the definition of that connector and how it was built was described in the MIDI 1.0 spec.

However, the MIDI Manufacturers Association and the Association of Musical Electronics Industry soon defined how to run MIDI over many different cables and connectors.

So for many years, MIDI 1.0 has been a transport agnostic protocol.

MIDI 1.0 messages currently run over 5 Pin DIN, serial ports, Tip Ring Sleeve 1/8″ cables, FireWire, Ethernet, and USB transports.

Can MIDI 2.0 run over those different MIDI 1.0 transports now?

Yes, MIDI 2.0 products can use the MIDI 1.0 protocol and even use 5 Pin DIN if they support the automated bi-directional communication of MIDI-CI and:

  • One or more Profiles controllable by MIDI-CI Profile Configuration messages.
  • Any Property Data exchange by MIDI-CI Property Exchange messages.
  • Any Process Inquiry exchange by MIDI-CI Process Inquiry messages.

However, to run the Universal MIDI Packet and take advantage of MIDI 2.0 Channel Voice Messages with expanded resolution, new specifications need to be written for each transport.

The new Universal MIDI Packet format will be common to all new transports defined by AMEI and The MIDI Association. The Universal MIDI Packet carries both MIDI 1.0 messages and MIDI 2.0 Channel Voice Messages, plus some messages that can be used with both.

The most popular MIDI transport today is USB. The vast majority of MIDI products are connected to computers or hosts via USB.

The USB specification for MIDI 2.0 is the first transport specification completed, but we are working on a UMP Network Transport for Ethernet and wireless connectivity.

Can MIDI 2.0 provide more reliable timing?

Yes. Products that support the new USB MIDI Version 2 UMP format can provide higher speed for better timing characteristics. More data can be sent between devices to greatly lessen the chances of data bottlenecks that might cause delays.

UMP format also provides optional “Jitter Reduction Timestamps”. These can be implemented for both MIDI 1.0 and MIDI 2.0 in UMP format.

With JR Timestamps, we can mark multiple Notes to play with identical timing. In fact, all MIDI messages can be tagged with precise timing information. This also applies to MIDI Clock messages which can gain more accurate timing.

Goals of JR Timestamps:

  • Capture a performance with accurate timing
  • Transmit MIDI message with accurate timing over a system that is subject to jitter
  • Does not depend on system-wide synchronization, master clock, or explicit clock synchronization between Sender and Receiver.

Note: There are two different sources of error for timing: Jitter (precision) and Latency (sync). The Jitter Reduction Timestamp mechanism only addresses the errors introduced by jitter. The problem of synchronization or time alignment across multiple devices in a system requires a measurement of latency. This is a complex problem and is not addressed by the JR Timestamping mechanism.

Also we have added Delta Time Stamps to the MIDI Clip File Specification.

Can MIDI 2.0 provide more resolution?

Yes. MIDI 1.0 Channel Voice messages are usually 7 bit (14 bit is possible but not so widely implemented, because there are only 128 CC messages).

With MIDI 2.0 Channel Voice Messages, velocity is 16 bit.

The 128 Control Change messages, 16,384 Registered Controllers, 16,384 Assignable Controllers, Poly and Channel Pressure, and Pitch Bend all have 32 bit resolution.
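Moving between 7-bit and 32-bit values is not just zero-padding: a naive shift would leave the top of the range unreachable. Below is one simple, monotonic upscaling (bit replication) as an illustration; the MIDI 2.0 translation rules define their own precise min-center-max scaling, which this sketch does not claim to reproduce exactly.

```python
def upscale(value: int, src_bits: int, dst_bits: int) -> int:
    # Shift the value up, then repeat the source bits into the
    # vacated low bits so the maximum maps to all-ones.
    result = value << (dst_bits - src_bits)
    shift = dst_bits - 2 * src_bits
    while shift > -src_bits:
        result |= (value << shift) if shift >= 0 else (value >> -shift)
        shift -= src_bits
    return result

# The 7-bit extremes map to the full 16-bit range:
print(upscale(0, 7, 16))     # lowest value stays lowest
print(upscale(127, 7, 16))   # highest value reaches the 16-bit maximum
```

The same routine works for 7-to-32-bit controller values; the key property is that the endpoints of the old range land exactly on the endpoints of the new one.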

Can MIDI 2.0 make it easier to have microtonal control and different non-western scales?

Yes, MIDI 2.0 Voice Channel Messages allow Per Note precise control of the pitch of every note to better support non-western scales, arbitrary pitches, note retuning, dynamic pitch fluctuations or inflections, or to escape equal temperament when using the western 12 tone scale.


New Windows MIDI services Spring 2023 update


Check out Pete Brown’s post on the Microsoft Dev blog for important updates on MIDI 2.0


 Pete is not just the designated representative to the MIDI Association for Microsoft, he is also the current chair of our Executive Board and an incredible evangelist for MIDI and particularly for MIDI 2.0. 

Here are some of the topics he covers in his blog article targeted to developers. 

  • Where the Standards are Today
  • The OS API Working Group
  • MIDI 2.0 Implementation Guide
  • Cross-company Cooperation and the Joint AMEI Meeting in Tokyo
  • MIDI 2.0 at the NAMM Show in Anaheim
  • Data Formats
  • Protocol vs Data Format
  • UMP Endpoint vs Port
  • Windows MIDI Services Update

If you develop MIDI products and applications for Windows, you will want to read the full article on the Microsoft dev blog, but to tantalize you a bit here is a graphic from Pete’s post. 




MIDI is about collaboration, not competition


All kinds of companies, all kinds of devices

One of the things that has always made MIDI unique in the world of standards is that no one owns MIDI and the MIDI Associations (AMEI in Japan and The MIDI Association in the rest of the world) don’t sell anything. 

We (AMEI and The MIDI Association) get companies to volunteer their staff to work on really complex problems (like MIDI 2.0), work together to solve those problems and once those problems are solved, we give away the solutions and specifications for free so anyone can use them to make MIDI products. 

MIDI is also unique because it connects all kinds of instruments – keyboards, drums, wind controllers, pad controllers, lights – in fact, anything that can be controlled digitally can be controlled by MIDI.

There really is no other standard that works quite the same way, and that is what makes MIDI so very special. 


MIDI 2.0 meeting in Japan March 27,28, 29, 2023

A great example of the collaboration is the meeting that happened in Japan recently. 

All of the major OS companies gathered to talk about details of handling MIDI 2.0 in operating systems. We understand that people are waiting (some not so patiently) for MIDI 2.0. But there was one thing that was clear to both AMEI and The MIDI Association from the very beginning of discussions about MIDI 2.0: we couldn’t break MIDI 1.0! 

The spirit of cooperation between these OS companies and musical instrument manufacturers was awe-inspiring. 

Companies set aside their differences and competitive natures; the goal of the meeting was to cooperate for the greater benefit of musicians around the world. 

In a remarkable coincidence, the presentation by the OS companies that was made at Audio Developer Conference 2022 was put up on YouTube by ADC on the second day of the meeting in Japan. 


Audio Developer Conference 2022 video featuring Pete Brown, Phil Burk, Torrey Holbrook Walker & Mike Kent available 

Engineers from Apple, Google, and Microsoft will present the current state of MIDI 2.0 implementations in their operating systems. 

We’ll describe the API changes required for MIDI 2.0 for each platform as well as discuss the philosophy and reasoning behind various design decisions. We’ll also present the status of transports, such as USB and Ethernet. 

If you’re a developer who is interested in the practical implementations of MIDI 2.0, this is the session for you. 

Pete Brown 

Pete works in the Windows + Devices org in Microsoft, primarily focusing on partners, apps, and technology for musicians. He’s the lead for the Windows MIDI Services project which is bringing an updated MIDI stack to Windows, and adding full MIDI 2.0 support. 

He also serves as the current Chair of the Executive Board of the MIDI Association. 

When not working, he enjoys synthesizers, electronics, woodworking, astrophotography, 3D printing, and CNC projects at his home on the east coast of the US. 

Phil Burk 

Music and audio software developer. Interested in compositional tools and techniques, synthesis, and real-time performance on Android. Worked on HMSL, JForth, 3DO, PortAudio, JSyn, WebDrum, ListenUp, Sony PS3, Syntona, ME3000, Android MIDI, AAudio, Oboe and MIDI 2.0. 

Torrey Holbrook Walker 

I am a senior software framework engineer on the Core Audio team at Apple and a frequent MIDI specification contributor and prototyper with the MIDI Association. I have been passionate about creating music production technologies that delight audio software developers, musicians, and music producers over my 16-year career with Apple. 

You should talk to me about: 

– MIDI 2.0 software implementations or USB MIDI 2.0 hardware support. 

– CoreMIDI APIs, USB MIDI, or BLE MIDI. 

– The best food in London. 

– Analog high-end audio and vinyl. 

– Autechre, Aphex Twin, Squarepusher, and any wonky music with a shedload of bass. 

Mike Kent 

Mike Kent is the Co-Founder and Chief Strategy Officer of AmeNote Inc. Mike is a world leader in technology for musical instruments and professional audio/video. 

Mike is the Chair of the MIDI 2.0 Working Group of the MIDI Association. He is a co-author of USB MIDI 1.0, the principal architect of USB MIDI 2.0, and Chair of the MIDI Working Group of the USB Implementers Forum.  


ADC 2022 OS Support for MIDI 2.0


The MIDI Association at ADC 2022

The MIDI Association will present two very important sessions for developers at ADC 2022, described below.

Those sessions are important because, since adopting the core MIDI 2.0 specifications in January of 2020, we have been working hard to develop the necessary infrastructure and then prototype those MIDI 2.0 specs. 

During that process we discussed and agreed on some significant improvements to the core specifications that are in the final process of review and voting by the MIDI Association and AMEI. 

Perhaps the most obvious example is that one of the three P’s, Profile Configuration, Property Exchange, and Protocol Negotiation, will change. 

We plan to deprecate Profile Negotiation via MIDI-CI in favor of a simpler method accomplished by a new UMP message. 

But there are still three P’s because we created a new P called Process Inquiry. 

As the MIDI Association, we share with AMEI the awesome responsibility of advancing MIDI while maintaining compatibility with MIDI products made all the way back in 1983. So we are sure that everyone appreciates that we need to take the time to test things thoroughly before releasing MIDI 2.0 products. 

We will be sharing an overview of those changes both at ADC and in this core MIDI 2.0 article. 


...

Details about MIDI 2.0™, MIDI-CI, Profiles and Property Exchange –  

The core MIDI 2.0 Specifications are available for download by MIDI Association Members. You must be logged in as a TMA member to download the spec. 


Tuesday, November 15 • 3:00pm – 3:50pm GMT

     Florian Bomers


MIDI 2.0 extends MIDI in many ways: more channels, higher resolution, jitter reduction, auto-configuration via bidirectional transactions.
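The new packet format behind those extensions can be made concrete with a short sketch. Assuming the published UMP field layout for 64-bit MIDI 2.0 Channel Voice messages (message type 0x4, then group, status/channel, note, and attribute fields, with a 16-bit velocity in the second word), a note-on could be packed as below; the helper name is illustrative, not from any particular API:

```python
# Pack a MIDI 2.0 Channel Voice note-on into a Universal MIDI Packet (UMP).
# Sketch only: the field layout follows the UMP format (message type 0x4 =
# 64-bit MIDI 2.0 Channel Voice), but the function name is hypothetical.

def midi2_note_on(group: int, channel: int, note: int, velocity16: int):
    """Return the two 32-bit words of a MIDI 2.0 note-on UMP."""
    word0 = ((0x4 << 28)                         # message type: MIDI 2.0 Channel Voice
             | ((group & 0xF) << 24)             # UMP group (0-15)
             | ((0x90 | (channel & 0xF)) << 16)  # note-on status + channel
             | ((note & 0x7F) << 8))             # note number; attribute type = 0
    word1 = (velocity16 & 0xFFFF) << 16          # 16-bit velocity; attribute data = 0
    return word0, word1

w0, w1 = midi2_note_on(group=0, channel=0, note=60, velocity16=0xFFFF)
print(f"{w0:08X} {w1:08X}")  # 40903C00 FFFF0000
```

For comparison, a MIDI 1.0 note-on for the same note is just three bytes (90 3C 7F); the extra UMP word is what buys the 16-bit velocity and per-note attribute fields.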

Core members of the MIDI Association present an overview of the available tools for developing, debugging, and deploying MIDI 2.0 products.

A number of tools have been developed to jump-start prototyping and validation of UMP functions and fuel the transition to MIDI 2.0. These tools include software applications for implementing and debugging UMP software and hardware, and testing MIDI-CI implementations. All tools will be shown in action and basic usage will be explained.

Moderated by Florian Bomers (MIDI 2.0 Ethernet Transport WG Chair) and other MIDI Association members


Wednesday, November 16 • 9:00am – 9:50am 

Engineers from Apple, Google, and Microsoft will present the current state of MIDI 2.0 implementations in their operating systems. We’ll describe the API changes required for MIDI 2.0 for each platform as well as discuss the philosophy and reasoning behind various design decisions. We’ll also present the status of transports, such as USB and Ethernet. If you’re a developer who is interested in the practical implementations of MIDI 2.0, this is the session for you.


Moderated by Mike Kent (MIDI 2.0 WG Chair) and other MIDI Association members including Apple, Google and Microsoft

Mike Kent

Mike Kent Chief Strategy Officer, AmeNote Inc.

Mike Kent is the Co-Founder and Chief Strategy Officer of AmeNote Inc. Mike is a world leader in technology for musical instruments and professional audio/video. Mike is the Chair of the MIDI 2.0 Working Group of the MIDI Association. He is a co-author of USB MIDI 1.0, the principal architect of USB MIDI 2.0, and Chair of the MIDI Working Group of the USB Implementers Forum.

Pete Brown

Pete Brown Principal Software Engineer, Windows, Microsoft

Pete works in the Windows + Devices org in Microsoft, primarily focusing on partners, apps, and technology for musicians. He’s the lead for the Windows MIDI Services project which is bringing an updated MIDI stack to Windows, and adding full MIDI 2.0 support. 

He also serves as the current Chair of the Executive Board of the MIDI Association.

When not working, he enjoys synthesizers, electronics, woodworking, astrophotography, 3D printing, and CNC projects at his home on the east coast of the US.

Phil Burk

Phil Burk Staff Software Engineer, Google Inc


Music and audio software developer. Interested in compositional tools and techniques, synthesis, and real-time performance on Android. Worked on HMSL, JForth, 3DO, PortAudio, JSyn, WebDrum, ListenUp, Sony PS3, Syntona, ME3000, Android MIDI, AAudio, Oboe and MIDI 2.0.

Torrey Holbrook Walker

Torrey Holbrook Walker Audio/MIDI Framework Engineer, Apple Inc.

I am a senior software framework engineer on the Core Audio team at Apple and a frequent MIDI specification contributor and prototyper with the MIDI Association. I have been passionate about creating music production technologies that delight audio software developers, musicians, and music producers over my 16-year career with Apple.

You should talk to me about:
– MIDI 2.0 software implementations or USB MIDI 2.0 hardware support.
– CoreMIDI APIs, USB MIDI, or BLE MIDI.
– The best food in London.
– Analog high-end audio and vinyl.
– Autechre, Aphex Twin, Squarepusher, and any wonky music with a shedload of bass.


What’s happening with MIDI 2.0? 

The MIDI Association adopted MIDI 2.0 at the January 2020 Winter NAMM show. 

We had worked for over four years developing and editing those core specifications. 

Then a few weeks after the 2020 NAMM show a global pandemic struck and affected everything. 

How did the pandemic affect MIDI 2.0? The pandemic was as disruptive to MIDI 2.0 as it was to everything else.  

Although sales of MIDI products of all types actually increased significantly during the pandemic, as people stuck in their homes took refuge and solace in playing music, overall the pandemic slowed MIDI 2.0 progress. Because of supply chain issues, many companies were forced to redesign current products, and planned products on their roadmaps, to use different microprocessors. So instead of working on new things, they were redesigning current models.

The MIDI Association has been meeting virtually for several years on the MIDIable collaboration platform developed by Lawrence Levine (one of our Executive Board members), but the pandemic made it impossible to hold face-to-face meetings or plugfests. 

The MIDI Association faced a fundamental problem: all the work we had done on developing the initial MIDI 2.0 specs had been done strictly from a theoretical point of view.

No transports existed that could carry the Universal MIDI Packet (UMP), no operating systems supported UMP, and no devices supported either UMP or MIDI-CI. 

Look at the answers Mike Kent and Pete Brown recently posted on a popular blog site. 

How do you prototype something that doesn’t exist? 

So how do you prototype something that doesn’t exist? 

We needed 4 things to prototype MIDI 2.0 completely. 

  • Transports that support the Universal MIDI Packet (UMP)
  • Hardware Devices that support MIDI-CI specifications over current MIDI 1.0 transports
  • Operating Systems that support both UMP and have MIDI 2.0 APIs
  • Hardware Devices that support UMP and MIDI-CI

USB MIDI 2.0 -the first UMP Transport

The first step was to create a transport specification that could carry the Universal MIDI Packet, because even after a transport spec is created and adopted, operating systems still have to develop class-compliant drivers, and driver development takes time. 

USB Implementers Forum, Inc. (USB-IF) is a non-profit corporation founded by the group of companies that developed the Universal Serial Bus specification. The USB-IF has its own set of rules, IP policies, and bylaws. Fortunately, there was a group of MIDI Association companies who were also USB-IF members. So a number of companies, including Apple, Focusrite, Native Instruments, Roland, and Yamaha, worked in the USB-IF to develop the USB Class Definition for MIDI Devices v2.0. 

In June of 2020, the USB Class Definition for MIDI Devices v2.0 was adopted by the USB-IF, which meant operating systems could start developing class-compliant drivers and APIs. 


...

MIDI 2.0 Progress Continues with Updated USB Specification –  

As computers have become central components in many MIDI systems, USB has become the most widely used protocol for transporting MIDI data. With the introduction of MIDI 2.0, the USB Implementers Forum’s USB MIDI 2.0 working group, headed by members o

The Groovesizer TB2 – The First Open Source MIDI 2.0 Synth 

TB2 Groovesizer Open Source Synth

Groovesizers are kit-built DIY sequencers and synths. They’re open-source instruments, which means firmware for the Groovesizers can be freely examined, shared and changed in the beginner-friendly Arduino IDE.

by Groovesizer

Andrew Mee, head of the MIDI 2.0 Prototyping Working Group, found an interesting open source synthesizer, the TB2 Groovesizer. 

With the Quartet firmware, the TB2 is a 4 voice paraphonic wavetable synth shield for the Arduino Due development board. The TB2 features 2 oscillators per voice, an ADSR envelope, LFO, digital filter, arpeggiator, as well as a 16-step sequencer.

For sound generation, the TB2 makes use of the pair of 12-bit DACs built into the Arduino Due’s 32-bit ARM processor. The TB2 uses an SD card for storing patches and sequences, and it also allows the user to load single-cycle waveshapes for the two oscillators and the LFO. 

The MIDI Association reached out to Jean Marais, a South African living in Taiwan. Jean graciously agreed to make some TB2s specifically for the MIDI Association, and we sent TB2s out to some of our members so they could start prototyping MIDI 2.0. 

The Groovesizer only has a 5-pin DIN MIDI In and a 5-pin DIN MIDI Out. So how can it be a MIDI 2.0 device? The specifications define a MIDI 2.0 device as one that can establish bidirectional MIDI connectivity, allowing two devices to connect to each other, query each other’s capabilities (MIDI Capability Inquiry), and auto-configure themselves. 

So MIDI-CI Profile Configuration and MIDI-CI Property Exchange can be done over any MIDI 1.0 transport, because device discovery is done using MIDI 1.0 Universal SysEx messages. 
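Because discovery rides on ordinary Universal SysEx, even a 5-pin DIN device can take part. As a hedged sketch (the function and constant names are ours, and the trailing Discovery payload fields such as manufacturer ID, capability bits, and maximum SysEx size are omitted), the header of a MIDI-CI Discovery message could be assembled like this:

```python
# Assemble the header of a MIDI-CI Discovery message, carried as a
# MIDI 1.0 Universal Non-Real-Time SysEx. Sketch only: the payload fields
# after the destination MUID are omitted, and 0x02 is an assumed
# MIDI-CI version byte.

def muid_to_7bit(muid: int) -> list:
    """Encode a 28-bit MUID as four 7-bit bytes, least significant first."""
    return [(muid >> shift) & 0x7F for shift in (0, 7, 14, 21)]

BROADCAST_MUID = 0x0FFFFFFF  # "to all devices" destination

def ci_discovery_header(source_muid: int, ci_version: int = 0x02) -> list:
    return ([0xF0,        # SysEx start
             0x7E,        # Universal Non-Real-Time
             0x7F,        # device ID: to/from the whole MIDI port
             0x0D,        # sub-ID#1: MIDI-CI
             0x70,        # sub-ID#2: Discovery
             ci_version]
            + muid_to_7bit(source_muid)       # who is asking
            + muid_to_7bit(BROADCAST_MUID))   # broadcast destination

print([f"{b:02X}" for b in ci_discovery_header(0x1234567)])
```

Every byte after 0xF0 stays in the 7-bit range, which is exactly why this handshake works over a 1983-style DIN cable.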


MIDI 2.0 Operating System Support 

Apple, Google and Microsoft are all MIDI Association members. 

In fact, Torrey Holbrook Walker from Apple and Phil Burk from Google are both former MIDI Association Executive Board Members and Pete Brown from Microsoft is the current Chair of the Executive Board. 

We recently put out a press release for AMEI (Association of Musical Electronics Industries), our partner in Japan in developing MIDI specifications, announcing AMEI’s plans to support an open source driver for Microsoft Windows and reviewing the current situation with Apple, Google, and Linux.

The Association of Musical Electronics Industries (AMEI), the organization that oversees the MIDI specification in Japan, has committed to funding the development of an open-source USB MIDI 2.0 Host Driver for Windows Operating Systems under a memorandum of understanding between AMEI, AmeNote Inc, and Microsoft.

MIDI 2.0 is a global industry-wide effort. The MIDI Association (TMA), is the organization that oversees the MIDI specification in all other areas of the world besides Japan. TMA recently funded AmeNote’s development of the ProtoZOA, a USB MIDI 2.0 prototyping board that software developers can use to test with their MIDI 2.0 applications.

AmeNote plans to release large parts of the ProtoZOA firmware as open source code so that all hardware developers can utilize that code and incorporate it into their own MIDI 2.0 devices.

TMA members Apple and Google have already announced and released their support for MIDI 2.0. A number of AMEI members and TMA members have developed MIDI 2.0 compliance and testing tools that they plan to release for free public download on GitHub.

AMEI and TMA have also recently engaged with members of the ALSA community about the development of open-source drivers and APIs for the Linux platform. These new developments regarding Microsoft and Linux signal a further step in the development of the MIDI 2.0 ecosystem. They also highlight the continuing cooperative efforts of AMEI and TMA to work together to provide free MIDI 2.0 resources (tools and applications) and open source code to make the development of MIDI 2.0 products easier for companies of all sizes.

by The MIDI Association (on behalf of AMEI)

 Apple MIDI 2.0 Support

Google MIDI 2.0 Support

 Windows MIDI 2.0 Support


...

AMEI to Fund Open Source MIDI 2.0 Driver for Windows –  

November 1, 2022 – The Association of Musical Electronics Industries (AMEI), the organization that oversees the MIDI specification in Japan, has committed to funding the development of an open-source USB MIDI 2.0 Host Driver for Windows Operating Sys


ProtoZOA – The first USB MIDI 2.0 device 

To accelerate MIDI 2.0 development, the MIDI Association has helped fund ProtoZOA’s technical development and donated ProtoZOAs and TB2 Groovesizers at no charge to any MIDI Association member who wanted to join the prototyping effort.

These tools work together for prototyping and testing foundational MIDI 2.0 features such as the new Universal MIDI Packet, MIDI-CI Discovery, Profile Configuration, Property Exchange, USB MIDI 2.0, and MIDI 1.0 to 2.0 Translation.
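One of those foundational features, MIDI 1.0 to 2.0 Translation, can be sketched for a single field. The snippet below shows an illustrative way to widen a 7-bit MIDI 1.0 velocity to the 16-bit MIDI 2.0 range by repeating the bit pattern into the vacated low bits, so that full scale maps to full scale; the normative rules live in the UMP translation specification, so treat this as a sketch only:

```python
# Upscale a 7-bit MIDI 1.0 value to 16 bits by bit repetition.
# Illustrative only; consult the UMP translation rules for the
# normative algorithm.

def upscale_7_to_16(v: int) -> int:
    v &= 0x7F
    wide = v << 9  # place the 7 significant bits at the top of 16 bits
    # Repeat the pattern into the low bits so 0x7F becomes 0xFFFF, not 0xFE00.
    return (wide | (wide >> 7) | (wide >> 14)) & 0xFFFF

print(hex(upscale_7_to_16(0x7F)))  # 0xffff: full scale maps to full scale
print(hex(upscale_7_to_16(0x00)))  # 0x0: zero stays zero
```

A plain left shift would leave the top of the 16-bit range unreachable; the repetition step is what makes the mapping endpoint-exact in both directions.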

With these advances, companies around the world now have the software and hardware tools needed to build, test, and ensure the compatibility of their MIDI 2.0 products.

AmeNote developed the ProtoZOA using Raspberry Pi Pico boards because they are openly accessible and extremely affordable. 

ProtoZOA is a USB MIDI 2.0 device that software developers can use to test their MIDI 2.0 applications, and its firmware provides source code that hardware developers can incorporate into their own MIDI 2.0 devices. MIDI Association members are currently helping to test and optimize the ProtoZOA code. 


...

The MIDI Association Announces MIDI 2.0 Development Tools –  

MIDI Association announces a global initiative by over 50 MIDI Association companies to prototype MIDI 2.0.


 Come see us at Audio Developer Conference 2022!

Jean-Baptiste Thiebaut, MIDI Association Exec Board

If you are attending ADC in person, lots of MIDI Association members will be there. In addition to the presenters listed above, we arranged to have Jean-Baptiste (JB) Thiebaut (MIDI Association Exec Board member and co-founder of the MIDI Innovation Awards) attend. JB is now CEO of Music Hackspace, but when he worked for ROLI he actually ran the Audio Developer Conference for many years. Just look for the MIDI Association poster in the poster area, and JB will be happy to give you details about why now is a great time to join the MIDI Association. 

If you are attending ADC virtually, Athan Billias, MIDI Association President, will also be attending virtually. Here is an image of his booth schedule for ADC; it also gives you a peek at what a typical week for the MIDI Association looks like. 

 As you can see, we are working hard every week to bring the promise of MIDI 2.0 to the world. 

Times on the far left are London time and next to them is Pacific.

AMEI to Fund Open Source MIDI 2.0 Driver for Windows

November 1, 2022 –
The Association of Musical Electronics Industries (AMEI), the organization that oversees the MIDI specification in Japan, has committed to funding the development of an open-source USB MIDI 2.0 Host Driver for Windows Operating Systems under a memorandum of understanding between AMEI, AmeNote Inc, and Microsoft.

AMEI is underwriting the cost and has engaged AmeNote Inc. to develop the driver because of AmeNote’s extensive experience in MIDI 2.0 and USB development. In addition, concurrent to this, Microsoft has also agreed to start development of a Windows standard open-source MIDI 2.0 API. The driver and API will be developed in accordance with Microsoft’s quality control standards, and will be managed as a permissively licensed (MIT license) Microsoft open-source project. As a result, anyone can participate in the development as an open-source contributor in the future, or use the code in their own devices or operating systems. Because of this open source arrangement, continuous and timely improvements and enhancements to the USB MIDI 2.0 Host driver and MIDI 2.0 API are expected. Development is currently underway with the goal of completing the development in 2023.

MIDI 2.0 is a global industry-wide effort. The MIDI Association (TMA), is the organization that oversees the MIDI specification in all other areas of the world besides Japan. TMA recently funded AmeNote’s development of the ProtoZOA, a USB MIDI 2.0 prototyping board that software developers can use to test with their MIDI 2.0 applications.

AmeNote plans to release large parts of the ProtoZOA firmware as open source code so that all hardware developers can utilize that code and incorporate it into their own MIDI 2.0 devices.

TMA members Apple and Google have already announced and released their support for MIDI 2.0. A number of AMEI members and TMA members have developed MIDI 2.0 compliance and testing tools that they plan to release for free public download on GitHub.

AMEI and TMA have also recently engaged with members of the ALSA community about the development of open-source drivers and APIs for the Linux platform. These new developments regarding Microsoft and Linux signal a further step in the development of the MIDI 2.0 ecosystem. They also highlight the continuing cooperative efforts of AMEI and TMA to work together to provide free MIDI 2.0 resources (tools and applications) and open source code to make the development of MIDI 2.0 products easier for companies of all sizes.

About MIDI 2.0:
AMEI and TMA released the MIDI 2.0 Core Standards on February 25, 2020. Subsequently, in June 2020, the USB-IF, a management organization for USB technology, released the USB MIDI 2.0 standard, a communication standard for connecting MIDI 2.0 devices.

Currently, AMEI and TMA are working together on joint prototyping and some major updates to the core MIDI specifications in order to bring MIDI 2.0 devices to market as soon as possible.


The MIDI Association Announces MIDI 2.0 Development Tools

The MIDI Association announces a global initiative by over 50 MIDI Association companies to prototype MIDI 2.0. 

The MIDI Association has released details of the ongoing, industry-wide initiative by over 50 MIDI Association companies to develop MIDI 2.0 products and services. Despite the challenges presented by the pandemic, MIDI Association members persevered, and have addressed these challenges through global collaboration and cooperation.

Yamaha Corporation has funded the development of the MIDI Workbench, a software tool for MIDI 2.0 testing and compliance developed by Australian Andrew Mee.

MIDI Workbench MIDI 2.0 testing and compliance tool

The MIDI Workbench is a standalone Electron application providing a complete MIDI 2.0 environment. The workbench uses UMP internally and translates MIDI 1.0 (including MPE) to and from MIDI 2.0. 

Mee has also updated the firmware for the TB2 Groovesizer, an open source MIDI 2.0 hardware synthesizer developed by Jean Marais, a South African living in Taiwan.

With the Quartet firmware, the TB2 is a 4 voice paraphonic wavetable synth shield for the Arduino Due development board.

The TB2 features 2 oscillators per voice, an ADSR envelope, LFO, digital filter, arpeggiator, as well as a 16-step sequencer. For sound generation, the TB2 makes use of the pair of 12-bit DACs built into the Arduino Due’s 32-bit ARM processor.

The TB2 uses an SD card for storing patches and sequences, and it also allows the user to load single-cycle waveshapes for the two oscillators and the LFO. The Groovesizer only has a 5-pin DIN MIDI In and a 5-pin DIN MIDI Out. 

So how can it be a MIDI 2.0 device? The specifications define a MIDI 2.0 device as one that can establish bidirectional MIDI connectivity, allowing two devices to connect to each other, query each other’s capabilities (MIDI Capability Inquiry), and auto-configure themselves. So MIDI-CI Profile Configuration and MIDI-CI Property Exchange can be done over any MIDI 1.0 transport, because device discovery is done using MIDI 1.0 Universal SysEx messages. 

AmeNote, a Canadian company founded by industry veterans Mike Kent (Chair of the MIDI 2.0 Working Group) and Michael Loh (founder of iConnectivity), has designed ProtoZOA, a flexible Raspberry Pi Pico-based prototyping tool for MIDI 2.0.

To accelerate MIDI 2.0 development, the MIDI Association has helped fund ProtoZOA’s technical development and donated ProtoZOAs and TB2 Groovesizers at no charge to any MIDI Association member who wanted to join the prototyping effort.

These tools work together for prototyping and testing foundational MIDI 2.0 features such as the new Universal MIDI Packet, MIDI-CI Discovery, Profile Configuration, Property Exchange, USB MIDI 2.0, and MIDI 1.0 to 2.0 Translation.

With these advances, companies around the world now have the software and hardware tools needed to build, test, and ensure the compatibility of their MIDI 2.0 products.

AmeNote developed the ProtoZOA using Raspberry Pi Pico boards because they are openly accessible and extremely affordable.

ProtoZOA is a USB MIDI 2.0 device that software developers can use to test their MIDI 2.0 applications, and its firmware provides source code that hardware developers can incorporate into their own MIDI 2.0 devices. MIDI Association members are currently helping to test and optimize the ProtoZOA code.

“Our plan is to release most of the ProtoZOA source code as Open Source with a permissive license. That will allow even non-MIDI Association members to use the code to develop MIDI 2.0 products.”

by Mike Kent from AmeNote and Chair of the MIDI 2.0 Working Group

With a hardware and software foundation for MIDI 2.0 development in place, companies can move forward to take advantage of the protocol’s advanced features—as well as maintain full backward compatibility with existing MIDI 1.0 gear.

“These MIDI 2.0 prototyping efforts highlight what is truly unique about the MIDI Association. 

Since MIDI’s genesis, companies that are traditionally fierce competitors have chosen cooperation as the best way to solve difficult problems—and to contribute those solutions at no charge back to the global community, for the benefit of musicians, artists, and engineers everywhere.”

by MIDI Association president, Athan Billias.


Media Contact:

Daniel Liston Keller

Get It In Writing

daniel@getitinwriting.net 

The MIDI Association Announces MIDI 2.0 Over A2B™

Los Angeles, CA – June 1, 2022.

The MIDI Association announced today the addition of MIDI 2.0 capabilities to the Automotive Audio Bus (A2B®) from Analog Devices, Inc. The technology was showcased at the MIDI Association booth #10300 at the NAMM show in Anaheim, California on June 3-5, 2022.

 A2B® is a high bandwidth, bidirectional, digital audio bus capable of distributing 32 channels of audio and MIDI control data together with clock and power over a single, unshielded twisted-pair wire. The technology enables the development of advanced, feature-rich and cost-effective audio and MIDI LAN systems in a variety of musical instrument and pro audio applications.

“We see many opportunities for the extension of Analog Device’s A2B® into musical instrument and pro audio applications, particularly for guitar effects, electronic drums, digital keyboards and small format audio mixers,” said David Dashefsky, Director for Strategic Marketing and Systems in the Consumer Business Unit at Analog Devices. “MIDI 2.0 and ADI’s A2B® digital audio bus now allow your whole band to connect together with multi-channel digital audio over low-cost cables, or inexpensively connect modular systems like guitar pedals or electronic drum kits”

MIDI (Musical Instrument Digital Interface) is the industry standard developed to connect digital musical instruments to each other, as well as to computers, tablets and smartphones. MIDI 2.0 is a significant upgrade to the original MIDI 1.0 specification adopted in 1983 and makes MIDI more expressive and easier to use by changing MIDI from a monolog to a dialog and allowing MIDI devices to discover shared capabilities and auto configure themselves. A2B® will support both MIDI 1.0 and MIDI 2.0 devices and provide backward compatibility/translation where necessary.

“Combining the multi-channel audio networking capabilities of the Analog Devices A2B® with MIDI’s expressive musical control creates a brand-new technology platform for the musical instrument and pro audio industry,” said Athan Billias, President of the MIDI Association. “This inexpensive platform to connect multiple digital instruments together is a big boon to designers of musical instrument and pro audio applications.”

At the June 2022 NAMM show in Anaheim, the MIDI Association showcased the A2B technology using the ProtoZOA USB MIDI 2.0 prototyping and testing tool developed by Amenote Inc.

The Amenote ProtoZOA USB MIDI 2.0 Prototyping Board connected to an Analog Devices A2B module at June NAMM 2022


A lot of FX pedals are digital these days, yet each pedal has an analog-to-digital converter on one side that converts the analog signal into digital; the signal gets processed, and then on the other side of the pedal a digital-to-analog converter converts the signal back to analog again. Digital keyboards, electronic drums and almost all digital musical instruments work the same way. A2B® with MIDI 2.0 eliminates the need for all that A/D and D/A conversion by providing a low-cost digital audio network with a MIDI control protocol. It solves a fundamental problem that has existed since the early 1980s, when digital musical instruments and FX first became possible. 

by Athan Billias, President of the MIDI Association

Athan Billias, Pat Scandalis, Mike Kent and Pravin Angolkar discuss MIDI 2.0 over A2B at A3E



Dolby featured a car with Dolby Atmos in their NAMM booth. The worlds of Autos and Audio are truly colliding.
Autonomous vehicles will make it possible to have recording studios on wheels



The MIDI Association Announces the v1.1 Update to the MIDI Polyphonic Expression (MPE) Specification

Los Angeles CA, May 1, 2022 – The MIDI Association announces the release of an update (v1.1) to the MIDI Polyphonic Expression (MPE) (v1.0, January 2018) specification.

In March of 2021, the MIDI Association’s MPE Working Group reconvened to evaluate the current MPE specification and create an editorial revision. The working group has gone over the MPE v1.0 specification with the goal of making it clearer and easier to adopt. No technical design changes were made while every point of confusion or misinterpretation we’ve encountered over the past few years has been addressed. All the implementation rules are now gathered together into section 2 “Detailed Specification”. Further, the best practices and appendices have been elaborated to provide greater detail and examples.

The next step for the MPE Subcommittee is to create a MIDI 2.0 MIDI-CI Profile for MPE, taking advantage of MIDI-CI to enable auto-configuration between MPE senders and receivers, with a goal to complete this work in 2022.

MIDI Polyphonic Expression (MPE) makes it possible for artists to perform independent gestures for each musical note, with up to three dimensions of expression. Without MPE, expressive gestures on synthesizers—such as pitch bending or adding vibrato—affected all notes being played. With MPE, every note a musician plays can be articulated individually for much greater expressiveness. At the present time over one hundred DAWs, Synthesizers and Controllers support MPE.
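The channel-per-note idea at the heart of MPE can be sketched in a few lines. The allocator below is illustrative, not code from the specification: it assumes a lower zone with master channel 1 and member channels 2 through 8, and the class and method names are hypothetical.

```python
# Illustrative MPE note-to-channel allocator. Each sounding note gets its
# own member channel, so per-channel messages (pitch bend, channel
# pressure, CC74) become per-note gestures. Assumes a lower zone with
# master channel 1 and member channels 2-8; names are hypothetical.

class MpeLowerZone:
    def __init__(self, member_channels=range(2, 9)):
        self.free = list(member_channels)  # channels available for new notes
        self.active = {}                   # note number -> assigned channel

    def note_on(self, note: int) -> int:
        ch = self.free.pop(0)              # simplest policy: first free channel
        self.active[note] = ch
        return ch

    def note_off(self, note: int) -> int:
        ch = self.active.pop(note)
        self.free.append(ch)               # channel becomes reusable
        return ch

zone = MpeLowerZone()
print(zone.note_on(60), zone.note_on(64))  # 2 3  (each note on its own channel)
```

A real implementation also has to handle running out of member channels (note stealing or channel sharing) and sends each note’s pitch bend on its assigned member channel rather than the zone master, which is exactly the behavior the v1.1 clarifications pin down.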

The members of the MPE Working Group include:

Gregory Pat Scandalis – moForte Inc

Athan Billias – MIDI Association-President/ MIDI2Marketing LLC

Mike Kent – AmeNote Inc

Geert Bevin – Moog Music Inc

Eric Bateman – Keith McMillen Instruments 

Andrew Mee – Yamaha Consultant

Jonathan Keller – Zivix

The working group also consulted with a variety of other hardware manufacturers, plugin developers and DAW vendors to make sure their input was reflected in the updated specification.

The new specification is already available for download on the MIDI.org website.

https://www.midi.org/specifications/midi1-specifications/mpe-midi-polyphonic-expression 



Music Tectonics Podcast 2.5 Billion MIDI Devices: How MIDI is Powering the World

From the fountains at the Bellagio to Beethoven’s 10th Symphony, MIDI has been powering live music and immersive experiences for nearly 40 years. In this week’s episode, Tristra Newyear Yeager sits down with Athan Billias and Brett Porter from the MIDI association to explore the latest innovations in MIDI 2.0. 


Learn more about the MIDI Association, a cross industry coalition of MIDI makers and innovators working to bring the MIDI protocol to over 2.5 billion devices around the world. Discover how artists, producers, DJs, educators, and more are using the Musical Instrument Digital Interface, or MIDI, in new ways. Find out how the MIDI Association is partnering with companies like Artiphon to create Orbacam, the latest MIDI powered app allowing users to create music directly in videos. What’s next for MIDI? Find out in this week’s episode!


1:33 What is MIDI?

5:07 The MIDI Association

8:14 MIDI 2.0

11:30 4th Annual Music Tectonics Conference Early Bird Tickets

13:07 More MIDI!

21:21 New MIDI products

29:10 Integrating music experiences with MIDI

36:30 Let’s get sci-fi 


Click the link below to listen to the podcast 


...

2.5 Billion MIDI Devices: How MIDI is Powering the World with Athan Billias and Brett Porter

From the fountains at the Bellagio to Beethoven’s 10th Symphony, MIDI has been powering live music and immersive experiences for nearly 40 years. In this week’s episode, Tristra Newyear Yeager sits down with Athan Billias and Brett Porter from the MIDI association to explore the latest innovations in MIDI 2.0. Learn more about the MIDI Association, a cross industry coalition of MIDI makers and innovators working to bring the MIDI protocol to over 2.5 billion devices around the world. Discover ho


AES Inside Track features MIDI 2.0 discussion with Jennifer Hruska, Rick Cohen, Mike Kent, and Dave Smith

MIDI 2.0 — What’s New and How it Will Affect the Way We Work 

Join Jennifer Hruska as she moderates a discussion with leading experts about this major update to the MIDI specification. The original specification was released in 1983 and for 35 years remained at version 1.0. It has literally revolutionised the way in which electronic instruments and other devices are controlled. The new revision is completely backward compatible and codifies many of the technical developments manufacturers have introduced in recent years. This session provides an overview of MIDI 2.0 and examines the implications for users and the industry. 





MIDI 2.0 logo in Dezeen’s top ten rebrands of 2021

Dezeen names its top rebrands of 2021 and lists Facebook’s Meta, Volvo, and the White House along with MIDI 2.0

Dezeen is the world’s most popular and influential architecture, interiors and design magazine, with over three million monthly readers and six million social media followers.

Pentagram makes this list again with its brand identity for Musical Instrument Digital Interface (MIDI), the global standard that allows digital musical instruments to talk to each other.

Yuri Suzuki, a musician and Pentagram partner, worked with the graphic designer and partner Sascha Lobe on the update. The logo, which looks like an abstract letter M, replaces its previous wordmark and follows the 2020 release of MIDI 2.0, the first major update of the standard in over 35 years.

by Dezeen




For more information, check out the full articles at the links below. 



Dezeen’s top 10 rebrands of 2021

As part of our 2021 review, we look back on a year of rebrands including Facebook changing its name to Meta and Volvo adopting a flat logo.



Pentagram designs “future-proof” sonic logo for digital music standard MIDI

MIDI, the global standard that allows digital musical instruments to communicate, has a new brand identity based on sound waves designed by Pentagram.


Geert Bevin discusses MPE and MIDI 2.0 on the Gaz Williams Show

Geert was instrumental in getting MPE standardized 

Geert currently works at Moog as a product manager and software engineer, but he was at the center of the MPE revolution and helped develop firmware for several key MPE products, including the Roger Linn LinnStrument and the Eigenharp. The whole Gaz Williams show is 2 hours, but we jumped right to the section where Geert gives a concise explanation of MPE and then talks about the current MPE work we are doing in the MIDI Association.



Pat Scandalis from Geoshred heads up the current MIDI Association MPE working group

The MPE working group is making editorial revisions to the current MPE spec to make it easier to understand and implement. It will then add MIDI-CI Profile capabilities to that spec to ensure greater interoperability between MPE devices.

As Geert said, in many ways MPE is the gateway to MIDI 2.0.


MIDI 2.0 Developers Webinar: Saturday, May 22, 10 am Pacific

Join members of the MIDI Association Technical Standards Board and MIDI 2.0 working group chairs for an in-depth look at all the MIDI 2.0 specifications under development.

As you can see from the table above, there is a lot of technical standards activity in The MIDI Association’s working groups.

The MIDI 2.0 Working Group handles all the Profiles, Property Exchange and Protocol Message discussions. We have started work on a completely new branch of MIDI 2.0 (adding to the “Three P’s” of Profile Configuration, Property Exchange and Protocol Negotiation) which is currently called MIDI Implementation Discovery. This new branch of MIDI 2.0 has a lot of exciting possibilities.

There is a separate Transport Working Group which is currently focused on developing a network transport to add to the work already completed on USB.

The SMF2 Working Group is focused on developing a file format that includes MIDI 1.0, MIDI 2.0, notation, audio and other media in a single container. 

Some of the Profiles Working Groups are self-explanatory, like Drawbar Organ, Piano, Rotary Speaker, etc.

One that may not be so apparent is the Orchestral Articulation Profile. This is a new design that uses the Attribute Type and Attribute Data fields in the MIDI 2.0 Note On and Note Off messages to put articulation data directly into the MIDI Notes. There are many great orchestral libraries available, but they all have their own ways of selecting articulations, so there is no standardization.

This new Orchestral Articulation Profile would allow different libraries to be used together easily by creating a clear, standardized table for articulations and including that information directly in a MIDI 2.0 Note On or Note Off message. It would also allow writing a part for one instrument and easily switching to another while maintaining all the musically appropriate articulations.




Synth and Software’s MIDI 2.0 interview

Synth and Software’s Nick Batzdorf interviews MIDI Association Executive Board member Athan Billias about MIDI 2.0, Orchestral Articulations and more!


Click on the Google Calendar image below to add the MIDI 2.0 Developer meeting on May 22 to your Google calendar.


Getting started with MIDI 2.0 development – Live workshop series with the Music Hackspace

This event is organised by the Music Hackspace.

Date & Time: Mondays 7th / 14th / 21st / 28th June 6pm UK / 7pm Berlin / 10am LA / 1pm NYC. Series of 4 x 2-hour workshops.

Level: Intermediate. Some experience with C++ coding required; experience with JUCE recommended.

Attendees should have experience building and debugging applications using Xcode (macOS) and Visual Studio (Windows).

Requirements:

  • A computer and internet connection
  • A webcam and mic
  • A Zoom account
  • Xcode (macOS)/Visual Studio (Windows)

Who is this workshop for:

Developers wanting to learn how MIDI 2.0 works under the hood, and how to get started writing software for it right away

Overview of what participants will learn:

This workshop will provide developers with knowledge and code for starting MIDI 2.0 development. At first, the concepts of MIDI 2.0 are explained. Then, the participants will co-develop a first implementation of a MIDI-CI parser for robust device discovery, and for querying and offering profiles. For that, a stub workspace will be provided. Exercises will let the participants practice the newly learned concepts. Last, but not least, this workshop also includes automated testing as a tool to verify the implementation.

Session 1: Overview of MIDI 2, concepts

Session 2: Workspace setup, Basic MIDI 2.0 Discovery

Session 3: Advanced MIDI 2.0 discovery and tests

Session 4: Implementing Profiles. Outlook: PE and UMP.

At the end of the workshop series, the participants will:

  • Know the core concepts of MIDI 2.0
  • Understand the MIDI 2.0 discovery protocol
  • Be able to build products with MIDI 2.0 discovery
  • Be able to build products using MIDI 2.0 Profiles
  • Use an initial set of MIDI 2.0 unit tests

Topics:

  • MIDI 2.0 concepts
  • MIDI 2.0 Discovery
  • MIDI 2.0 Profiles
  • Unit Testing


Workshop leaders

Brett Porter

Lead Software developer, Artiphon

Brett is a member of the Executive Board of the MIDI Association and chair of the MIDI 2 Prototyping and Testing Working Group. He is based in the New York City area.


Florian Bömers
Founder, Bome Software

Florian has been an active MIDI 2.0 working group member since its inception. He serves on the Technical Standards Board of the MIDI Association and chairs the MIDI 2.0 Transports Working Group. He is based in Munich, Germany. 

Details about MIDI 2.0, MIDI-CI, Profiles and Property Exchange (Updated June 2023)

This article is for companies looking to develop MIDI 2.0 products, both software developers and hardware manufacturers. If you are a MIDI user looking for the benefits of MIDI 2.0, go to this article, which is a more general overview of MIDI 2.0 features. 


What Musicians & Artists need to know about MIDI 2.0

This article explains the benefits of MIDI 2.0 to people who use MIDI. If you are a MIDI developer looking for the technical details about MIDI 2.0, go to this article, updated to reflect the major updates published to the core MIDI 2.0 specs in June 2023.


https://www.midi.org/midi-articles/what-musicians-and-artists-need-to-know-about-midi-2-0


THE NEWEST CORE MIDI 2.0 SPECIFICATIONS ARE AVAILABLE FOR DOWNLOAD BY REGISTERING YOUR EMAIL ADDRESS ON THIS WEBSITE



Foundational specifications for MIDI 2.0 were released in February 2020, which allowed MIDI Association members, including Operating System companies to start actually implementing MIDI 2.0. 

Since that time the MIDI Association and its members have been developing tools for and prototyping the first MIDI 2.0  products. 

As a result, we have been improving and enhancing the specifications. 

The MIDI Association and AMEI adopted the following notable specifications on May 29, 2023 and these are now the specifications that should be used as references for developing MIDI products.

If you previously downloaded MIDI 2.0 specifications, please download the latest revisions.


CURRENT CORE MIDI 2.0 SPECIFICATIONS 

These core specifications define the architectural foundations of MIDI 2.0. The MIDI 2.0 Specification Overview defines the minimum requirements for devices to claim MIDI 2.0 compatibility and to apply to use the MIDI Association’s MIDI 2.0 logo.

  • M2-100-U MIDI 2.0 Specification Overview, Version 1.1 NEW
  • M2-101-UM MIDI Capability Inquiry (MIDI-CI), Version 1.2 NEW
  • M2-102-U Common Rules for MIDI-CI Profiles, Version 1.1 NEW
  • M2-103-UM Common Rules for Property Exchange, Version 1.1
  • M2-104-UM Universal MIDI Packet (UMP) Format and MIDI 2.0 Protocol, Version 1.1 NEW
  • M2-116-U MIDI Clip File (SMF2), Version 1.0 NEW

Documents labeled as NEW are notable updates released in June 2023.

Numerous other MIDI 2.0 specifications build on these core documents; they are available for download and listed below.


NEW MIDI 2.0 Specification Overview, Version 1.1

The MIDI 2.0 Specification Overview outlines the documents which define the core architecture of MIDI 2.0. It also defines the Minimum Compatibility Requirements for MIDI 2.0.

Summary of Minimum Compatibility Requirements for MIDI 2.0

(Updated June 2023)

Any Device which claims MIDI 2.0 compatibility shall implement either A or B, or both A and B:

A. MIDI-CI to at least its minimum requirements, including discovery mechanisms, plus any one or more of the following features:

  • One or more Profiles controllable by MIDI-CI Profile Configuration messages.
  • Any Property Data exchange by MIDI-CI Property Exchange messages.
  • Any Process Inquiry exchange by MIDI-CI Process Inquiry messages.

B. The UMP Data Format to at least its minimum requirements, including discovery mechanisms, plus any one or more of the following features:

  • MIDI 2.0 Channel Voice Messages as defined by the Universal MIDI Packet (UMP) Format and MIDI 2.0 Protocol.
  • Jitter Reduction Timestamps as defined by the Universal MIDI Packet (UMP) Format and MIDI 2.0 Protocol.
  • System Exclusive 8 as defined by the Universal MIDI Packet (UMP) Format and MIDI 2.0 Protocol.
  • Mixed Data Set as defined by the Universal MIDI Packet (UMP) Format and MIDI 2.0 Protocol.


NEW MIDI Capability Inquiry (MIDI-CI), Version 1.2

MIDI Capability Inquiry (MIDI-CI) is a set of bidirectional mechanisms which allow devices to negotiate with each other for autoconfiguration. MIDI-CI also allows the expansion of MIDI with new features while protecting backward compatibility with MIDI Devices that do not understand these newly defined features.

Latest Notable Changes:

Adds Process Inquiry. Deprecates Protocol Negotiation (replaced by the new Protocol Selection mechanisms in the Universal MIDI Packet (UMP) Format and MIDI 2.0 Protocol specification). Adds Function Blocks. Adds Profile Details Inquiry. Adds features for backward and forward compatibility through future update versions.


NEW Common Rules for MIDI-CI Profiles, Version 1.1

The Common Rules for Profiles complements MIDI-CI by defining a set of design rules for all Profile Specifications.

Latest Notable Changes:

Updates to align with the revised MIDI-CI and UMP Format & MIDI 2.0 Protocol specifications. Adds Profile Details Inquiry messages. Adds Profile Added Report and Profile Removed Report messages. Adds Function Block Profiles and makes resulting adjustments to Profile Addresses.


Common Rules for Property Exchange, Version 1.1

The Common Rules for MIDI-CI Property Exchange complements MIDI-CI by defining a set of design rules for all Property Exchange Resources and Properties.

Unchanged since 2020. 


NEW Universal MIDI Packet (UMP) Format and MIDI 2.0 Protocol, Version 1.1

The Universal MIDI Packet provides a standardized modern packet format for all MIDI messages, both MIDI 1.0 Protocol and MIDI 2.0 Protocol. This specification also defines messages in the MIDI 2.0 Protocol.

Latest Notable Changes:

  • Added the notion of Groupless messages. Some UMP Message Types are not sent to a specific Group. These messages are intended to be processed by the MIDI Endpoint.
  • Utility messages (Message Type 0x0) are now Groupless. Group field is changed to Reserved.
  • Added Section 6, Function Blocks.
    • A Device may have one or more Function Blocks. A Function Block is a single functional component or application that operates on a set of one or more Groups. Function Blocks are used to separate a Device’s functional components into manageable units. System Messages are also now sent to a Function Block instead of to each Group, reducing the need for multiple messages.
  • Added MIDI Endpoint Messages (Message Type 0xF)
    • These Groupless messages are used to discover details about a MIDI Endpoint and its Function Blocks. Discovery of Max System Exclusive 8 Messages has been moved to these messages.
  • Added Flex Data Messages (Message Type 0xD)
    • Flex Data Messages are used to send messages to a Channel, Group or a Function Block. New Flex Data Messages include Lyric and Text messages as well as messages useful to the MIDI Clip File specification.
  • Added Utility Messages for use in the MIDI Clip File specification.
  • Deprecated MIDI-CI Protocol Negotiation
    • These mechanisms have been replaced by MIDI Endpoint mechanisms and the Protocol Request and Protocol Notification Utility Messages.
  • Added a Per-Note Pitch Sensitivity Registered Controller Message
  • Clarified specific CC/RPN Message translation between MIDI 1.0 and MIDI 2.0 Protocol
  • Added MIDI 2.0 Addressing Appendix to help clarify if messages are meant for a Channel, Group, Function Block or MIDI Endpoint.


Other Adopted MIDI 2.0 Specifications available on the site for download. 

  • Property_Exchange_Foundational_Resources
  • Property_Exchange_Mode_Resources
  • Property_Exchange_ProgramList_Resource
  • Property_Exchange_Channel_Resources
  • Property_Exchange_LocalOn_Resource
  • Property_Exchange_MaxSysex8Streams_Resource
  • Property_Exchange_Get_and_Set_Device_State
  • Property_Exchange_StateList
  • Property_Exchange_ExternalSync_Resource
  • Default_Control_Change_Mapping_Profile
  • MIDI_2-0_Bit_Scaling_and_Resolution
  • Property_Exchange_Controller_Resources


You must be logged in as a TMA member to download the spec. If you are not a member yet (or not logged in), clicking on the link will take you to the signup page to either create an account or log in.


CORPORATE MEMBERS HAVE ACCESS TO ALL SPECIFICATIONS INCLUDING THESE AND OTHERS UNDER DEVELOPMENT: 

  • UMP Network Transport for Ethernet and Wireless Connectivity
  • UMP Serial Transport
  • Piano Profile
  • Orchestral Articulation Profile for MIDI 2.0 Note On and Note Off
  • MPE Profile
  • OS API Implementation Guide
  • Camera Control Profile
  • DAW and Mixer Control Profile
  • MIDI 2.0 Container File Format (SMF2)


What is MIDI 2.0?



Back in 1983, musical instrument companies that competed fiercely against one another nonetheless banded together to create a visionary specification—MIDI 1.0, the first universal Musical Instrument Digital Interface. Nearly four decades on, it’s clear that MIDI was crafted so well that it has remained viable and relevant. Its ability to join computers, music, and the arts has become an essential part of live performance, recording, smartphones, and even stage lighting. Now, MIDI 2.0 takes the specification even further, while retaining backward compatibility with the MIDI 1.0 gear and software already in use. Here’s why MIDI 2.0 is the biggest advance in music technology in decades.

MIDI 2.0 Means Two-way MIDI Conversations

MIDI 1.0 messages went in one direction: from a transmitter to a receiver. MIDI 2.0 is bi-directional and changes MIDI from a monologue to a dialog. For example, with the new MIDI-CI (Capability Inquiry) messages, MIDI 2.0 devices can talk to each other, and auto-configure themselves to work together. They can also exchange information on functionality, which is key to backward compatibility—MIDI 2.0 gear can find out if a device doesn’t support MIDI 2.0, and then simply communicate using MIDI 1.0.

Higher Resolution, More Controllers and Better Timing

To deliver an unprecedented level of nuanced musical and artistic expressiveness, MIDI 2.0 re-imagines the role of performance controllers, the aspect of MIDI that translates human performance gestures to data computers can understand. Controllers are now easier to use, and there are more of them: over 32,000 controllers, including controls for individual notes. Enhanced, 32-bit resolution gives controls a smooth, continuous, “analog” feel. New Note-On options were added for articulation control and precise note pitch. In addition, dynamic response (velocity) has been upgraded. What’s more, major timing improvements in MIDI 2.0 can apply to MIDI 1.0 devices—in fact, some MIDI 1.0 gear can even “retrofit” certain MIDI 2.0 features.
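The jump from 7-bit MIDI 1.0 values to 16- or 32-bit MIDI 2.0 values is not a plain bit shift: the MIDI Association publishes a bit scaling recommendation (listed among the adopted specifications below as MIDI_2-0_Bit_Scaling_and_Resolution) that uses a shift-plus-bit-repeat scheme so the minimum, centre, and maximum values all map exactly. A minimal Python sketch of that scheme (the function name is ours):

```python
def scale_up(src_val, src_bits, dst_bits):
    """Scale an unsigned value up to a wider resolution.

    Sketch of the shift-plus-bit-repeat scheme from the MIDI 2.0
    bit scaling recommendation: values at or below the source
    centre are simply shifted; values above the centre have their
    low bits repeated into the new LSBs, so the maximum source
    value maps to the maximum destination value.
    """
    scale_bits = dst_bits - src_bits          # number of new LSBs
    shifted = src_val << scale_bits
    src_center = 1 << (src_bits - 1)
    if src_val <= src_center:
        return shifted
    # Repeat the low bits of the source value into the new LSBs.
    repeat_bits = src_bits - 1
    repeat_value = src_val & ((1 << repeat_bits) - 1)
    if scale_bits > repeat_bits:
        repeat_value <<= scale_bits - repeat_bits
    else:
        repeat_value >>= repeat_bits - scale_bits
    while repeat_value:
        shifted |= repeat_value
        repeat_value >>= repeat_bits
    return shifted

# A 7-bit velocity of 127 becomes the full-scale 16-bit value,
# while the centre value 64 maps exactly to 32768.
print(scale_up(127, 7, 16))  # 65535
print(scale_up(64, 7, 16))   # 32768
```

Note how a naive shift alone would map 127 to 65024, never reaching full scale; the bit repeat closes that gap.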

Profile Configuration

MIDI gear can now have Profiles that can dynamically configure a device for a particular use case. If a control surface queries a device with a “mixer” Profile, then the controls will map to faders, panpots, and other mixer parameters. But with a “drawbar organ” Profile, that same control surface can map its controls automatically to virtual drawbars and other keyboard parameters—or map to dimmers if the profile is a lighting controller. This saves setup time, improves workflow, and eliminates tedious manual programming.

Property Exchange

While Profiles set up an entire device, Property Exchange messages provide specific, detailed information sharing. These messages can discover, retrieve, and set many properties like preset names, individual parameter settings, and unique functionalities—basically, everything a MIDI 2.0 device needs to know about another MIDI 2.0 device. For example, your recording software could display everything you need to know about a synthesizer onscreen, effectively bringing hardware synths up to the same level of recallability as their software counterparts.

Built for the Future

MIDI 2.0 is the result of a global, decade-long development effort. Unlike MIDI 1.0, which was initially tied to a specific hardware implementation, a new Universal MIDI Packet format makes it easy to implement MIDI 2.0 on any digital transport (like USB or Ethernet). To enable future applications that we can’t envision today, there’s ample space reserved for brand-new MIDI messages.

Further development of the MIDI specification, as well as safeguards to ensure future compatibility and growth, will continue to be managed by the MIDI Manufacturers Association working in close cooperation with the Association of Musical Electronics Industry (AMEI), the Japanese trade association that oversees the MIDI specification in Japan.

MIDI will continue to serve musicians, DJs, producers, educators, artists, and hobbyists—anyone who creates, performs, learns, and shares music and artistic works—in the decades to come.


DATA FORMATS: MIDI 1.0 BYTE STREAM DATA FORMAT AND UNIVERSAL MIDI PACKET FORMAT


1983: MIDI 1.0 originally defined a byte stream data format and a dedicated 5 pin DIN cable as the transport. When computers became part of the MIDI environment, various other transports were needed to carry the byte stream, including software connections between applications. What remained common at the heart of MIDI 1.0 was the byte stream data format.

The MIDI 1.0 Data Format defines the byte stream as a Status Byte followed by data bytes. Status Bytes have the most significant bit set high. The number of data bytes is determined by the Status.
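As a sketch of how a receiver walks that byte stream, the snippet below splits incoming bytes into Channel Voice messages by testing the Status bit and looking up the data-byte count from the upper nibble. Running status and System messages are deliberately omitted for brevity, and all names are ours:

```python
# Number of data bytes for each Channel Voice status (upper nibble).
DATA_BYTES = {0x8: 2, 0x9: 2, 0xA: 2, 0xB: 2, 0xC: 1, 0xD: 1, 0xE: 2}

def parse_channel_voice(stream):
    """Yield (status, data...) tuples from a MIDI 1.0 byte stream.

    A Status Byte is any byte with the most significant bit set;
    the upper nibble of the status determines how many data bytes
    follow. Running status and System messages are not handled in
    this simplified sketch.
    """
    i = 0
    while i < len(stream):
        status = stream[i]
        if status & 0x80 and (status >> 4) in DATA_BYTES:
            n = DATA_BYTES[status >> 4]
            yield (status, *stream[i + 1 : i + 1 + n])
            i += 1 + n
        else:
            i += 1  # skip bytes this sketch does not understand

# Note On, channel 0, note 60 (middle C), velocity 100:
print(list(parse_channel_voice(bytes([0x90, 60, 100]))))
# [(144, 60, 100)]
```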

The Universal MIDI Packet (UMP) Format, introduced as part of MIDI 2.0, uses a packet-based data format instead of a byte stream. Packets can be 32 bits, 64 bits, 96 bits, or 128 bits in size. 

This format, based on 32 bit words, is more friendly to modern processors and systems than the byte stream format of MIDI 1.0. It is well suited to transports and processing capabilities that are faster and more powerful than those when MIDI 1.0 was introduced in 1983.

The first nibble in each packet declares the Message Type. Each Message Type has a defined size and standard format. 

Example: MIDI 2.0 Protocol messages in the UMP Format have higher resolution data and can have added properties in new data fields. Here is a comparison of a MIDI 1.0 Note On message in the byte stream format and a MIDI 2.0 Note On message in UMP Format. 
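That comparison can be sketched in code: a MIDI 1.0 Note On is three bytes, while its MIDI 2.0 counterpart is a 64-bit UMP (Message Type 0x4) carrying a 16-bit velocity plus the new attribute fields. Field positions below follow the published UMP layout, but treat this as an illustrative sketch rather than reference code:

```python
def note_on_midi1(channel, note, velocity):
    """MIDI 1.0 Note On: status byte 0x9n plus two 7-bit data bytes."""
    return bytes([0x90 | channel, note & 0x7F, velocity & 0x7F])

def note_on_midi2(group, channel, note, velocity16,
                  attr_type=0, attr_data=0):
    """MIDI 2.0 Note On as two 32-bit UMP words (Message Type 0x4).

    Word 0: message type, group, status/channel, note, attribute type.
    Word 1: 16-bit velocity and 16-bit attribute data.
    """
    word0 = (0x4 << 28) | (group << 24) | ((0x90 | channel) << 16) \
            | ((note & 0x7F) << 8) | (attr_type & 0xFF)
    word1 = ((velocity16 & 0xFFFF) << 16) | (attr_data & 0xFFFF)
    return word0, word1

print(note_on_midi1(0, 60, 100).hex())  # 903c64
w0, w1 = note_on_midi2(0, 0, 60, 0xC864)
print(f"{w0:08x} {w1:08x}")             # 40903c00 c8640000
```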


Addressing: UNIVERSAL MIDI PACKET FORMAT EXPANDS BEYOND MIDI 1.0 FORMAT


In MIDI 1.0 byte stream format, the value of the Status Byte of the message determines whether the message is a System Message or a Channel Voice Message. System Messages are addressed to the whole connection. Channel Voice Messages are addressed to any of 16 Channels, with the Channel number declared in the 2nd nibble of the Status Byte. 

The Universal MIDI Packet introduces an optional Group field for messages. Each Message Type is defined to be addressed with a Group or without a Group field (“Groupless”). 

These mechanisms expand the addressing space beyond that of MIDI 1.0. Groupless Messages are addressed to the whole connection. Other messages are addressed to a specific Group, either as a System message for that whole Group or to a specific Channel within that Group. 

All MIDI 1.0 System Messages and Channel Voice messages have equivalent messages within a single Group of the UMP Format. Each Group is roughly comparable to a MIDI 1.0 connection when performing Translations between MIDI 1.0 Byte Stream Format and UMP Format. So 16 Groups can be translated to/from 16 MIDI 1.0 byte stream connections. 
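A small sketch of that addressing: for Channel Voice message types, the Group occupies the second nibble of the first UMP word and the Channel the low nibble of the status byte, so 16 Groups of 16 Channels give 256 addressable channels per connection (the helper name is ours):

```python
def ump_address(word0):
    """Extract (message_type, group, channel) from a UMP first word.

    Only meaningful for message types that carry a Group and a
    Channel (e.g. 0x2 = MIDI 1.0 Channel Voice, 0x4 = MIDI 2.0
    Channel Voice); Groupless types have no Group field.
    """
    message_type = word0 >> 28
    group = (word0 >> 24) & 0xF
    channel = (word0 >> 16) & 0xF
    return message_type, group, channel

# MIDI 1.0 Note On (0x90) for channel 3 on group 5 inside a UMP:
word = (0x2 << 28) | (5 << 24) | ((0x90 | 3) << 16) | (60 << 8) | 100
print(ump_address(word))  # (2, 5, 3)
```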


TRANSPORTS AND FILES: WITH UNIVERSAL MIDI PACKET FORMAT 

The Universal MIDI Packet is intended to be the native format for all future MIDI systems supporting both MIDI 1.0 and MIDI 2.0. New or updated bidirectional transports and new file formats are needed to use the Universal MIDI Packet Format.

UNIVERSAL SERIAL BUS

USB has become a widely used transport for MIDI 1.0. The USB-IF working group has published the USB Class Definition for MIDI Devices v2.0 specification, which uses the Universal MIDI Packet as its native data format for connections between devices and computers. It includes a mechanism for fallback to USB MIDI 1.0. Both specifications are available for download from https://www.usb.org/documents?search=midi

NETWORK

The MIDI Association’s Transport Working Group is currently developing a specification to use the Universal MIDI Packet on network connections for multiple devices, with or without a computer.

OPERATING SYSTEM API FOR MIDI 2.0 – APPLE, GOOGLE, MICROSOFT, LINUX

The MIDI Association’s OS API Working Group is developing implementation guidelines for developers. Key members include representatives for operating systems from Apple, Google, and Microsoft, as well as the ALSA team for Linux. These members have been cooperating to unify architectural concepts as much as possible in their diverse environments so that developers can find core, common concepts across platforms.

https://www.midi.org/midi-articles/midi-is-about-collaboration-not-competition

STANDARD MIDI FILE Version 2

MIDI 2.0 requires new Standard MIDI File formats to support the Universal MIDI Packet. Two file formats are expected: one published and another under development.

  1. This MIDI Clip File specification defines a Standard MIDI File format which supports all MIDI 1.0 data and all MIDI 2.0 data. The MIDI Clip File serves a role which is similar to the version 1 Standard MIDI File Type 0, with all data in a single sequence of Universal MIDI Packet messages.
  2. The MIDI Container File specification is currently under development, with expected release in 2024. This MIDI Container File supports the Universal MIDI Packet data format by containing one or more MIDI Clip Files, arranged in one or more tracks. This container format also expands the applications and markets for Standard MIDI Files by allowing the inclusion of other media files including musical notation, audio, video, and application-specific files.

The Standard MIDI File version 1 specifications for Type 0 and Type 1 defined certain non-MIDI data as Meta Events. In fact, MIDI SMF1 was developed before the concept of metadata was well defined in the multimedia industry and was ahead of its time in codifying authors, publishers, tempo, key signatures and other aspects of musical performance that have now become standardized for the Internet.

MIDI 2.0 provides more space to define new messages, so these Meta Events have been replaced by new MIDI messages that can be sent over MIDI transports. These new messages can be found in the UMP Format and MIDI 2.0 Protocol specification, V1.1. 


JITTER REDUCTION TIMESTAMPS FOR TRANSPORTS 

The Universal MIDI Packet format adds an optional Jitter Reduction Timestamp mechanism using 2 messages:

  1. The JR Clock message defines the current time of the Sender. The Sender sends independent JR Clock messages, not related to any other message.
  2. The JR Timestamp message can be prepended to any MIDI 1.0 Protocol message or MIDI 2.0 Protocol message for improved timing accuracy over any transport.

DELTA CLOCKSTAMPS FOR SMF2 

The Delta Clockstamp mechanism is used in a MIDI Clip File to declare the precise timing of events in a sequence file. A Delta Clockstamp Ticks Per Quarter Note message sets the timing resolution and accuracy. Delta Clockstamp messages then declare the number of ticks since the last event. The timing of every non-Clockstamp message is set by the most recent preceding Delta Clockstamp.
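To make the tick arithmetic concrete, the sketch below turns a list of Delta Clockstamp tick deltas into absolute event times under an assumed fixed tempo. The 960 ticks-per-quarter-note resolution and 120 BPM are illustrative values, not spec defaults:

```python
def delta_ticks_to_seconds(deltas, ticks_per_quarter=960, bpm=120.0):
    """Convert Delta Clockstamp tick deltas to absolute event times.

    Each delta is the number of ticks since the previous event; one
    quarter note is ticks_per_quarter ticks and lasts 60/bpm seconds.
    """
    times, total_ticks = [], 0
    for d in deltas:
        total_ticks += d
        times.append(total_ticks * 60.0 / bpm / ticks_per_quarter)
    return times

# Four events, one quarter note (960 ticks) apart, at 120 BPM:
print(delta_ticks_to_seconds([0, 960, 960, 960]))
# [0.0, 0.5, 1.0, 1.5]
```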


DISCOVERY


UMP Endpoint

The UMP Format defines mechanisms for Devices to discover fundamental properties of other Devices to connect, communicate and address messages. Discoverable properties on a UMP Endpoint include:

1. Device Identifiers: Name, Manufacturer, Model, Version, and Product Instance Id (e.g. Serial Number).

2. Data Formats Supported: Version of UMP Format, MIDI Protocols, and whether Jitter Reduction Timestamps can be used.

3. Device Topology: including which Groups are currently valid for transmitting and receiving messages and which Groups are available for MIDI-CI transactions.

These properties can be used for Devices to auto-configure through bidirectional transactions, thereby enabling the best connectivity between the Devices. These properties can also provide useful information to users.

Function Blocks

A Device uses Function Block messages to report topology information including the Group address(es) in use. It also reports metadata including the name of the function, MIDI-CI support, hints for user interfaces which expose senders and receivers for user connection choices, and more.

MIDI-CI – MIDI Capability Inquiry

MIDI-CI is used to discover information about the MIDI features found on each Function Block, Group, and/or MIDI Channel of a Device. MIDI-CI includes queries for three major areas of MIDI functionality:

  1. Profile Configuration: Profiles define specific implementations of a set of MIDI messages chosen to suit a particular instrument, Device type, or to accomplish a particular task. Two Devices that conform to the same Profile will generally have greater interoperability between them than Devices using MIDI without Profiles. Profiles increase interoperability and ease of use while reducing the amount of manual configuration of Devices by users.
  2. Property Exchange is used to Inquire, Get, and Set many properties including but not limited to Device configuration settings, a list of controllers and resolution, a list of patches with names and other metadata, manufacturer, model number, and version. All data is expressed using JSON key:value properties inside System Exclusive messages.
  3. Process Inquiry allows discovery of the current state of properties which are controllable by MIDI Messages in a device, including:

  • Values controlled by System Messages
  • Values controlled by Channel Controller Messages
  • Values controlled by Note Data Messages


MIDI MESSAGES

The Universal MIDI Packet Data Format can carry MIDI 1.0 Protocol messages or MIDI 2.0 Protocol messages.  


MIDI 1.0 Protocol Inside the Universal MIDI Packet

All existing MIDI 1.0 messages are carried in the Universal MIDI Packet. As an example, this diagram from the protocol specification shows how MIDI 1.0 Channel Voice Messages are carried in 32-bit packets:

System messages, other than System Exclusive, are encoded similarly to Channel Voice Messages. System Exclusive messages vary in size, can be very large, and can span multiple Universal MIDI Packets.  


MIDI 2.0 Protocol Messages


The MIDI 2.0 Protocol uses the architecture of MIDI 1.0 Protocol to maintain backward compatibility and easy translation while offering expanded features.

  • Extends the data resolution for all Channel Voice Messages.
  • Makes some messages easier to use by aggregating combination messages into one atomic message.
  • Adds new properties for several Channel Voice Messages.
  • Adds several new Channel Voice Messages to provide increased Per-Note control and musical expression.
  • Adds new Data Messages, including System Exclusive 8 and Mixed Data Set. The System Exclusive 8 message is very similar to MIDI 1.0 System Exclusive but with an 8-bit data format. The Mixed Data Set Message is used to transfer large data sets, including non-MIDI data.
  • Keeps all System messages the same as in MIDI 1.0.

Expanded Resolution and Expanded Capabilities

This example of a MIDI 2.0 Protocol Note message shows the expansions beyond the MIDI 1.0 Protocol equivalent. The MIDI 2.0 Protocol Note On has higher resolution Velocity. The two new fields, Attribute Type and Attribute Data, provide space for additional data such as articulation or tuning details.

Easier to Use: Registered Controllers (RPN) and Assignable Controllers (NRPN)

Creating and editing RPNs and NRPNs with MIDI 1.0 Protocol requires the use of compound messages. These can be confusing or difficult for both developers and users. MIDI 2.0 Protocol replaces RPN and NRPN compound messages with single messages. The new Registered Controllers and Assignable Controllers are much easier to use.

The MIDI 2.0 Protocol replaces RPN and NRPN with 16,384 Registered Controllers and 16,384 Assignable Controllers that are as easy to use as Control Change messages.

Managing so many controllers might be cumbersome. Therefore, Registered Controllers are organized in 128 Banks, each Bank having 128 controllers. Assignable Controllers are also organized in 128 Banks, each Bank having 128 controllers.

Registered Controllers and Assignable Controllers support data values up to 32bits in resolution.
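On the wire, a Registered Controller assignment is a single 64-bit UMP carrying the Bank and Index in the first word and a full 32-bit value in the second. The sketch below follows the published layout (status nibble 0x2 for Registered Controllers) but is an illustration, not reference code:

```python
def registered_controller(group, channel, bank, index, value):
    """Build a MIDI 2.0 Registered Controller message as two UMP words.

    Unlike the MIDI 1.0 RPN sequence (CC 101, 100, 6, 38), the whole
    assignment travels in one atomic 64-bit message with 32-bit data.
    """
    word0 = (0x4 << 28) | (group << 24) | ((0x20 | channel) << 16) \
            | ((bank & 0x7F) << 8) | (index & 0x7F)
    word1 = value & 0xFFFFFFFF
    return word0, word1

# Bank 0, index 0 (pitch bend sensitivity, in MIDI 1.0 RPN terms),
# full-scale 32-bit value on group 0, channel 0:
w0, w1 = registered_controller(0, 0, 0, 0, 0xFFFFFFFF)
print(f"{w0:08x} {w1:08x}")  # 40200000 ffffffff
```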

MIDI 2.0 Program Change Message

MIDI 2.0 Protocol combines the Program Change and Bank Select mechanisms from MIDI 1.0 Protocol into one message. The MIDI 1.0 mechanism for selecting Banks and Programs requires sending three MIDI messages; MIDI 2.0 combines Bank Select and Program Change into a single new MIDI 2.0 Program Change message. Banks and Programs in MIDI 2.0 translate directly to Banks and Programs in MIDI 1.0.

The MIDI 2.0 Program Change message always selects a Program. A Bank Valid bit (B) determines whether a Bank Select is also performed by the message.

If Bank Valid = 0, then the receiver performs the Program Change without selecting a new Bank; The receiver keeps its currently selected Bank. Bank MSB and Bank LSB data fields are filled with zeroes.

If Bank Valid = 1, then the receiver performs both Bank and Program Change.

The other option flags are not yet defined and are Reserved.
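A minimal sketch of how the Bank Valid bit works in the new message (field layout per the UMP specification; the helper name is ours):

```python
def program_change_ump(group, channel, program, bank=None):
    """Pack a MIDI 2.0 Program Change; Bank Valid is set only when a bank is given."""
    bank_valid = 1 if bank is not None else 0      # option flag bit B
    word1 = ((0x4 << 28) | ((group & 0xF) << 24)   # MIDI 2.0 Channel Voice, Group
             | (0xC << 20) | ((channel & 0xF) << 16)
             | bank_valid)
    # With Bank Valid = 0 the bank fields are zero-filled per the spec
    msb, lsb = ((bank >> 7) & 0x7F, bank & 0x7F) if bank_valid else (0, 0)
    word2 = ((program & 0x7F) << 24) | (msb << 8) | lsb
    return word1, word2

# Program only (receiver keeps its current Bank):
w1, w2 = program_change_ump(0, 0, program=5)
```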


New Data Messages for MIDI 1.0 Protocol and MIDI 2.0 Protocol 

New data messages include System Exclusive 8 and Mixed Data Set. The System Exclusive 8 message is very similar to MIDI 1.0 System Exclusive but with 8-bit data format. The Mixed Data Set Message is used to transfer large data sets, including non-MIDI data. Both messages can be used when using the Universal MIDI Packet format for MIDI 1.0 Protocol or MIDI 2.0 Protocol.  


1.6 The Future of MIDI 1.0

MIDI 1.0 is not being replaced. Rather it is being extended and is expected to continue, well integrated with the new MIDI 2.0 environment. It is part of the Universal MIDI Packet, the fundamental MIDI data format. Many MIDI devices will not need any of the new features of MIDI 2.0 in order to perform all their functions. Some devices will continue to use the MIDI 1.0 Protocol while using other extensions of MIDI 2.0, such as Profile Configuration or Property Exchange.  


What’s Next 

MIDI 1.0 works really well. In fact, MIDI 2.0 is just more MIDI. As new features arrive on new instruments, they will work with existing devices and systems. The same is true for the long list of other additions made to MIDI since 1983. MIDI 2.0 is just part of the evolution of MIDI that has gone on for 36 years. The step-by-step evolution continues.

Various Working Groups of the MIDI Association hold weekly or bi-weekly meetings to continue the development of MIDI 2.0.

A great example of the collaboration inside the MIDI Association is the meeting that happened in Tokyo, April 2023. All of the major OS companies gathered with Japanese MIDI manufacturers to talk about details of handling MIDI 2.0 in operating systems. In the true spirit of MIDI, companies set aside their differences and competitive natures to cooperate for the greater benefit of musicians around the world. 


MIDI is about collaboration, not competition

All kinds of companies, all kinds of devices. One of the things that has always made MIDI unique in the world of standards is that no one owns MIDI and the MIDI Associations (AMEI in Japan and The MIDI Association in the rest of the world) don’t sell anything. We (AMEI and The MIDI Association) get companies to volunteer their s…


https://www.midi.org/midi-articles/midi-is-about-collaboration-not-competition

On the Monday after each NAMM Show, some of the dedicated team of volunteer MMA and AMEI members attend an all day planning meeting to map out the plans for the next year.

Front Row : 

  • Florian Bomers – Bome Software, MMA Technical Standards Board
  • Kaz Hikawa- Crimson Technology, AMEI MIDI Standards Committee 
  • Athan Billias- Yamaha, MMA Exec Board
  • Koichi Mizumoto, Roland – AMEI , Future of MIDI Expansion (MIDI 2.0) working group chair
  • Mike Kent, MK2 Image, MMA Technical Standards Board, MIDI CI working group chair

Second Row:

  • Rick Cohen- Qubiq, MMA Technical Standards Board Chair, Protocol Working Group chair
  • Franz Detro, Native Instruments
  • Evan Balster , imitone
  • Daisuke Miura- Yamaha, AMEI Technical Standards Board Chair
  • Shunsuke Tanaka- Crimson Technology
  • Takahisa Ishiguro – Stone System. 
  • Lawrence Levine – Horn, MMA Exec Board, MIDI 2.0 Marketing working group co-chair

Holding MIDI 2.0 banner:

  • Andrew Mee – Yamaha consultant for Property Exchange
  • Rob Rampley- Media Overkill 


Why Join the MMA (MIDI Manufacturers Association)

If you are a developer of MIDI software or hardware, there are a lot of reasons to join the MIDI Manufacturers Association now.  This article includes some information about MIDI 2.0, but it is definitely not enough to start developing a MIDI 2.0 product.  The reason we do not release specification details before they are finally approved is that if information is released too early and then changes are made, it can lead to interoperability problems.  

If you join the MMA now, you not only get access to the current version of the full MIDI 2.0 specification, but will also have a voice in future MIDI specifications including Profiles and Property Exchange messages.   

To implement MIDI-CI and MIDI 2.0, you need a manufacturer's SysEx ID. A SysEx ID by itself is $260 a year, but it is included with your MMA membership. You will also have access to the MMA GitHub, which has code for MIDI 2.0 to MIDI 1.0 translation (and vice versa); MIDI 2.0 Scope, a tool for sending and testing MIDI 2.0 messages developed by Art and Logic; and Property Exchange Workbench, an application developed by Yamaha for prototyping and testing Property Exchange.

We are also working on a MIDI 2.0 logo and logo licensing program. 

So we encourage you to join the MIDI Manufacturers Association now and get access to all the documents you will need to get a head start on MIDI 2.0. 


MIDI 2.0 FAQs


We have been monitoring the comments on a number of websites and wanted to provide some FAQs about MIDI 2.0 as well as videos of some requested MIDI 2.0 features. 

Will MIDI 2.0 devices need to use a new connector or cable?

No, MIDI 2.0 is a transport agnostic protocol.

  • Transport- To transfer or convey from one place to another
  • Agnostic- designed to be compatible with different devices
  • Protocol-a set of conventions governing the treatment and especially the formatting of data in an electronic communications system

That’s engineering speak for MIDI 2.0 is a set of messages and those messages are not tied to any particular cable or connector.

When MIDI first started, it could only run over the classic 5-Pin DIN cable, and the definition of that connector and how it was built was described in the MIDI 1.0 spec.

However, the MIDI Manufacturers Association and the Association of Musical Electronics Industry soon defined how to run MIDI over many different cables and connectors.

So for many years, MIDI 1.0 has been a transport agnostic protocol.

MIDI 1.0 messages currently run over 5-Pin DIN, serial ports, Tip Ring Sleeve 1/8″ cables, FireWire, Ethernet and USB transports.

Can MIDI 2.0 run over those different MIDI 1.0 transports now?

Yes, MIDI 2.0 products can use the MIDI 1.0 Protocol and even use 5-Pin DIN if they support the Automated Bi-Directional Communication of MIDI-CI and:

  • One or more Profiles controllable by MIDI-CI Profile Configuration messages.
  • Any Property Data exchange by MIDI-CI Property Exchange messages.
  • Any Process Inquiry exchange by MIDI-CI Process Inquiry messages.

However, to run the Universal MIDI Packet and take advantage of MIDI 2.0 Voice Channel messages with expanded resolution, new specifications need to be written for each transport.

The new Universal MIDI Packet format will be common to all new transports defined by AMEI and The MIDI Association. The new Universal MIDI Packet contains both MIDI 1.0 messages and MIDI 2.0 Voice Channel Messages, plus some messages that can be used with both.

The most popular MIDI transport today is USB. The vast majority of MIDI products are connected to computers or hosts via USB. 

The USB specification for MIDI 2.0 is the first transport specification completed, but we are working on a UMP Network Transport for Ethernet and wireless connectivity.

Can MIDI 2.0 provide more reliable timing?

Yes. Products that support the new USB MIDI Version 2 UMP format can provide higher speed for better timing characteristics. More data can be sent between devices to greatly lessen the chances of data bottlenecks that might cause delays.

UMP format also provides optional “Jitter Reduction Timestamps”. These can be implemented for both MIDI 1.0 and MIDI 2.0 in UMP format. 

With JR Timestamps, we can mark multiple Notes to play with identical timing. In fact, all MIDI messages can be tagged with precise timing information. This also applies to MIDI Clock messages which can gain more accurate timing.

Goals of JR Timestamps:

  • Capture a performance with accurate timing
  • Transmit MIDI message with accurate timing over a system that is subject to jitter
  • Does not depend on system-wide synchronization, master clock, or explicit clock synchronization between Sender and Receiver.

Note: There are two different sources of error for timing: Jitter (precision) and Latency (sync). The Jitter Reduction Timestamp mechanism only addresses the errors introduced by jitter. The problem of synchronization or time alignment across multiple devices in a system requires a measurement of latency. This is a complex problem and is not addressed by the JR Timestamping mechanism.
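As a rough sketch of the mechanism, a JR Timestamp is a 32-bit Utility message carrying a 16-bit clock value; the spec's sender clock runs at 31.25 kHz, so one tick is 32 µs. The helper below is ours, with the field layout per the published UMP specification:

```python
def jr_timestamp_ump(group, ticks):
    """Pack a JR Timestamp Utility message (message type 0x0, status 0x2)."""
    return ((0x0 << 28)              # message type: Utility
            | ((group & 0xF) << 24)  # UMP Group
            | (0x2 << 20)            # status: JR Timestamp
            | (ticks & 0xFFFF))      # 16-bit time, 1 tick = 1/31250 s

# Send this ahead of each message in a chord so all notes share one timestamp:
ts = jr_timestamp_ump(group=0, ticks=0x1234)
```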

Also we have added Delta Time Stamps to the MIDI Clip File Specification. 

Can MIDI 2.0 provide more resolution?

Yes. MIDI 1.0 Voice Channel messages are usually 7 bit (14 bit is possible but not widely implemented because there are only 128 CC messages).

With MIDI 2.0 Voice Channel Messages velocity is 16 bit. 

The 128 Control Change messages, 16,384 Registered Controllers, 16,384 Assignable Controllers, Poly and Channel Pressure, and Pitch Bend are all 32 bit resolution.
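When translating between the protocols, values must be scaled between these bit widths. The MIDI 2.0 specifications define a min-center-max scaling (rather than a plain bit shift) so that the minimum, center, and maximum values map exactly; a sketch of that upscaling algorithm:

```python
def scale_up(src_val, src_bits, dst_bits):
    """Upscale a value (e.g. a 7-bit CC to 32 bits) with min-center-max scaling:
    0 maps to 0, the center maps to the center, and the maximum maps to all-ones."""
    scale_bits = dst_bits - src_bits
    bit_shifted = src_val << scale_bits
    src_center = 1 << (src_bits - 1)
    if src_val <= src_center:
        return bit_shifted                    # simple shift at or below center
    # Above center, repeat the value bits into the new low-order bits
    repeat_bits = src_bits - 1
    repeat_value = src_val & ((1 << repeat_bits) - 1)
    if scale_bits > repeat_bits:
        repeat_value <<= scale_bits - repeat_bits
    else:
        repeat_value >>= repeat_bits - scale_bits
    while repeat_value:
        bit_shifted |= repeat_value
        repeat_value >>= repeat_bits
    return bit_shifted
```

Note how a 7-bit maximum of 127 becomes 0xFFFFFFFF, not 0xFE000000 as a naive shift would give.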

Can MIDI 2.0 make it easier to have microtonal control and different non-western scales?

Yes, MIDI 2.0 Voice Channel Messages allow precise Per-Note control of the pitch of every note to better support non-western scales, arbitrary pitches, note retuning, dynamic pitch fluctuations or inflections, or to escape equal temperament when using the western 12-tone scale (see videos).
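For example, the Note On Attribute Type for pitch carries a value in 7.9 fixed-point form (a 7-bit note number plus a 9-bit fraction of a semitone), which lets a sender request an exact frequency. A sketch of the encoding (helper name ours):

```python
import math

def pitch_7_9(freq_hz):
    """Encode a frequency as Pitch 7.9 fixed point: semitones above MIDI note 0
    in the top 7 bits, fraction of a semitone in the low 9 bits."""
    semitones = 69 + 12 * math.log2(freq_hz / 440.0)  # note 69 = A4 = 440 Hz
    return round(semitones * 512) & 0xFFFF            # 512 steps per semitone

# A4 (440 Hz) lands exactly on note 69 with a zero fraction:
assert pitch_7_9(440.0) == 69 << 9

# A quarter-tone above A4 uses the fractional bits:
quarter_up = pitch_7_9(440.0 * 2 ** (0.5 / 12))
```

With 512 steps per semitone, pitch can be specified to better than 0.2 cents per note.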


Yutaka Hasegawa, Chairman of AMEI (the Japanese MIDI organization)

The really exciting part of Automated Bi-Directional Communication is that it paves the way for a new industry standard MIDI protocol which could enable new features like higher resolution, more channels and improved performance and expressiveness (while still maintaining backwards compatibility with current MIDI 1.0 devices). A new MIDI protocol would offer a bridge between music technology and new emerging technologies in other industries and allow creators, performers, and consumers to enjoy new and exciting musical experiences in the future.

by Yutaka Hasegawa, former chairman of AMEI



Pentagram Designs the New MIDI Logo

From Master Card to Mister Mayor all the Waze to MIDI 

Pentagram, the global multi-disciplinary, independently-owned design studio with offices in Austin, Berlin, New York and London, has done some of the most original design work on the planet. They do more than logo design, although they have done that for clients like Master Card, Mister Mayor and Waze.




Pentagram does graphics and identity, products and packaging, exhibitions and installations, websites and digital experiences, advertising and communications, sound and motion.

Pentagram was commissioned to head up MIDI’s brand identity design to maximise exposure and adoption of MIDI 2.0. Taking centre stage in the identity, the trademark is inspired by musical forms, such as the Stuttgart pitch, which is an oscilloscope reading of sine waves at a frequency of 440 Hz. The Stuttgart pitch serves as a tuning standard for the musical note of A above middle C, or A4 in scientific pitch notation. A440 has been widely adopted as a reference frequency to calibrate acoustic equipment and to tune various musical instruments.

The wordmarque design also references the shape of Lissajous curves, which are graphs of a system of parametric equations used to describe complex harmonic motion. The finalised design represents a modulation shape between 440 Hz – 880 Hz which is globally recognised as a tone for tuning instruments.

The sonic logo complements the wordmarque design, creating a mirror between sound and vision. The pitch starts out at 440 Hz and then rises to 880 Hz, with subtle wave shape and stereo modulation. There is an anticipatory feeling to the sonic identity, similar to that of an orchestra tuning to 440 Hz or Strauss’ ‘Also Sprach Zarathustra’. The simplicity and power of these pitches can create a Pavlovian response. Minimal orchestral strings complement the sine waves.

MIDI’s new identity feeds on the interplay between the visual and sonic, creating a suitably iconic logo for the next generation of music makers and music lovers. The MIDI 2.0 visual campaign is based on the heard and unheard, a synesthesia of sorts that visualises sound and vice versa. Sound is sentimental and can be imagined even in the absence of audible clues with unheard outputs such as imagery and motion giving way to soundscapes trained by synaptic communication in our brains that create memories.

MIDI is instantly recognisable to anyone who makes, edits or plays music, and is used every day around the world by musicians, DJs, producers, educators, artists and hobbyists to create, perform, learn and share music and artistic works. Since 1981, MIDI has dramatically changed the way musical instruments communicate, connecting electronic music instruments, computers and other audio devices. For the first time ever, products from different manufacturers were able to seamlessly interact with one another, stimulating computer music sequencing programmes to take form and become the backbone of modern day music-making.

The MIDI Association released the first makeover of the product in 2020 and branded it as MIDI 2.0, which was the most significant update in 35 years. MIDI 2.0 is a future-proof upgrade which introduces novel features and added functionality as MIDI’s use has grown to include computers, tablets and smartphones. One of the main attributes of MIDI 2.0 is bi-directionality, which enables MIDI 2.0 devices to communicate in both directions.

by Pentagram




The final result is this timeless logo that captures the essence of MIDI 2.0 



Pentagram — The world’s largest independent design consultancy

Pentagram is the world’s largest independent design consultancy. The firm is owned and run by 24 partners, each of whom is a leader in their individual field.

MIDI 2 0 for Developers YouTube Video

Check out the MIDI 2.0 webinar for developers from May Is MIDI Month 2020

00:00:00 Athan Billias, Welcome to the Webinar 

00:01:43 Rick Cohen, How MIDI Standards are Developed 

00:05:30 Mike Kent, MIDI 2.0 Overview 

00:31:03 Pete Brown, MIDI 2.0 and Microsoft 

00:37:35 Brett Porter, About MIDI 2.0 Scope 

00:44:03 Andrew Mee, About MIDI 2.0 Workbench and Property Exchange

00:51:48 Wrap Up, Questions and Answers

01:14:00 Contact Information 

Due to some quality issues with some of the video feeds, this is an Audio Podcast. 

The MMA adopts 8 new MIDI 2.0 specifications

The MIDI Manufacturers Association adopted 5 core MIDI 2.0 specifications on February 20, 2020.

 MIDI 2.0 Specification Overview

This document defines the specific collection of MMA/AMEI specifications that collectively comprise the core MIDI 2.0 Specification and introduces the fundamental concepts of MIDI 2.0. The document also defines minimum requirements for Devices to claim MIDI 2.0 compatibility. 

MIDI Capability Inquiry (MIDI-CI) 

MIDI-CI defines an architecture that allows Devices with bidirectional communication to agree to use extended MIDI capabilities beyond those defined in MIDI 1.0, while carefully protecting backward compatibility. MIDI-CI features “fall back” mechanisms so that if a Device does not support new features, MIDI continues to work as defined by MIDI 1.0. Goals of the MIDI-CI design:

  1. Fully backward compatible: supports continued MIDI 1.0 functionality for any Devices that do not recognize extended MIDI features enabled by MIDI-CI.
  2. Allow easy configuration between MIDI-CI Devices.
  3. Sender can know the capabilities of a Receiver.
  4. Sender and Receiver can negotiate auto-configuration details.
  5. Define method for negotiating choice of Protocol between Devices.
  6. Define method for using Profiles.
  7. Define method for Discovering, Getting, and Setting a wide range of Device Properties.

Common Rules for MIDI-CI Profiles 

MIDI-CI allows devices to communicate their capabilities to each other. Devices can use that capabilities information to self-configure their MIDI connections and related settings. Profiles are a beneficial component in enabling intelligent auto-configuration.

A Profile is a defined set of rules for how a MIDI receiver device implementing the Profile shall respond to a chosen set of MIDI messages to achieve a particular purpose or to suit a particular application. In addition to defining response to MIDI messages, a Profile may optionally also define other device functionality requirements. This definition also then implies MIDI implementation of a sender or in some cases may require a defined MIDI implementation of a sender. 

Common Rules for MIDI-CI Property Exchange 

Property Exchange is a set of MIDI-CI messages used to access a wide range of properties in MIDI devices. The exchange of properties takes place between a MIDI-CI Initiator and a MIDI-CI Responder.
This Common Rules for Property Exchange document complements the MIDI-CI specification by defining details of the Property Exchange mechanism and rules for the data payload in MIDI-CI Property Exchange messages. Further Property Exchange specifications define schemas and various data payloads that use the rules in MIDI-CI and this document to achieve specific tasks. 

Universal MIDI Packet (UMP) Format and MIDI 2.0 Protocol 

This Specification defines two major extensions to the MIDI 1.0 Protocol:

  • Universal MIDI Packet (UMP) Format
    UMP can contain all MIDI 1.0 Protocol messages and all MIDI 2.0 Protocol messages in a single, common container definition with a payload format which is intended to be usable in (or easily adaptable for) any standardized or proprietary data transport.
  • MIDI 2.0 Protocol
    The MIDI 2.0 Protocol is an extension of the MIDI 1.0 Protocol. Architectural concepts and semantics remain the same as MIDI 1.0. Compatibility for translation to/from the MIDI 1.0 Protocol is given high priority in the design of the MIDI 2.0 Protocol.
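One practical consequence of the common container: a parser can determine each packet's size from the message-type nibble of the first word alone. A sketch, with sizes per the published UMP message types:

```python
# 32-bit words per packet, keyed by the 4-bit message type in bits 31-28
UMP_WORDS = {
    0x0: 1,  # Utility (e.g. JR Timestamp)
    0x1: 1,  # System Real Time / System Common
    0x2: 1,  # MIDI 1.0 Channel Voice
    0x3: 2,  # Data (7-bit System Exclusive)
    0x4: 2,  # MIDI 2.0 Channel Voice
    0x5: 4,  # Data (System Exclusive 8 / Mixed Data Set)
}

def ump_packet_words(first_word):
    """Return how many 32-bit words the packet starting with first_word occupies."""
    return UMP_WORDS[(first_word >> 28) & 0xF]

# A MIDI 2.0 Note On is 64 bits; its MIDI 1.0 equivalent is 32 bits:
assert ump_packet_words(0x40903C00) == 2
assert ump_packet_words(0x20903C40) == 1
```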

These core MIDI 2.0 specifications are all available for download (please log in to download). 


8 new MIDI 2.0 Specifications adopted (total 13)

 MIDI-CI Property Exchange Foundational Resources: DeviceInfo, ChannelList, JSONSchema 

This specification is based on the MIDI-CI Capability Inquiry Specification and the Common Rules for MIDI-CI Property Exchange Specification.

Property Exchange (PE) defines that devices may have Resources, each one being a set of one or more Properties (Property Data) in a device which are accessible by a Property Exchange inquiry and MIDI-CI Transactions.

Foundational Resources provide core Properties of PE Devices and serve to enable other Resources. Many other Resources depend on or make use of Properties discovered in Foundational Resources. This specification defines three Foundational Resources.

The DeviceInfo Resource provides core details about the identity of a Property Exchange Device. It contains the same data as the Device Inquiry Universal SysEx Message. DeviceInfo also includes human-readable Properties for Manufacturer, Family, Model, Version information, and more.

ChannelList is a List Resource which describes the current MIDI Channels in use across the whole Device or, in the case of a Device using the Universal MIDI Packet Format, the current MIDI Channels in use across one Group of the Universal MIDI Packet Format. It describes the current Channel, Program, Group, MPE Status and other properties.

The “JSONSchema” Resource provides the JSON Schema for Manufacturer Specific Resources “schema” property. See [MMA03] Common Rules for MIDI-CI Property Exchange Section 11.3.2 for more information. 

MIDI-CI Property Exchange Mode Resources: ModeList, Current Mode  

This specification defines two Resources, ModeList and CurrentMode. If a Property Exchange Device has Modes, then it should support the ModeList Resource and CurrentMode Resource.

ModeList is a List Resource which describes the different Modes available in the Device. A Mode is a fundamental configuration of a Device. A change of Mode might change the response to MIDI messages, might change the number of active MIDI Channels, and might change the contents of Payload Data on the Device for any supported Resource.

The CurrentMode is a Simple Property Resource used to get or set the current Mode. The list of Modes available is retrieved using the ModeList Resource. 

MIDI-CI Property Exchange ProgramList Resource  

ProgramList is a List Resource which provides the list of Programs available in a Program Collection. A Program Collection is a grouping of Programs with some common trait (bank, category, instrument, synthesis engine, presets, etc). 

MIDI-CI Property Exchange Get and Set Device State  

The Property Exchange Resources described in this document allow for an Initiator to send or receive Device State, or in other words, to capture a snapshot which might be sent back to the Device at a later time. The primary goal of this application of Property Exchange is to GET the current memory of a MIDI Device. This allows a Digital Audio Workstation (DAW) or other Initiator to store the State of a Responder Device between closing and opening of a project. Before a DAW closes a project, it performs the GET inquiry and the target Device sends a REPLY with all data necessary to restore the current State at a later time. When the DAW reopens a project, the target Device can be restored to its prior State by sending an Inquiry: Set Property Data Message.

Data included in each State is decided by the manufacturer but typically might include the following properties (not an exhaustive list):

  • Current Program
  • All Program Parameters
  • Mode: Single Patch, Multi, etc.
  • Current Active MIDI Channel(s)
  • Controller Mappings
  • Samples and other binary data
  • Effects
  • Output Assignments 

 MIDI-CI Property Exchange Channel Resources: ChannelMode, BasicChannelRx, BasicChannelTx 

This document defines three Property Exchange Resources: ChannelMode, BasicChannelRx, and BasicChannelTx. These Resources are used to Get and Set information related to the choice of MIDI Channels which are actively in use by a Device.  

MIDI-CI Property Exchange Local On Resource  

 This document defines the LocalOn Resource which uses Property Exchange to Get and Set the “Local On/Off” setting of a Property Exchange Device.

MIDI-CI Property Exchange MaxSysex8Streams Resource 

This document defines the MaxSysex8Streams Resource which uses Property Exchange to discover how many simultaneous System Exclusive 8 messages are supported by a Device when using the MIDI 2.0 Protocol.  

MIDI-CI Property Exchange ExternalSync Resource  

This document defines the ExternalSync Resource. If a Property Exchange Device has a clock which is able to synchronize to external MIDI sync messages, then the Device should support the ExternalSync Resource.  


More MIDI 2.0 innovation on the horizon 

MMA members (and members of AMEI, the Japanese MIDI organization) are working incredibly hard on new and exciting MIDI 2.0 possibilities. There are working groups for the Standard MIDI 2 File Format, new MIDI 2.0 transports, and multiple Profile working group subcommittees, including active Profile subgroups for Default CC Mapping, Orchestral Articulations, Guitar Controllers, Wind Controllers, DAW control and Piano. 

The MMA does not publish details of specifications before they are adopted because that could lead to interoperability issues, but this list of current activities gives an idea of how much work is currently going on. 


New MIDI Specification Area of the site coming soon!

With all the new MIDI 2.0 specifications coming out and more being planned, we decided we needed a better way for people to read about and download all MIDI specifications. 

The 8 new MIDI 2.0 specifications outlined above will soon be made available for download in an updated MIDI Specification section of the website that we have been working on. 

We have also been working on editorial updates to older MIDI specifications to remove some culturally insensitive language that was contained in older MIDI 1.0 specifications.  

All MMA members are committed to improving our processes and communications and we always welcome feedback from people on the site. 

You can always reach out to us here: https://www.midi.org/contact-us

MIDI 2.0 Progress Continues with Updated USB Specification

As computers have become central components in many MIDI systems, USB has become the most widely used protocol for transporting MIDI data. With the introduction of MIDI 2.0, the USB Implementers Forum’s USB MIDI 2.0 working group, headed by members of the MIDI Manufacturers Association (MMA) and the Association of Musical Electronics Industry (AMEI), has updated the USB Class Definition for MIDI Devices. This spec defines how USB transports MIDI data.
The key to the updated spec is use of the new Universal MIDI Packet to support MIDI 2.0’s new functionality, while retaining backward compatibility with MIDI 1.0.

Highlights of operation over USB include:

  • Better support for deterministic, high-speed throughput (up to hundreds of times MIDI’s original speed)
  • Old and new devices work with any operating system that supports the updated USB Class Definition for MIDI Devices
  • As with the previous version, no drivers needed for compliant devices
  • Supports up to 256 MIDI Channels in 16 Groups of the new Universal MIDI Packet
  • Can provide more accurate timing for dense MIDI streams
  • Devices can run multiple Endpoints to use more than 256 Channels
  • Over time, simpler to implement than USB MIDI 1.0
  • Enhances the use of MIDI 2.0 mechanisms including MIDI Capability Inquiry (MIDI-CI)
  • Supports both MIDI 1.0 Protocol and MIDI 2.0 Protocol data
  • Devices can declare UMP Group IN/OUT pairs for use by MIDI-CI
  • Devices can declare that more than one UMP Group is used for a shared or related function
  • Added Bandwidth descriptors for more predictable use of higher speeds
  • Added support for Interrupt transactions as well as Bulk (USB MIDI 1.0 uses Bulk only) for more deterministic control over jitter and throughput

According to Mike Kent of the USB-IF Audio Working Group and MMA Technical Standards Board, “The new spec provides operating system manufacturers, like Apple, Microsoft, Google, Linux, and others, a clearly defined path for moving forward with MIDI 2.0.”

As with MIDI itself, the new USB Class Definition has been a collaborative effort, involving multiple companies from around the world. In conjunction with its members, the MMA and AMEI will continue enhancing the MIDI 2.0 specification, and assist manufacturers with incorporating MIDI 2.0’s features into the next generation of music gear.

Details on the USB Class Definition for MIDI Devices specification are available from the link below.

We encourage all MIDI developers to download the specification, and we will be adding articles to MIDI.org with more details on the USB MIDI 2.0 specification soon. 

VST 3.7 SDK Includes MIDI 2.0 Support

HAMBURG, Germany — Steinberg today announced the immediate availability of its latest VST SDK. VST 3.7 introduces several enhancements of the SDK interface that allow new levels of integration between VST 3 hosts and plug-ins, including the new VST 3 Project Generator, improved documentation, as well as support for MIDI 2.0 and the development of plug-ins compatible with the new ARM-based Apple Silicon chips.

About VST

Initially launched in 1996, VST creates a professional studio environment on PC and Mac computers, allowing the integration of virtual effect processors and instruments into any VST host application. Being an open standard, the possibilities offered by VST continuously grow. 2008 saw the release of VST 3, featuring multiple MIDI ports, surround sound capability and side-chaining. Since then, many new capabilities have been added to Version 3, and it has received great acceptance across the entire industry.

VST 3 Project Generator

VST 3.7 introduces the VST 3 Project Generator, which further facilitates entry into the VST development world. The VST 3 Project Generator allows users to create a VST 3 plug-in project with just a few clicks, which can then be used as the code skeleton in Xcode or Visual Studio.

VST 3 Online developer resource

The VST SDK documentation has been enhanced and can now be accessed online. The detailed documentation provides information on how to develop plug-ins, also including tutorials with lots of examples for both beginners and advanced developers.


MIDI 2.0 and ARM Support

The MIDI 2.0 standard announced by the MMA (MIDI Manufacturers Association) is already widely supported by VST 3. Detailed documentation on how to employ the MIDI 2.0 enhancements with VST 3 is available at the Steinberg Developer Resource. The VST 3.7 SDK also supports the development of plug-ins compatible with the new ARM-based Apple Silicon chips.

Arne Scheffler, senior software engineer for VST at Steinberg commented: “VST is built on the ideas and creativity of the global creative community. We want to lower the entry barrier for the next generations of VST developers, and we can’t wait to see what they come up with.”

Availability

The SDK can be used under free license and is available for download on the Steinberg website.

Roland Announces “MIDI 2.0 Ready” A-88MKII MIDI Keyboard Controller

Roland (LVCC Central Hall, Exhibit #10719), one of the co-founders of the original MIDI specification, introduces the A-88MKII, a powerful new 88-note MIDI keyboard controller for studio and stage. The latest in a long line of professional MIDI controllers from Roland, the A-88MKII is supremely playable, with modern creative tools for today’s musicians and producers. Slim and ergonomically designed, the A-88MKII comes equipped with an acclaimed, weighted-action keyboard, plus cutting-edge features like USB-C connectivity, RGB-lit controls, and more. It’s also MIDI 2.0 ready, making it the first Roland instrument to support the new MIDI standard. 

by Roland

Standard Features

  • Unmatched playability with Roland’s own PHA-4 keyboard
  • Built-to-last with wood and premium materials for years of dependable performance
  • Eight RGB-lit assignable knobs, eight RGB-lit pads, and three pedal inputs
  • Three customizable zones, onboard arpeggiator, and chord memory
  • Thin design with shallow depth fits today’s studio environments
  • Control application for deep customization and instant recall
  • USB-C connectivity and bus power
  • MIDI 2.0 ready

Visit the MMA at the Roland CES booth (LVCC Central Hall, Exhibit #10719)

The MIDI Manufacturers Association will be spending most of our time at CES in the Roland booth (LVCC Central Hall, Exhibit #10719) so if you are at CES and are interested in MIDI 2.0, please come by the Roland booth and say hi. 

We’ll be the ones with the MIDI Manufacturers Association CES show badges. 

For more information about the A-88MKII visit the Roland website below.  

For more information about MIDI 2.0, this website is the place for the latest information about anything that relates to MIDI. 



Roland – A-88MKII | MIDI Keyboard Controller

A-88MKII: MIDI Keyboard Controller – Play. Control. Rejoice.

MIDI 2.0 at the 2020 NAMM Show

Calendar of MIDI Events at NAMM 2020


Thursday January 16 


Who: MMA and CMIA Members plus invited guests 

What: Meeting with the Chinese Musical Instrument Association (CMIA) to discuss promotion of MIDI at Music China 2020

When: Thursday, January 16 • 9 AM- 10 AM

Where: Contact the MMA at contact@midi.org. 


Who: All NAMM Attendees

What: MIDI Manufacturers Association Booth

When: Thursday, January 16 • 10 AM- 6 PM

Where: NAMM Booths 9701 and 9700


Friday January 17


Who: All NAMM Attendees

What: MIDI Manufacturers Association Booth

When: Friday, January 17 • 10 AM – 6 PM

Where: NAMM Booths 9701 and 9700 


Who: Open to All NAMM Attendees

What: A3E Workshop

Strategic Overview and Introduction to MIDI 2.0

When: Friday, January 17th • 2:00 PM – 3:00 PM

Where: The Hilton: Level 4: Avila 6

This Friday session will teach you about the Future of MIDI! The developers of MIDI 2.0 technology will describe the features and benefits enabled by MIDI-CI (MIDI Capability Inquiry), Profiles, Property Exchange, and MIDI 2.0 Protocol. This is an introductory presentation. There is a “MIDI 2.0 for Developers” Saturday presentation with more in-depth details.

When you leave the session, you will know the answers to these questions:

• What features of MIDI 2.0 enable more musical expression?
• What features of MIDI 2.0 improve my workflow?
• How does MIDI 2.0 Protocol differ from MIDI 1.0 Protocol?
• What will a MIDI 2.0 system look like (wiring, topology, architecture)?
• How will my MIDI 1.0 and MIDI 2.0 products work together as one system?



Saturday January 18 


Who: Management of the MMA and AMEI (Association of Musical Electronics Industry, the Japanese MIDI organization)

What: MMA and AMEI management meeting

When: Saturday, January 18 • 8 AM – 10 AM

Where: Anaheim Marriott Newport Beach / Rancho Las Palmas Room 


Who: All NAMM Attendees

What: MIDI Manufacturers Association Booth

When: Saturday, January 18 • 10 AM – 6 PM

Where: NAMM Booths 9701 and 9700


 Who: Open to All NAMM Attendees

What: A3E Workshop

Update for Windows Music Creation App Developers

When: Saturday, January 18th • 12:00 PM – 1:00 PM

Where: The Hilton: Level 4: Avila 6

• How is Windows doing?
• What’s this I hear about Linux on Windows?
• What about MIDI?
• How is the app model evolving?

Join Pete Brown to learn about the latest for Windows Developers, with a focus on current changes that impact or help developers working on DAWs and other music creation apps.


 Who: Open to All NAMM Attendees

What: A3E Workshop

MIDI 2.0 for Developers, a Technical Deep Dive

When: Saturday, January 18th • 1:30 PM – 2:30 PM

Where: The Hilton: Level 4: Avila 6

Following on the topics presented in the Friday “Strategic Overview and Introduction to MIDI 2.0” session, this Saturday session focuses on implementing MIDI 2.0 in products.

Developers will learn about:

• MIDI-CI (MIDI Capability Inquiry): the foundation for MIDI 2.0 features
• Profiles: using standard configurations or your own custom configurations
• Property Exchange: using JSON with Get and Set for device control
• Protocol: new MIDI 2.0 messages, message formats, and the Universal MIDI Packet
• Timestamps, Transports, Translation, and more.
• Compliance and the new MIDI logo
• MMA prototyping tools and GitHub


Sunday January 19 


Who: MMA and AMEI members only

What: MIDI Manufacturers Association AGM Breakfast

When: Sunday, January 19 • 7:30 AM – 8:45 AM

Where: Anaheim Marriott Grand Ballroom J&K 


Who: MMA and AMEI members plus invited guests

What: MIDI Manufacturers Association Annual General Meeting 

When: Sunday, January 19 • 9:00 AM – 9:50 AM

Where: Anaheim Marriott Grand Ballroom G&H 

Topics:

  • Introduction of new MMA management
  • MIDI 2.0 Specifications Update
  • MIDI 2.0 Logo and Marketing Update
  • The MIDI Association/May is MIDI Month 2020 Update


Elections:

  • Executive Board Members (Directors)
  • Technical Standards Board Members

How to RSVP to attend the AGM Open Session

If you would like to attend the MIDI Manufacturers Association Annual General Meeting, please click on the link below and fill out the RSVP form. Space is limited, so we will send you an email confirming whether your request to attend has been granted. 

MMA Annual General Meeting RSVP


Who: MMA and AMEI members plus invited guests

What: MIDI Tools and MIDI 2.0 Prototyping

When: Sunday, January 19 • 1:30 PM – 3:00 PM

Where: Anaheim Marriott Grand Ballroom G&H 

MIDI 2.0 Tools:

  • A MIDI 2.0 Compliance WorkSheet for determining which MIDI 2.0 features you support
  • MIDI 2.0 Scope — an open source application (C++/JUCE) to aid in development/debugging of MIDI 2.0 channel voice message transmission and reception. Includes a permissively licensed JUCE User Module that implements the new protocol’s messages. 
  • Online MIDI-CI Work Bench to test MIDI-CI, Profiles and Property Exchange using Web MIDI
  • MIDI-CI Work Bench: a standalone Electron application for a more complete MIDI 2.0 environment. This workbench uses UMP internally and translates MIDI 1.0 (including MPE) to and from MIDI 2.0


MIDI 2.0 Demo:

A demo by AMEI companies Korg, Roland, and Yamaha showing three different prototypes receiving MIDI 2.0 messages. 

How to RSVP to attend the MIDI Tools and MIDI 2.0 Prototyping Session

If you would like to attend the MIDI Manufacturers Association MIDI Tools and MIDI 2.0 Prototyping Session, please click on the link below and fill out the RSVP form. Space is limited, so we will send you an email confirming whether your request to attend has been granted. 

MIDI Tools and MIDI 2.0 Prototyping Session

ADC 2019 Features MIDI 2.0 and more

ADC: the largest audio developer conference in the world  

The Audio Developer Conference (ADC) started five years ago and has quickly become the leading conference for audio and MIDI programming professionals to gather and discover the latest trends in music production software development. 

This year ADC features presentations on MIDI 2.0 from some of the key people behind the new specifications, plus plenty of innovative uses of current MIDI specifications.

Date and Time

Mon, 18 Nov 2019 – Wed, 20 Nov 2019

08:00 – 21:00 GMT


Location

CodeNode, 10 South Pl., London EC2M 7EB, United Kingdom



Introducing MIDI 2.0 Article

At ADC’19, we will learn more about the upcoming MIDI 2.0 specification. We spoke with MIDI Manufacturers Association members Athan Billias, Mike Kent, Florian Bomers, and Brett Porter about some of the changes and improvements to expect.

by ADC

It’s amazing to me that MIDI is a protocol that has remained largely the same since its invention in the early ’80s. Can you give us a brief overview of how MIDI became the standard for connecting digital instruments and computers together? 

The early ’80s were a remarkable time in the history of musical instrument technology. The whole music world was transitioning from electronic devices that used control voltages to digital devices that could store preset sounds in a main CPU. The Musical Instrument Digital Interface was developed by a small group of companies including Sequential Circuits, Roland, Yamaha, and Korg, and quickly became an industry standard.

Actually, MIDI has been remarkably resilient and adaptive as the music production landscape has changed over the years. In the ’90s, MIDI Show Control was adopted and the motion picture industry started to use SMPTE Time Code to sync music and film. In the 2000s, as softsynths and digital DJs became popular, MIDI keyboard and pad controllers became the center of music creation and production. In the last decade, we have seen MIDI evolve further with wireless Bluetooth capabilities for phones and tablets, MIDI Polyphonic Expression for increased expressiveness, and the rise of Arduino and DIY MIDI devices. The three winners of the Guthman Musical Instrument Design Awards in 2019 were all MIDI controllers, so MIDI is still at the center of innovation. You can check out a whole series of articles on MIDI.org at The History of MIDI.

by Athan Billias (Yamaha):

With so many companies using the MIDI 1.0 standard, how did the team go about finding a consensus around the best way to update the protocol, and how long has the team been working on MIDI 2.0?

Indeed, at the MIDI Manufacturers Association, we have the goal of reaching full consensus for every part of a new specification. This is a great thing, because the resulting standard will meet the requirements of all the very different member companies.

On the other hand, contentious issues can take a long time to resolve. And we also need consensus with the Japanese counterpart organization AMEI. It’s been an interesting ride since we started the work on MIDI 2.0 in 2005! We had to change direction a couple of times. But the major turning point for reaching broad consensus was when we completely redesigned the internal draft specification: fewer initial features, separate independent sub-specifications instead of one monolithic spec, and more focus on seamless integration with MIDI 1.0. The outcome is MIDI-CI as enabler for the announced specifications: Profiles, Property Exchange, and MIDI 2.0 Protocol.

by Florian Bömers (Bome Software):

What are the biggest changes that we will see in MIDI 2.0, and when will it be publicly available?

The MIDI 2.0 specifications have been designed to add numerous new options to MIDI while always giving priority to a high level of backward compatibility with existing MIDI devices.

MIDI 1.0 was a monologue, MIDI 2.0 is a dialog. Devices can talk to each other and agree on features that both devices support. This fundamental paradigm shift has opened a whole new world of possibilities.

Profile Configuration and Property Exchange are new options to bring increased auto-configuration. MIDI systems will be easier to use when devices learn about each other and self-configure their connections.

Extended MIDI messages available in MIDI 2.0 will increase musical expression with greatly improved per-note control. Controllers and other parameters operate with far higher resolution.
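To make the resolution jump concrete, here is a sketch of how a MIDI 2.0 Note On is packed into a 64-bit Universal MIDI Packet (UMP), carrying a 16-bit velocity instead of MIDI 1.0's 7 bits. The field layout follows the published UMP design, but treat this as an illustration rather than a normative implementation:

```python
def ump_note_on(group: int, channel: int, note: int, velocity16: int) -> tuple[int, int]:
    """Pack a MIDI 2.0 channel-voice Note On into two 32-bit UMP words.

    Word 0: message type 0x4 (64-bit channel voice), group, status+channel,
            note number, attribute type (0 = none here).
    Word 1: 16-bit velocity in the high half, attribute data in the low half.
    """
    MT = 0x4                                  # 64-bit MIDI 2.0 channel voice message
    status = 0x90 | (channel & 0x0F)          # Note On opcode plus channel
    word0 = (MT << 28) | ((group & 0x0F) << 24) | (status << 16) | ((note & 0x7F) << 8)
    word1 = (velocity16 & 0xFFFF) << 16       # attribute data left as zero
    return word0, word1

w0, w1 = ump_note_on(group=0, channel=0, note=60, velocity16=0xFFFF)
print(hex(w0), hex(w1))  # 0x40903c00 0xffff0000
```

Note how the single status/data byte stream of MIDI 1.0 becomes fixed-size words, which is part of what makes the new messages easy to route and timestamp.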

Optional Jitter Reduction Timestamps will allow much tighter timing of all messages, including notes and the tempo clock.

The core MIDI 2.0 specifications are nearing completion in the MIDI Manufacturers Association and the Association of Musical Electronics Industry. Member companies are testing specification designs with prototypes. When testing is complete and specifications are published, then manufacturers can release MIDI 2.0 products. If testing continues as expected at this time, the first MIDI 2.0 devices are likely to be released in 2020. Initially, some existing MIDI 1.0 products might be updated to add some MIDI 2.0 features. It will take several years for a wider range of MIDI 2.0 products to start coming to market.

MIDI 2.0 was designed to allow for future expansion so there are plenty of free opcodes for new MIDI specifications in the future.

This is the biggest update to MIDI in 36 years, but it also guarantees that MIDI will continue to expand and adapt to new technologies and customer needs in the coming decades.

by Mike Kent (MIDI Manufacturers Association)

Brett, Can you tell us more about what to expect from the MIDI 2.0 talk at ADC’19?

When I joined the group prototyping MIDI 2.0 (before it was even officially called that), I was in the fortunate position where I didn’t need to try to get the new protocol working inside of my company’s existing product, because Art+Logic doesn’t have any products of our own. Instead, I decided to build the kind of tool for development and testing that I would hope existed by the time someone drops a MIDI 2.0 project on my desk. At ADC I’ll talk about the tool now known as ‘MIDI 2.0 Scope’ and the JUCE-friendly MIDI 2.0 message classes I created for it. It’s already gotten a fair bit of use from the other developers who’ve been part of the early prototyping efforts.

We’ll also talk about the upcoming MIDI 2.0 conformance testing application that I’m working on now.

by Brett Porter (Art+Logic):


Introducing MIDI 2.0 Presentation

What is MIDI 2.0, anyway? That is the mission of this session. We’ll explain the current state of MIDI 2.0 specifications, and provide new detail of specifications to be completed soon. This will include brief reviews of MIDI-CI, Profile Configuration and Property Exchange. The focus will be the new MIDI 2.0 Protocol, with some details of the MIDI 2.0 packet and message designs and how MIDI-CI is used to achieve maximum interoperability with MIDI 1.0 and MIDI 2.0 devices. There will be little time for Q&A, but we’d love to talk shop with you at the MIDI table in the main hall.

The presenters are key architects of the MIDI 2.0 specifications in the MIDI Manufacturers Association.

Speakers:

Mike Kent: MIDI Manufacturers Association

Brett Porter: Lead Engineer, Art+Logic

Brett holds a B.M. in Composition and M.M. in Electronic/Computer Music from the University of Miami Frost School of Music. At Art+Logic since 1997, he’s worked on custom software development projects of all kinds but prefers to focus on the pro audio and MI world.

Florian Bömers: Founder, Bome Software

Will translate MIDI for food. Florian Bömers has been using MIDI since the mid-’80s and began programming audio and MIDI applications in his childhood. Now he runs his company, Bome Software, which creates standard software and hardware solutions for MIDI translation. Florian actively participates in standardization groups of the MIDI Manufacturers Association and is a member of its Technical Standards Board. 

When:

Tuesday November 19, 2019 16:00 – 16:50

Where:

CodeNode, 10 South Pl., London EC2M 7EB, Track 2 


Support for MIDI 2.0 and MIDI-CI in VST 3 Presentation

The recent extensions of the MIDI standard, namely MIDI 2.0 and MIDI-CI (Capability Inquiry), create many opportunities to develop hardware and software products that surpass previous products in terms of accuracy, expressiveness, and convenience. While things should become easier for users, the complexity of supporting MIDI as a developer will increase significantly. In this presentation we will give a brief overview of these new MIDI extensions, then discuss how these changes are reflected in the VST3 SDK and what plug-in developers need to do to make use of these new opportunities. Fortunately, many of these new capabilities can be supported with little to no effort thanks to the design principles and features of VST3, which will also be discussed. We may also briefly touch on questions regarding support for these new MIDI capabilities from the perspective of hosting VST3 plug-ins. The presentation will start with short overviews of MIDI 2.0, MIDI-CI, and VST3, and then dive into each specific MIDI extension to put it into the context of the related concepts in VST3. We will start with MIDI 2.0 Per-Note Controllers and VST3 Note Expression, then look into MIDI 2.0 pitch handling methods and compare them to VST3. After that, several further areas like

  • MIDI 2 – increased resolution
  • MIDI 2 – Channel groups
  • MIDI-CI – Program Lists
  • MIDI-CI – Recall State

will be put in context with VST3.

The presentation will be held by two senior developers from Steinberg who have many years of experience in supporting and contributing to VST and in supporting MIDI inside Steinberg’s software products, especially Cubase and Nuendo. 

Speakers

Janne Roeper: Chief Development Engineer, Steinberg Media Technologies GmbH

My interests are making music together with other musicians in real time, music technology (especially expressive MIDI controllers), programming, composing, yoga, meditation, piano, keyboards, drums, bass and other instruments, agile methodologies, and computers and technology in general…

Arne Scheffler: Software Developer, Steinberg Media Technologies GmbH

I have been working at Steinberg for 20 years and using Cubase for 30. I’m the maintainer and main contributor of the open source VSTGUI framework. If you want to know anything about VSTGUI, Cubase, or Steinberg, talk to me. 

When: Wednesday November 20, 2019 10:30 – 11:20

Where: CodeNode, 10 South Pl., London EC2M 7EB, Track 3 


Other MIDI related events at ADC 


Tickets for ADC on Eventbrite 



Audio Developer Conference 2019 Tickets, Mon 18 Nov 2019 at 08:00 | Eventbrite


Developing MIDI applications on Android

About Android MIDI

In 2015, Android introduced MIDI support. Android has billions of users, which means there are a whole lot of people with MIDI-compatible devices in their pockets!

In this article I’ll explore the most popular types of MIDI applications on Android and how to choose the right MIDI API when developing your own app.

Common types of MIDI app

 The two most popular types of MIDI app are:

1) Apps controlled by MIDI. These are apps which primarily receive MIDI data, such as synthesizers, drum machines and DJing apps.

2) MIDI controller apps. These apps are used to control other apps, external hardware or external software (such as a desktop audio workstation).

Apps controlled by MIDI

These apps receive MIDI data from hardware connected over Bluetooth or USB, or from a virtual MIDI service running on the phone itself.

The virtual analog synthesizer app DRC can be played using an external MIDI keyboard  

 MIDI controller apps

These types of apps are designed to control other apps by sending MIDI to them. Examples include:

  • Using the phone’s touch screen as an X-Y pad which sends MIDI Control Change messages
  • Using the phone’s accelerometer to send MIDI pitch bend information
  • Step sequencing and piano roll apps

Super MIDI Box is a MIDI drum machine and step sequencer 

These types of apps are especially useful on Android because of how easy it is to use Android as a USB-MIDI device.
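Whatever the control surface, what a controller app ultimately sends is a stream of standard MIDI 1.0 bytes. Here is a minimal sketch of the two message types mentioned above, in plain Python for illustration (it is not tied to any particular Android API):

```python
def control_change(channel: int, controller: int, value: int) -> bytes:
    """MIDI 1.0 Control Change: status byte 0xBn, then 7-bit controller and value."""
    return bytes([0xB0 | (channel & 0x0F), controller & 0x7F, value & 0x7F])

def pitch_bend(channel: int, bend: int) -> bytes:
    """MIDI 1.0 Pitch Bend: status byte 0xEn, then a 14-bit value split
    into LSB and MSB. bend is 0..16383, where 8192 means no bend."""
    return bytes([0xE0 | (channel & 0x0F), bend & 0x7F, (bend >> 7) & 0x7F])

# X-Y pad: map the x axis to CC 1 (mod wheel) on channel 1
print(control_change(0, 1, 100).hex())  # b00164
# Accelerometer at rest: centered (no-bend) pitch bend
print(pitch_bend(0, 8192).hex())        # e00040
```

An app would hand bytes like these to whichever MIDI output port it has opened; the framing is the same whether the transport is USB, Bluetooth LE, or a virtual port.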

Android as a USB-MIDI device

A little-known feature of Android is that when connected to a USB host (like a computer) it can become a class compliant USB-MIDI device. This is done in the USB settings.  

Once the USB mode is changed to MIDI, it will appear on the host as a MIDI device with one input channel and one output channel. A MIDI controller app can then be used to control software, such as a Digital Audio Workstation (DAW), on the host.

Inside the app the MIDI device will show up as “Android USB Peripheral” with a single input port and a single output port.

DRC app connected to the Android USB Peripheral MIDI device

Android’s MIDI APIs 

 In order to develop a MIDI application on Android you’ll need to use one or both of the available MIDI APIs. Here’s a summary of the available APIs and their features:

JVM Midi

In Android Marshmallow (API 23) the JVM Midi API was released. It can be accessed using Kotlin or the Java programming language. It enables you to enumerate, connect to, and communicate with MIDI devices over USB, Bluetooth LE, and other apps advertising as virtual MIDI services.

Pros

  • Available on over 60% of all Android devices (correct as of July 10th 2019)

Cons

  • Sending MIDI messages to C/C++ code requires use of JNI which can introduce complexity due to multi-threading
  • MIDI data only available through callback mechanism

You must use this API for enumerating and connecting to MIDI devices. However, for reading and writing you can choose between this API and the Native Midi API described below.

Native Midi

In Android Q the Native MIDI API was released. It’s specifically designed for communicating with MIDI devices inside an audio callback which makes it easy to control audio rendering objects, for example a software synthesizer.

This API only handles sending and receiving MIDI data, not MIDI device enumeration or connection. That still needs to be done using the JVM API.

Pros

  • Easy to integrate with existing C and C++ code
  • Best possible latency
  • MIDI data can be received and sent inside an audio callback without blocking

Cons

  • Still need to enumerate and connect to MIDI devices in Kotlin or the Java Programming Language using the JVM MIDI API
  • Only available on Android Q and above 

Getting Started 

You can start developing a MIDI app using Android Studio; the official Android MIDI documentation and sample apps are good starting points.

You should also take a look at the existing MIDI apps on the Google Play Store.

If you are using audio in your app you might want to check out the Oboe library for developing high performance audio apps.

If you have questions about developing MIDI apps on Android feel free to ask them on the android-midi group. Good luck and have fun!

The MIDI Manufacturers Association (MMA) and the Association of Music Electronics Industry (AMEI) announce MIDI 2.0™ Prototyping

For more details on MIDI 2.0, see this article on the site. 



FOR IMMEDIATE RELEASE

The MIDI Manufacturers Association (MMA) and the Association of Music Electronics Industry (AMEI) announce MIDI 2.0™ Prototyping

Los Angeles, CA, January 18, 2019 – The MIDI Manufacturers Association (MMA) and AMEI (the Japanese MIDI association) have finalized the core features and name for the next generation MIDI protocol: MIDI 2.0. Member companies are currently working together to develop prototypes based on a jointly developed, feature-complete, draft specification. A members-only plugfest to test compatibility between some early MIDI 2.0 prototypes is planned for Winter NAMM 2019. Participating companies include Ableton/Cycling ’74, Art+Logic, Bome Software, Google, imitone, Native Instruments, Roland, ROLI, Steinberg, TouchKeys, and Yamaha.

As with MIDI 1.0, AMEI and the MMA are working closely together and sharing code to streamline the prototype development process. Prototyping is planned to continue during 2019 as the associations work together on MIDI 2.0 launch plans, including exploring the development of a MIDI 2.0 logo and self-certification program for MMA and AMEI member companies.

During the prototyping phase, the proposed MIDI 2.0 specification is available only to MMA and AMEI members, because the prototyping process may trigger minor enhancements to the specification. Once a final specification is adopted, it will join the current MIDI specifications as a free download on www.midi.org.

The MIDI 2.0 initiative updates MIDI with auto-configuration, new DAW/web integrations, extended resolution, increased expressiveness, and tighter timing — all while maintaining a high priority on backward compatibility. This major update of MIDI paves the way for a new generation of advanced interconnected MIDI devices, while still preserving interoperability with the millions of existing MIDI 1.0 devices. One of the core goals of the MIDI 2.0 initiative is to also enhance the MIDI 1.0 feature set whenever possible.

All companies that develop MIDI products are encouraged to join the MMA to participate in the future development of the specification, and to keep abreast of other developments in MIDI technology.

About the MIDI Manufacturers Association (MMA)

The MIDI Manufacturers Association is an international group of hardware and software companies working together to develop new MIDI specifications and promote MIDI technology. For more information on the MMA corporate membership, please visit www.midi.org/about-the-mma.

The MMA also supports The MIDI Association, a growing, global community of over 18,000 people who work, play, and create with MIDI technology at www.midi.org. To stay up to date with the latest MIDI news and stories, and to access current MIDI specifications and receive the MIDI Association’s MIDI Message newsletter, sign up for free at: www.midi.org/midi-signup.

Press Contact: Tom White, press@midi.org 

MIDI Capability Inquiry Presentation at ADC

New MIDI Possibilities Outlined at ADC 2017

At the Audio Developer Conference in London, Ben Supper from ROLI presented a new MIDI proposal called MIDI Capability Inquiry (MIDI-CI), which charts a path to future expansions of MIDI.  

Almost exactly a year ago, AMEI (the Association of Musical Electronics Industry, the Japanese MIDI standards organization) announced that it had established a working group to look at new possibilities for future MIDI expansions. The initial ideas for what AMEI calls MIDI-CI were developed by Mike Kent of MK2image, working as a consultant for Yamaha. Mike worked for many years in Roland R&D, where he helped craft the USB-MIDI specification that is still in use today. He is also a former Chairman of the MIDI Manufacturers Association Technical Standards Board (MMA TSB).

At the ADC meeting last week, Ben Supper presented the basic outline of MIDI-CI, and then Amos Gaynes, current Chairman of the MMA TSB, hosted a panel discussion on possible future MIDI expansions. On the panel were Tom White (MMA President), Jean-Baptiste Thiebaut from ROLI (an MMA Executive Board member), Phil Burk from Google (an MMA Executive Board member), and Florian Bömers from Bome Software (an MMA TSB member). 

Here is a non-technical overview of MIDI-CI. 


The Three P’s of Capability Inquiry

 

MIDI-CI features three main capabilities: 

Profile Configuration, Property Exchange, and Protocol Negotiation

Profile Configuration

Profiles allow both current MIDI 1.0 devices and next-generation devices to auto-configure themselves (think MIDI Learn on steroids). A good example is drawbar organs. Yamaha currently has three different drawbar organs on the market: the reface YC (a hardware instrument), the YC-3B soft synth, and the Steinberg Model C soft synth. All three have nine drawbars and a Leslie effect, yet all three use different CC numbers to control the drawbars and effects. A Profile would standardize the controllers used, so there would be much greater interoperability between devices. 

This is from the proposed specification.

“Profiles define specific implementations of a set of MIDI messages chosen to suit a particular instrument, device type, or to accomplish a particular task. Two devices that conform to the same Profile will generally have greater interoperability between them than devices using MIDI without Profiles. Profiles increase interoperability and ease of use while lowering the need for manual configuration of devices by users.”
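The idea can be sketched in a few lines. All CC numbers below are invented for illustration; real Profile definitions come from the MMA:

```python
# Without a Profile, each product maps its drawbars and effects to its
# own CC numbers, so a controller must be reconfigured per device.
device_a = {"drawbar_1": 12, "drawbar_2": 13, "rotary_speed": 80}
device_b = {"drawbar_1": 70, "drawbar_2": 71, "rotary_speed": 85}

# With an agreed Profile, both ends share one mapping (hypothetical values).
profile = {"drawbar_1": 21, "drawbar_2": 22, "rotary_speed": 29}

def cc_for(control: str, mapping: dict) -> int:
    """Look up which CC number a device uses for a named control."""
    return mapping[control]

# Once both devices report conformance to the Profile, a controller can
# address either one identically:
print(cc_for("rotary_speed", profile))  # 29
```

The point is not the lookup itself but the agreement: MIDI-CI lets devices discover that they both implement the same mapping, so no manual MIDI Learn step is needed.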

Property Exchange

Property Exchange is used to Inquire, Get, and Set many properties including but not limited to device configuration settings, a list of controllers and resolution, a list of patches with names and other metadata, manufacturer, model number, and version. It allows transmission of human-readable text for patch names and metadata.

Property Exchange would provide a standardized way to get a list of patch names and metadata from a hardware synth and store it in a DAW. Property Exchange has a tremendous amount of power and, again, will work with current MIDI 1.0.
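Since Property Exchange payloads are JSON, the patch-list idea can be sketched as follows. The resource shape here is invented purely for illustration; the real resource schemas are defined in the MMA specifications:

```python
import json

# Hypothetical payload a device might return for a patch-list request.
patch_list = [
    {"bank": 0, "program": 0, "name": "Warm Pad"},
    {"bank": 0, "program": 1, "name": "Drawbar Organ"},
]

payload = json.dumps(patch_list)   # Property Exchange bodies are JSON text
restored = json.loads(payload)     # a DAW can parse and display the names
print(restored[1]["name"])         # Drawbar Organ
```

Because the payload is human-readable structured text rather than opaque SysEx bytes, a DAW can populate its patch browser without any device-specific driver.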

Protocol Negotiation

The final part of MIDI-CI is Protocol Negotiation. This defines how to start in MIDI 1.0 and negotiate to a new protocol. So MIDI-CI clarifies how MIDI 1.0 devices and next-generation devices can live together in one system.

MIDI-CI itself does not define the new protocol, just the way to negotiate to it. 

However, for the past year, the MMA has been looking at brand new next-generation protocols that are close to the current MIDI 1.0 specification but allow for expansion of channels, resolution, and expressiveness.

The next-gen protocol is planned as an extension of MIDI 1.0. MIDI 1.0 architectural concepts and semantics remain the same as MIDI 1.0 and compatibility for translation to/from MIDI 1.0 is given high priority in the design. 

Some of the features that are under consideration are: 

  • Per Note Controllers and Per Note Pitch Bend
  • More channels
  • More controllers
  • Higher resolution for controllers
  • Simplification of NRPN/RPN (atomic messages)
  • Simplification of bank and program changes (atomic messages)
  • Articulation messages 
  • Expanded tuning capabilities

The goal of the next-gen protocol proposal is to extend MIDI and make it as easy as possible to translate back to MIDI 1.0 so that MIDI 1.0 and next-generation devices can live together in the same ecosystem. 
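One practical consequence of that translation goal is that 7-bit MIDI 1.0 values must be rescaled when carried in higher-resolution next-generation messages. Below is a simple bit-replication upscaler, illustrative only: the translation rules the MMA eventually standardized use a center-preserving scheme, so real translators should follow the spec.

```python
def upscale(value: int, src_bits: int, dst_bits: int) -> int:
    """Upscale an unsigned value by bit replication, so 0 maps to 0 and
    full scale maps to full scale (e.g. 127 in 7 bits -> 65535 in 16 bits).

    Note: bit replication does not preserve the exact center value, which
    is one reason the normative MIDI 1.0 <-> 2.0 translation differs.
    """
    shift = dst_bits - src_bits
    out = value << shift
    while shift > 0:                       # fill low bits with repeats of the source
        take = min(shift, src_bits)
        out |= (value >> (src_bits - take)) << (shift - take)
        shift -= take
    return out

print(upscale(127, 7, 16))  # 65535
print(upscale(0, 7, 16))    # 0
```

A plain left shift would leave the top of the range short (127 << 9 is 65024, not 65535); replicating the source bits into the low positions stretches the range to full scale.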

Exactly when the MMA will adopt MIDI-CI is not a question that can be answered yet. However, although the initial ideas for Capability Inquiry were developed by Yamaha, Roland, and Korg, there are over 35 individual members in the MMA CI working group who have been working diligently (and meeting weekly) over the past year to provide feedback and ideas to improve the CI specification. This close cooperation and communication between the MMA and AMEI should make it much smoother to get to a final specification and adoption. 

What’s even more exciting is that MIDI-CI opens the door to a whole new world of possibilities with Profile Configuration, Property Exchange and Protocol Negotiation. 

That is really the most important point. The MMA is currently looking at some of the most significant MIDI improvements in 33 years and those improvements will spur further MIDI innovations for years to come.

If you are a manufacturer or software developer, you should join the MMA and get involved now to have input and influence the future direction of MIDI. 

If you are an individual who uses MIDI, please join The MIDI Association (becoming a member is free) and you can express your opinions on The MIDI Association forums and join live chats to discuss what’s coming next.

MIDI Polyphonic Expression (MPE) Specification Adopted!

One of the biggest recent developments in MIDI is MIDI Polyphonic Expression (MPE). MPE is a method of using MIDI which enables multidimensional controllers to control multiple parameters of every note within MPE-compatible software. 

In normal MIDI, Channel-wide messages (such as Pitch Bend) are applied to all notes being played on a single MIDI Channel. In MPE, each note is assigned its own MIDI Channel so that those messages can be applied to each note individually.

The newly adopted MIDI Polyphonic Expression (MPE) specification is now available for download by MIDI Association members.



Update Feb. 5, 2019: ROLI launches MPE website 

The ROLI Seaboard GRAND, Seaboard RISE, and BLOCKS – including the Seaboard Block and Lightpad Block – send standard MIDI messages and can be used with any software or hardware instrument that responds to these messages.

To take advantage of the five dimensions of touch the GRAND, RISE, and BLOCKS use MPE – MIDI Polyphonic Expression. This page lists MPE-compatible synthesizers and links to guides on using them with the Seaboard RISE, Seaboard GRAND, and BLOCKS. 

Check out all the information that ROLI has assembled on over 100 products from different companies that support MIDI Polyphonic Expression. 


Update Sept. 14, 2020: KVR launches MPE forum



MIDI Manufacturers Association (MMA) Adopts New MIDI Polyphonic Expression (MPE) Enhancement to the MIDI Specification 

 

Los Angeles, CA, January 28, 2018 – Today marks the MIDI Manufacturers Association’s (MMA) ratification of a new extension to MIDI: MPE (MIDI Polyphonic Expression). MPE enables electronic instruments such as synthesizers to provide a level of expressiveness typically possible only with acoustic instruments.

Prior to MPE, expressive gestures on synthesizers—such as pitch bending or adding vibrato—affected all notes being played. With MPE, every note a musician plays can be articulated individually for much greater expressiveness.

In MPE, each note is assigned its own MIDI Channel, so that Channel-wide expression messages can be applied to each note individually. Music making products (such as the ROLI Seaboard, Moog’s Animoog, and Apple’s Logic) take advantage of this so that musicians can apply multiple dimensions of finger movement control: left and right, forward and back, downward pressure, and more.

MMA President Tom White notes that “The efforts of the members (companies) of MMA has resulted in a specification for Polyphonic Expression that provides for interoperability among products from different manufacturers, and benefits the entire music industry.”

Jean-Baptiste Thiebaut of ROLI concurs. “The MPE specification paves the way for a new generation of expressive controllers and music software, providing many creative opportunities for live musicians and producers. MPE remains fully compatible with MIDI.”

The MPE specification will be available for download in the coming weeks. To obtain a free copy, join the MIDI Association, the global community of people who work, play and create with MIDI, at www.MIDI.org. 

MPE Press Release Downloadable PDF

The Basic Features of MPE

(Reprinted from the MIDI Manufacturers Association MPE Specification document’s Background Section)
(Note: not all devices may support all features)

The MPE specification aims to provide an agreed method for hardware and software manufacturers to communicate multidimensional control data between MIDI controllers, synthesizers, digital audio workstations, and other products, using the existing framework of MIDI 1.0.

These proposed conventions define a way of distributing polyphonic music over a group of MIDI Channels, making multiple parameters of different notes separately controllable. This will enable richer communication between increasingly expressive MIDI hardware and software.

Briefly, what is defined is as follows:

  — Wherever possible, every sounding note is temporarily assigned its own MIDI Channel between its Note On and Note Off. This allows Control Change and Pitch Bend messages to be addressed to that particular note.

  — A Registered Parameter Number is used to establish the range of Channels used for sending or receiving notes. Two messages control the division of MIDI Channel space into sub-spaces called Zones, so that multi-timbral playing is still possible using only one physical MIDI interface.

  — When there are more active notes in a Zone than available Channels, two or more notes will have to share the same Channel. Under such circumstances, all notes will continue to sound, but will no longer be uniquely controllable.

  — Each Zone has a dedicated extra Channel, called the Master Channel, which conveys common information including Program Change messages, pedal data, and overall Pitch Bend. These messages apply across the entire Zone.

(The MPE specification also defines how to handle Pitch Bend, Aftertouch and CC messages to provide maximum interoperability.)

The full MPE MIDI specification is available for download in the Specs section of the site. 


MPE Live Chat 

On May 26, 2018, we held the very first MIDI Live! chat with a panel of MPE specialists.

We recorded the session and it is presented here as a podcast.

Listeners were not only able to send in questions via text but were able to actually join the discussion and interact directly with the panelists. Roger Linn demoed his Linnstrument live from his studio in Los Altos.

Discussions included the differences between the original MPE spec and the final MMA specification, MPE checklists and test sequences, and the requirements for obtaining the MMA MPE logo that is under development.




Here is a collection of YouTube videos showing off how expressive MPE-enabled instruments can be.

Links to online MPE resources



Multidimensional Polyphonic Expression – Synthtopia

Posts about Multidimensional Polyphonic Expression written by synthhead, Elisabeth, and Darwin Grosse



Bitwig | The Future of MIDI

Bitwig Studio is a multi-platform music-creation system for production, performance and DJing, with a focus on flexible editing tools and a super-fast workflow.

MIDI Enhancements in Windows 10

Recently Pete Brown from Microsoft published a very informative blog post about MIDI Enhancements in Windows 10.

The blog post covers a number of topics including:

  • UWP MIDI Basics – using MIDI in Windows Store apps
  • New Bluetooth LE MIDI support in Windows 10 Anniversary Update
  • The Win32 wrapper for UWP MIDI (making the API accessible to desktop apps)
  • MIDI Helper libraries for C# and PowerShell


“We’re happy to see Microsoft supporting the Bluetooth MIDI spec and exposing it to Windows developers through a simplified API. Using the new Win32 wrapper for the UWP MIDI API, we were able to prototype Bluetooth MIDI support very quickly. At Cakewalk we’re looking ahead to support wireless peripherals, so this is a very welcome addition from Microsoft.”

by Noel Borthwick, CTO, Cakewalk

There is also a nice explanation in Pete’s article of how RPNs and NRPNs work and their data structure. 

With the release of Windows 10, all three of the major operating system companies (Apple, Google and Microsoft) have developed support for the MIDI Manufacturers Association standard for MIDI over Bluetooth Low Energy, which can be found in the specs section of this site and is available for download by MIDI Association members at no charge.

Arduino MIDI Output Basics

Introduction

The Arduino UNO is a popular open-source microcontroller that, in many respects, is a perfect complement to the extensible nature of the Musical Instrument Digital Interface (MIDI) protocol. Microcontroller platforms such as Arduino, Teensy, and others, make it relatively easy to develop custom MIDI controllers that respond to light, pressure, sound, and many other forms of input. In this way, microcontrollers provide a unique way to expand the possibilities of MIDI into the physical realm. MIDI musicians can now envision and create an astounding array of custom hardware devices from custom controllers to algorithmic composers and beyond.

Note that this article focuses on the basics of MIDI output on an Arduino UNO. Future articles will cover MIDI input on the Arduino and Teensy platforms as well as the use of potentiometers, switches, and other components for real-time MIDI control.

Transmitter Circuit

It is necessary to utilize a MIDI interface in order to send MIDI messages from an Arduino to a synthesizer, Digital Audio Workstation, or other MIDI device. Fortunately, it is easy (and inexpensive) to create a simple circuit that can handle MIDI output. The circuit can be mocked up on a solderless breadboard for experimentation, or a permanent version can be soldered to a solderboard. Users who are new to electronics might want to consider a commercial version such as the SparkFun MIDI Shield, offered at a nominal cost by SparkFun Electronics and other vendors.

As is evident in Figure 1 (a circuit documented in David Miles Huber’s The MIDI Manual), the circuit for MIDI output is relatively simple and consists of:

  • a connection from the 5V pin of an Arduino through a 220-ohm resistor to pin 4 of a standard MIDI DIN jack,
  • a connection from the GND pin of an Arduino to pin 2 of a MIDI DIN jack,
  • a connection from the TX pin of an Arduino through a 220-ohm resistor and 7404 Hex inverter to pin 5 of a MIDI DIN jack. 

Figure 2 demonstrates one way that the transmitter circuit could be configured on a solderless breadboard. Note that the top rail of the solderless breadboard is connected to the 5V pin on the Arduino and the bottom rail is connected to the Arduino GND pin.

MIDI Output Sketch: “Old School” Approach

While it is generally more convenient to use a MIDI library to program MIDI sketches on an Arduino, we will start with a low-level “pure” sketch in order to demonstrate how MIDI bytes are handled. If you have ever programmed MIDI applications for Windows, OS X, or Linux you are in for a pleasant surprise because MIDI output can be achieved with just a few lines of code on an Arduino. If you haven’t done so already, be sure to download the Arduino Software (Integrated Development Environment) from https://www.arduino.cc/en/Main/Software. Next, run the Arduino software and select File…New and enter the code that is described in the following paragraphs.

Boilerplate Code

While the basics of C and C++ programming are beyond the scope of this article (and covered in detail in my own Arduino for Musicians as well as numerous other books and online resources), rest assured that the basics of coding a simple MIDI sketch are not unduly difficult. Start by typing the functions shown in Listing 1, which form the basis for all Arduino sketches. Note that the term function is used to describe a block of “functional” code denoted by the function name and opening and closing braces:

Listing 1 Boilerplate functions


void setup()
{

}

void loop()
{

}

The setup() function is called once when your sketch is first run on an Arduino. You will use that block of code (between the opening and closing braces) to establish the serial transmission rate and any other initialization required by the sketch. The loop() function is where the action happens. As the name of the function implies, the loop() function continues to loop throughout the duration of your sketch unless you pause it with a delay() function or some other blocking activity.

Establishing a serial connection

To establish a serial MIDI connection between the Arduino and a MIDI receiver, add the code shown in Listing 2 to the setup() function. The Serial object is an instance of a class (a pre-programmed chunk of code) that handles all of the low-level details of establishing and maintaining a serial connection. Note that the Serial class provides a function (typically called a method in the context of a class) titled begin() that takes the baud rate as a parameter. In this example, serial transmission is set to 31250 baud, the expected MIDI transmission rate as per The Complete MIDI 1.0 Detailed Specification (available from the MIDI Association at midi.org).

Listing 2 Setting up a serial connection


void setup()
{
    Serial.begin(31250);
}

Writing a MIDI output function

Although there is nothing wrong with writing code for sending MIDI data in the loop() function, custom functions can help to produce code that is extensible and easier to read and maintain. Listing 3 demonstrates one approach to sending Note-On messages. Notice how the function takes three bytes that correspond to the MIDI channel, note, and velocity. The only tricky part of the code is the first line which translates the expected MIDI channel range of 1-16 to the range of Note-On status bytes starting at 0x90 (hexadecimal). The Serial.write() method is used to transmit the status byte and data bytes that form a Note-On message:

Listing 3 Custom function for outputting MIDI Note-On messages


void playMIDINote(byte channel, byte note, byte velocity)
{
    //MIDI channels 1-16 are really 0-15
    byte noteOnStatus = 0x90 + (channel - 1);

    //Transmit a Note-On message
    Serial.write(noteOnStatus);
    Serial.write(note);
    Serial.write(velocity);
}

Outputting Notes

Now that a convenience function is available to handle the transmission of Note-On messages, it is easy to fill in some simple code in the loop() function to output a series of notes. Note that this example uses a blocking delay—generally a bad idea for more robust applications—but the use of timers is beyond the scope of this article and would only serve to obfuscate the underlying concept of sending MIDI data via a serial connection. In Listing 4, a “for loop” is used to output MIDI Note-On messages for notes 60 through 71. The function delays and then transmits the same note with a velocity of zero, which is functionally equivalent to sending a corresponding Note-Off message.

Listing 4 Outputting a chromatic scale


void loop()
{
    //Play a chromatic scale starting on middle C (60)
    for (int note = 60; note < 72; note++)
    {
        //Play a note
        playMIDINote(1, note, 100);
        //Hold the note for 60 ms (delay() used for simplicity)
        delay(60);

        //Turn note off (velocity = 0)
        playMIDINote(1, note, 0);
        //Pause for 60 ms
        delay(60);
    }
}

Uploading a Sketch to the Arduino

The complete sketch is shown in Listing 5. Once you have typed or copied the code into the Arduino Integrated Development Environment (IDE), click the leftmost check button to ensure that the sketch is free from errors. If you are relatively new to programming it might be helpful to remember that C code is case sensitive. It is also easy to omit an opening or closing brace or semicolon which can create any number of error messages. A final step is to connect the Arduino to your computer via a USB cable and select the upload button to upload the code to the Arduino. Assuming you have connected a valid MIDI output circuit, the chromatic scale should be received by any MIDI receiver device that is connected to the circuit via a MIDI cable.

Listing 5 Complete listing


void setup()
{
    //Set up serial output with standard MIDI baud rate
    Serial.begin(31250);
}

void loop()
{
    //Play a chromatic scale starting on middle C (60)
    for (int note = 60; note < 72; note++)
    {
        //Play a note
        playMIDINote(1, note, 100);
        //Hold note for 60 ms (delay() used for simplicity)
        delay(60);

        //Turn note off (velocity = 0)
        playMIDINote(1, note, 0);
        //Pause for 60 ms
        delay(60);
    }
}

void playMIDINote(byte channel, byte note, byte velocity)
{
    //MIDI channels 1-16 are really 0-15
    byte noteOnStatus = 0x90 + (channel - 1);

    //Send notes to MIDI output:
    Serial.write(noteOnStatus);
    Serial.write(note);
    Serial.write(velocity);
}

Coding Challenge

That’s it—Arduino MIDI output can be achieved with just a few lines of code! Consider how you might use the boilerplate code in this example to develop a simple algorithmic generator (perhaps using the Arduino random() function) or a sketch that outputs chords, exotic scales, or drum beats.

Next Steps

Although this introduction is necessarily limited, it forms the basis for many exciting possibilities including algorithmic composition, automation, and real-time control. As you will see in future articles, a basic MIDI output circuit can also be used for useful applications such as using a potentiometer to send continuous controller messages to a DAW or a two-axis joystick as an expressive real-time controller. As noted in the introduction, detailed coverage of Arduino MIDI and audio concepts is provided in Arduino for Musicians: A Complete Guide to Arduino and Teensy Microcontrollers as well as other books and online resources.

Happy coding!

Ask.Audio Article on MIDI Messages

Ask.Audio and Non Linear Educating

Ask.Audio is one of our favorite technology websites and has been a great partner to The MIDI Association. We have worked with Ask.Audio’s parent company, Nonlinear Educating, to provide extensive video training right here on the MIDI.org website in our courses section. Here’s a brief description of what Nonlinear Educating is all about. 

Nonlinear Educating is an adaptive technology company dedicated to improving the way the world learns. The combination of our powerful and modular video-courseware production & distribution platform and our extensive library of industry leading training courses, has granted us the opportunity to empower a variety of partners from a multitude of industries. The foundationally modular approach to our application infrastructure enables us to rapidly customize instances of our platform to meet the specific needs of our partners. We are agile adaptive and are committed to developing the most efficient and robust video-learning platform on the internet.

by  Non Linear Educating

 The MIDI Association collaborates with many of the top technology websites including Ask.Audio, Electronic Musician, Harmony Central, Hispasonic, Keyboard Magazine, SonicState, Sound On Sound and more by mutually sharing information and stories about MIDI. 

Joe Albano, a well-known author on Ask.Audio, recently put together a great article on MIDI messages. We have Ask.Audio’s permission to summarize the content of their articles and then include a link to the full article.

Fig 1 The wonderful world of MIDI

M.I.D.I.—Musical Instrument Digital Interface—shook up the industry when it was introduced in 1983, by separating the player’s performance from the sound of the instrument, and this powerful digital communication protocol has been going strong ever since.

by Joe Albano

Joe’s article covers the basics of the most common MIDI messages.

Channel Voice Messages

The bulk of the musical performance data of a MIDI recording falls into the message category of “Channel Voice Messages” (I’m going to ignore the old-school “Channel” designation here). The 7 Voice Messages are:

• Note-On

• Note-Off

• Monophonic (Channel) Pressure/Aftertouch

• Polyphonic (Key) Pressure/Aftertouch

• PitchBend

• Program Change

• Control Change (or Continuous Controller) messages, a.k.a. CC messages, of which there are 128 

Fig 3 Strings of continuous (streaming) MIDI messages

Below is a link to the full article on Ask.Audio’s website. 



Everything You Need To Know About MIDI Messages But Were Afraid To Ask : Ask.Audio

MIDI. There’s a lot of musicians and producers who don’t know how to use this protocol to improve their musical performances and add more expression to their in

Here is a link to our collection of MIDI and Audio curriculums developed in cooperation with Nonlinear Educating. 



Massive Online Courseware Library : The MIDI Manufacturers Association : NonLinear Educating


Check out our MIDI & Audio courses. Sign up at NLE to get access to hours of free preview videos or take it to the next level and get MIDI Certified.

Basics of USB-MIDI

USB MIDI 2.0 ADOPTED

MIDI 2.0 Progress Continues with Updated USB Specification –  

As computers have become central components in many MIDI systems, USB has become the most widely used protocol for transporting MIDI data. With the introduction of MIDI 2.0, the USB Implementers Forum’s USB MIDI 2.0 working group, headed by members o

 

 

 

USB and MIDI

 

MIDI has stayed relevant for over 30 years by adapting to the different ways that computers send information to and from external devices. MIDI can now be sent over 5 Pin DIN, Serial Ports, USB, Firewire, Ethernet, Bluetooth and more. But currently the most prevalent way to connect to computers, tablets and smartphones is USB. This article will cover the basics of USB-MIDI.

 

Why USB came about

 

In the early 1990’s, there were far too many types of connectors on computers. There were separate serial ports, parallel ports, keyboard and mouse connections, and joystick ports. It was hard for people to tell whether the peripheral they were buying would actually work with their computer. So Compaq, Intel, Microsoft and NEC (joined later by Hewlett-Packard, Lucent and Philips) formed the USB Implementers Forum, Inc., a non-profit corporation that publishes the specifications and organises further development of USB. Similar to the MIDI Manufacturers Association, the USB-IF makes sure that there is interoperability between USB devices.

 

Goals of USB

 

The USB-IF had some clear goals when first developing the USB specification:

  • Standardize connector types: There are now several different types of USB connectors, but they are all standardized by the USB-IF.
  • Hot-swappable: USB devices can be safely plugged and unplugged as needed while the computer is running, so there is no need to reboot.
  • Plug and Play: USB devices are divided into functional types (Audio, Image, Human Interface, Mass Storage) and then operating system software can automatically identify, configure, and load the appropriate device driver when a user connects a USB device.
  • High performance: USB offers low speed (1.5 Mbit/s), full speed (12 Mbit/s) and high speed (up to 480 Mbit/s) transfer rates that can support a variety of USB peripherals. USB 3.0 (SuperSpeed USB) achieves throughput of up to 5.0 Gbit/s.
  • Expandability: Up to 127 different peripheral devices may theoretically be connected to a single bus at one time.

 

USB System Architecture

 

The basic USB system architecture is actually pretty simple and consists of the following main components:

  • A Host Computer, Smartphone or Tablet
  • One or more USB Devices
  • A physical bus represented by the USB Cable that links the devices with the host 

The Universal Serial Bus is a host controlled bus. All data transfers are initiated and controlled by the host, and USB peripherals are slaves responding to host commands. So for USB MIDI peripheral devices you need a computer, smartphone or tablet in the system to control and initiate USB communication.

 

USB Device Classes

 

USB devices are divided into specific functional classes, for example image, human interface devices (keyboard, mouse, joystick), mass storage, and audio. The operating system can then know what the device is designed to do and automatically load what is called a class compliant driver for that type of device. In 1999, the USB MIDI specification was developed by the USB-IF in cooperation with the MIDI Manufacturers Association and included in the Audio class of devices. That is why sometimes when you connect a USB-MIDI peripheral, the OS will display a message that says USB-Audio device connected. As far as USB is concerned, MIDI is an Audio Class Compliant device.

 

Class Compliant Drivers versus Manufacturer Specific Drivers

 

Class compliant drivers are convenient because you don’t have to download any external software, but manufacturer specific drivers often provide added functionality. Let’s use Yamaha as an example. Because data transfer on USB is much faster than 5 pin DIN, it is possible to have multiple ports of MIDI (a port is a group of 16 MIDI channels) on a single USB cable. The dedicated Yamaha USB driver provides for 8 ports of high speed USB, includes the names of all the devices that are compatible with the driver, and has some routing capabilities. These features are only available if you download the driver from Yamaha’s website. Also, many audio interfaces are also MIDI interfaces, and both audio and MIDI travel over the USB cable. So if you purchase a MIDI or Audio interface you should always check the product manual and manufacturer’s website to see if there is a dedicated USB driver for your product that provides added functionality. Even when a manufacturer specific driver exists, if you connect the product to a device that doesn’t allow driver downloads into the operating system (for example, iOS devices), the product will usually still work as a class compliant USB device.

 

Types of USB MIDI connectors

 

Over the years, USB has developed and there are now a number of different cable types and USB specifications. Let’s take a look at the different connectors.

Originally most desktop and laptop computers had the standard sized Type A USB connector. A standard USB cable has a Type A connector on one end to connect to the host and a Type B connector on the other end to connect to the peripheral device. This is still the most common cable to connect a MIDI instrument to a computer.

USB Type A host connector

Type B USB peripheral connector

The Type A connector has a pin that supplies power to external peripherals, so you need to be careful about trying to connect two hosts via a Type A to Type A cable. This can cause serious damage to your gear, so consult the manufacturer and manual before attempting this.

The Type A connector is for host controllers (computers, smartphones, tablets and some digital musical instruments that act as hosts) and USB hubs. A USB hub is a device that expands a single USB port into several so that there are more ports available to connect devices to a host system. USB hubs are often built into equipment such as computers, computer keyboards, monitors, or printers. When a device has many USB ports, they all usually stem from one or two internal USB hubs rather than each port having independent USB circuitry. If you need more USB ports, there are also external hubs that you can buy. You should also check whether your USB peripherals need to be powered by USB; if they do, you may need a powered USB hub.

On many digital musical instruments you find two USB connectors: one Type A connector labeled To Device and one Type B labeled To Host. The To Host connector is usually used to send MIDI, Audio, or both to a computer, smartphone or tablet. If your digital music product sends both MIDI and Audio over USB, you will almost certainly need a manufacturer specific driver.

The To Device connector is usually used for USB storage devices like flash thumb drives, but it can be used for other things depending on which device classes the host music product supports.

USB A-Type

 

Considered the standard and most common type of connector, A-style connectors are found on the PC or charger side of most cables. This flat, rectangular interface is held in place through friction. Durable enough for continuous connection but easy enough for users to connect and disconnect, this connector type is also available in micro variations.

 

USB B-Type

 

Type-B USBs were traditionally used with printer cables, but they’re now found on many popular models of Android smartphones and external hard drives. These USBs feature a square interface and are available as Micro-USB B, USB Mini-b (5-pin), and USB Mini-b (4-pin) variants.

 

USB C-Type

 

The newest type of connector on the market, Type-C is a one-size-fits-all solution developed to support devices with a smaller, thinner and lighter form factor. Type-C is slim enough for a smartphone or tablet, yet robust enough for a desktop computer. It also has the advantage of a reversible plug orientation and cable direction, eliminating the guesswork about which direction the connection goes.

 

The future of USB Connectivity

 

USB Type-C is designed as a one-size-fits-all solution for data transfer and power supply on any device. Featuring a smaller connector, Type-C fits into one multi-use port to simultaneously charge devices and transfer data and also offers backward compatibility to support previous USB standards (2.0, 3.0, and 3.1).

Type-C is quickly becoming the new standard for operating systems and hardware providers; Intel’s Thunderbolt recently switched to USB Type-C ports while enabling cross compatibility with USB 3.1. The new Apple MacBooks feature a Type-C port.

The USB-IF predicts that by 2019, all laptops, tablets, mobile phones, and other consumer electronics will be equipped with USB Type-C.

In the meantime, if you have a newer computer, you may need an adapter to connect your MIDI gear to your computer.

About MIDI, Part 1: Overview

MIDI (pronounced “mid-e”) is a technology that makes creating, playing, or just learning about music easier and more rewarding. Playing a musical instrument can provide a lifetime of enjoyment and friendship. Whether your goal is to play in a band, or you just want to perform privately in your home, or you want to develop your skills as a music composer or arranger, MIDI can help.

How Does MIDI Work?

There are many different kinds of devices that use MIDI, from cell phones to digital music instruments to personal computers. The one thing all MIDI devices have in common is that they speak the “language” of MIDI. This language describes the process of playing music in much the same manner as sheet music: there are MIDI Messages that describe what notes are to be played and for how long, as well as the tempo, which instruments are to be played, and at what relative volumes.

MIDI is not audio.

MIDI is not audio. So if someone says MIDI sounds bad, they really don’t understand how MIDI works. Imagine if you took the sheet music of a work by Beethoven and handed it to someone who can read music but has never played the violin, then put in their hands a very cheap violin. The music would probably sound bad. Now take that same piece of sheet music and hand it to the first chair of a symphony orchestra playing a Stradivarius, and it will sound wonderful. So MIDI depends on the quality of the playback device and also on how well the description of the music fits that player.

MIDI is flexible

The fact that MIDI is a descriptive language provides tremendous flexibility. Because MIDI data contains only performance instructions and not a digital version of a sound recording, it is actually possible to change the performance, whether that means changing just one note played incorrectly, or changing all of them to perform the song in an entirely new key or at a different tempo, or on different instruments.

MIDI data can be transmitted between MIDI-compatible musical instruments, or stored in a Standard MIDI File for later playback. In either case, the resulting performance will depend on how the receiving device interprets the performance instructions, just as it would in the case of a human performer reading sheet music. The ability to fix, change, add, remove, speed up or slow down any part of a musical performance is exactly why MIDI is so valuable for creating, playing and learning about music.

The Three Parts of MIDI

The original Musical Instrument Digital Interface (MIDI) specification defined a physical connector and message format for connecting devices and controlling them in “real time”. A few years later Standard MIDI Files were developed as a storage format so performance information could be recalled at a later date. The three parts of MIDI are often just referred to as “MIDI”, even though they are distinctly different parts with different characteristics.

1. The MIDI Messages – the software protocol

The MIDI Messages specification (or “MIDI Protocol”) is the most important part of MIDI. The protocol is made up of the MIDI messages that describe the music. There are note messages that tell the MIDI devices what note to play, there are velocity messages that tell the MIDI device how loud to play the note, and there are messages that define how bright, long or short a note will be. There are Program Change messages that tell the MIDI device what instrument to play. So by studying and understanding MIDI messages you can learn how to completely describe a piece of music digitally. Look for information about MIDI messages in the “Control” section of Resources.

2. The physical transports for MIDI

Though originally intended just for use with the MIDI DIN transport as a means to connect two keyboards, MIDI messages are now used inside computers and cell phones to generate music, and transported over any number of professional and consumer interfaces (USB, Bluetooth, FireWire, etc.) to a wide variety of MIDI-equipped devices.

There are many different Cables & Connectors that are used to transport MIDI data between devices. Look for specific information in the “Connect” section of Resources.

MIDI is not slow

The “MIDI DIN” transport causes some confusion because it has specific characteristics which some people associate as characteristics of “MIDI” — forgetting that the MIDI-DIN characteristics go away when using MIDI over other transports (and inside a computer). With computers, a high-speed serial, USB or FireWire connection is more common, and USB MIDI is significantly faster than 5 pin DIN. Each transport has its own performance characteristics that might make some difference in specific applications, but in general the transport is the least important part of MIDI, as long as it allows you to connect all the devices you want to use!

3. The file formats for MIDI files

The final part of MIDI is made up of the Standard MIDI Files (and variants), which are used to distribute music playable on MIDI players of both the hardware and software variety. All popular computer platforms can play MIDI files (*.mid) and there are thousands of web sites offering files for sale or even for free. Anyone can make a MIDI file using commercial (or free) software that is readily available, and many people do. Whether or not you like a specific MIDI file can depend on how well it was created, and how accurately your synthesizer plays the file… not all synthesizers are the same, and unless yours is similar to that of the file composer, what you hear may not be at all what he or she intended. Look in the “Create” section of Resources for information about how to create and use different MIDI file formats.

Even More MIDI

Many people today see MIDI as a way to accomplish something, rather than as a protocol, cable, or file format. For example, many musicians will say they “use MIDI”, “compose in MIDI” or “create MIDI parts”, which means they are sequencing MIDI events for playback via a synthesizer, rather than recording the audio that the synthesizer creates.

Part 2: MIDI Cables & Connectors

Many different “transports” can be used for MIDI messages. The speed of the transport determines how much MIDI data can be carried, and how quickly it will be received.

5-Pin MIDI DIN

The MIDI DIN transport, which uses a 5-pin “DIN” connector, was developed back in 1983, so it is slow compared to common high-speed digital transports available today, like USB, FireWire, and Ethernet. But MIDI DIN is still found on many MIDI-equipped devices because its speed is perfectly adequate for communicating with a single device. Also, if you want to connect one MIDI device to another (without a computer), MIDI DIN cables are usually needed.

USB and FireWire

Computers are most often equipped with USB and possibly FireWire connectors, and these are now the most common means of connecting MIDI devices to computers (using appropriate adapters). Adapters can be as simple as a short cable with USB or FireWire connectors on one end and MIDI DIN connectors on the other, or as complex as a 19-inch rack-mountable processor with dozens of MIDI and Audio In and Out ports. The best part is that USB and FireWire are “plug-and-play” interfaces, which means they generally configure themselves. In most cases, all you need to do is plug in your USB or FireWire MIDI interface, boot up some MIDI software, and off you go.

With USB technology, devices must connect to a host (PC), so it is not possible to connect two USB MIDI devices to each other as it is with two MIDI DIN devices. (This could change sometime in the future with new versions of USB). USB-MIDI devices require a “driver” on the PC that knows how the device sends/receives MIDI messages over USB. Most devices follow a specification (“class”) that was defined by the USB-IF; Windows and Mac PCs already come with “class compliant” drivers for devices that follow the USB-IF MIDI specification. For more details, see the article on USB in the Connection area of Resources. 

Most FireWire MIDI devices also connect directly to a PC with a host device driver, and the host handles communication between FireWire MIDI devices even if they use different drivers. But FireWire also supports “peer-to-peer” connections, so MMA (along with the 1394TA) produced a specification for transport of MIDI over IEEE-1394 (FireWire), which is available for download on this site (and also part of the IEC-61883 international standard).

Ethernet & WiFi (LAN)

Many people have multiple MIDI instruments and one or more computers (or a desktop computer and a mobile device like an iPad), and would like to connect them all over a local area network (LAN). However, Ethernet and WiFi LANs do not always guarantee on-time delivery of MIDI messages, so MMA has been reluctant to endorse LANs as a recommended alternative to MIDI DIN, USB, and FireWire. That said, there are many LAN-based solutions for MIDI, the most popular being the RTP-MIDI specification which was developed at the IETF in cooperation with MMA Members and the MMA Technical Standards Board. In anticipation of increased use of LANs for audio/video in the future, MMA is also working on recommendations for transporting MIDI using new solutions like the IEEE-1722 Transport Protocol for Time-Sensitive Streams.

Bluetooth

Everything is becoming “mobile”, and music creation is no exception. There are hundreds of music-making software applications for tablets and smart phones, many of which are equipped with Bluetooth “LE” (aka “Smart”) wireless connections. Though Bluetooth is similar to WiFi in that it cannot always guarantee timely delivery of MIDI data, on some devices Bluetooth takes less battery power to operate than WiFi, and in most cases it is less likely to encounter interference from other devices (because Bluetooth is designed for short-distance communication). In 2015 the MMA adopted a recommended practice (specification) for MIDI over Bluetooth LE.

Deprecated Solutions

Sound Cards

It used to be that connecting a MIDI device to a computer meant installing a “sound card” or “MIDI interface” in order to have a MIDI DIN connector on the computer. Because of space limitations, most such cards did not have actual 5-pin DIN connectors on the card, but provided a special cable with 5-pin DINs (In and Out) on one end (often connected to the “joystick port”). All such cards needed “driver” software to make the MIDI connection work, but there were a few standards that companies followed, including “MPU-401” and “SoundBlaster”. Even with those standards, however, making MIDI work could be a major task. Over a number of years the components of the typical sound card and MIDI interface (including the joystick port) became standard on the motherboard of most PCs, but this did not make configuring them any easier.

Serial, Parallel, and Joystick Ports

Before USB and FireWire, personal computers were generally equipped with serial, parallel, and (possibly) joystick ports, all of which have been used for connecting MIDI-equipped instruments (through special adapters). Though not always faster than MIDI DIN, these connectors were already available on computers, which made them an economical alternative to add-on cards, with the added benefit that in general they already worked and did not need special configuration. The high-speed serial ports such as the “mini-DIN” ports available on early Macintosh computers supported communication speeds roughly 20 times faster than MIDI DIN, which also made it possible for companies to develop and market “multiport” MIDI interfaces that allowed connecting multiple MIDI DINs to one computer. In this manner it became possible for the computer to address many different MIDI-equipped devices at the same time. More recent multi-port MIDI interfaces use even faster USB or FireWire ports to connect to the computer.

Tutorial: MIDI and Music Synthesis

An explanation of music synthesis technology and how MIDI is used to generate and control sounds.

This document was originally published in 1995, at a time when MIDI had been used in electronic musical instruments for more than a decade but was still a relatively new technology in the computer industry. All these years later, the PC industry has changed, and some of the explanations herein are now dated (such as references to “internal sound cards”, which have been replaced by external audio interfaces with Analog-to-Digital converters). The explanations of MIDI technology and synthesis methods are still accurate.




Part 4: MIDI Files

Standard MIDI Files (“SMF” or *.mid files)

Standard MIDI Files (“SMF” or *.mid files) are a popular source of music on the web, and for musicians performing in clubs who need a little extra accompaniment. The files contain all the MIDI instructions for notes, volumes, sounds, and even effects. The files are loaded into some form of ‘player’ (software or hardware), and the final sound is then produced by a sound engine that is connected to, or forms part of, the player.

One reason for the popularity of MIDI files is that, unlike digital audio files (.wav, .aiff, etc.) or even compact discs or cassettes, a MIDI file does not need to capture and store actual sounds. Instead, the MIDI file can be just a list of events which describe the specific steps that a soundcard or other playback device must take to generate certain sounds. As a result, MIDI files are very much smaller than digital audio files, and the events are also editable, allowing the music to be rearranged, edited, even composed interactively, if desired.

All popular computer platforms can play MIDI files (*.mid) and there are thousands of web sites offering files for sale or even for free. Anyone can make a MIDI file using commercial (or free) software that is readily available, and many people do, with a wide variety of results.

Whether or not you like a specific MIDI file can depend on how well it was created, and how accurately your synthesizer plays the file… not all synthesizers are the same, and unless yours is similar to that of the file composer, what you hear may not be at all what he or she intended. General MIDI (GM) and GM2 both help address the issue of predictable playback from MIDI Files.

Formats 

The Standard MIDI File format differs from the native MIDI protocol in that the events are time-stamped for playback in the proper sequence.

Standard MIDI Files come in two basic varieties: a Type 1 file, and a Type 0 file (a Type 2 was also specified originally but never really caught on, so we won’t spend any time discussing it here). In a Type 1 file individual parts are saved on different tracks within the sequence. In a Type 0 file everything is merged into a single track.  
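The file type lives in the 14-byte `MThd` header chunk at the start of every SMF, alongside the track count and the timing division. A minimal Python sketch of parsing it (illustrative only; a real SMF reader must also walk the `MTrk` chunks that follow):

```python
import struct

def parse_smf_header(data: bytes):
    """Parse the MThd chunk at the start of a Standard MIDI File.

    Returns (format_type, num_tracks, division); format_type is 0, 1, or 2.
    All multi-byte fields in an SMF are big-endian.
    """
    if data[:4] != b"MThd":
        raise ValueError("not a Standard MIDI File")
    length, fmt, ntrks, division = struct.unpack(">IHHH", data[4:14])
    if length != 6:
        raise ValueError("unexpected MThd length")
    return fmt, ntrks, division

# Build a Type 1 header for testing: 3 tracks, 480 ticks per quarter note.
header = b"MThd" + struct.pack(">IHHH", 6, 1, 3, 480)
```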

Making SMFs

Musical performances are not usually created as SMFs; rather, a composition is recorded using a sequencer (such as Digital Performer, Cubase, or Sonar) that saves MIDI data in its own format. However, most if not all sequencers offer a ‘Save As’ or ‘Export’ to Standard MIDI File.

Compositions in SMF format can be created and played back using most DAW software (Cubase, Logic, Sonar, Performer, FL Studio, Ableton Live, and GarageBand, which uses Type 1 SMFs) and other MIDI software applications. Many hardware products (digital pianos, synths, and workstations) can also create and play back SMF files. Check the manual of the MIDI products you own to find out about their SMF capabilities.

Setup Data

An SMF not only contains regular MIDI performance data (Channelized notes, lengths, pitch bend data, etc.); it should also have data (commonly referred to as a ‘header’) that contains additional set-up data (tempo, instrument selections per Channel, controller settings, etc.) as well as song information (copyright notices, composer, etc.).

How good an SMF will sound (that is, how true to its originally created state) can depend a lot on the header information. The header can exert control over the mix, effects, and even sound editing parameters in order to minimize inherent differences between one soundset and another. There is no standard set of data that you have to put in a header (indeed such data can also be placed in a spare ‘set-up’ bar in the body of the file itself), but generally speaking, the more information you provide for the receiving sound device, the more defined (and so, presumably, the more to your tastes) the results will be.

Depending upon the application you are using to create the file in the first place, header information may automatically be saved from within parameters set in the application, or may need to be manually placed in a ‘set-up’ bar before the music data commences.

Information that should be considered (per MIDI Channel) includes:

  • Bank Select (0=GM) / Program Change #
  • Reset All Controllers (not all devices may recognize this command so you may prefer to zero out or reset individual controllers)
  • Initial Volume (CC7) (standard level = 100)
  • Expression (CC11) (initial level set to 127)
  • Hold pedal (0 = off)
  • Pan (Center = 64)
  • Modulation (0)
  • Pitch bend range
  • Reverb (0 = off)
  • Chorus level (0 = off)

All files should also begin with a GM/GS/XG Reset message (if appropriate) and any other System Exclusive data that might be necessary to setup the target synthesizer. If RPNs or more detailed controller messages are being employed in the file these should also be reset or normalized in the header.
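As a concrete illustration, a set-up block along these lines can be generated programmatically. The Python sketch below (the helper name is mine, not from any spec) emits the GM System On SysEx followed by Bank Select, Program Change, and initial Channel Volume for one Channel:

```python
def setup_events(channel: int, program: int, volume: int = 100):
    """Sketch of a per-Channel set-up block for an SMF header section.

    Returns a list of raw MIDI byte strings, in the order they should
    appear (spaced out over a few ticks in the actual file).
    """
    cc = 0xB0 | channel  # Control Change status for this Channel
    return [
        bytes([0xF0, 0x7E, 0x7F, 0x09, 0x01, 0xF7]),  # GM System On (SysEx)
        bytes([cc, 0, 0]),                # CC0  Bank Select MSB = 0 (GM)
        bytes([cc, 32, 0]),               # CC32 Bank Select LSB = 0
        bytes([0xC0 | channel, program]), # Program Change
        bytes([cc, 7, volume]),           # CC7  Channel Volume
    ]
```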

If you are inputting header data yourself, it is advisable not to clump all such information together but rather to space it out in intervals of 5-10 ticks. Certainly if a file is designed to be looped, having too much data play simultaneously will cause most playback devices to ‘choke’ and throw off your timing.

Download the MIDI 1.0 specification that includes Standard MIDI File details 



The Complete MIDI 1.0 Detailed Specification


Part 3: MIDI Messages

The MIDI Message specification (or “MIDI Protocol”) is probably the most important part of MIDI.

MIDI is a music description language in digital (binary) form. It was designed for use with keyboard-based musical instruments, so the message structure is oriented to performance events, such as picking a note and then striking it, or setting typical parameters available on electronic keyboards. For example, to sound a note in MIDI you send a “Note On” message, and then assign that note a “velocity”, which determines how loud it plays relative to other notes. You can also adjust the overall loudness of all the notes with a “Channel Volume” message. Other MIDI messages include selecting which instrument sounds to use, stereo panning, and more.

The first specification (1983) did not define every possible “word” that can be spoken in MIDI, nor did it define every musical instruction that might be desired in an electronic performance. So over the past 20 or more years, companies have enhanced the original MIDI specification by defining additional performance control messages, and creating companion specifications which include:

  • MIDI Machine Control
  • MIDI Show Control
  • MIDI Time Code
  • General MIDI
  • Downloadable Sounds
  • Scalable Polyphony MIDI

Alternate Applications 

MIDI Machine Control and MIDI Show Control are interesting extensions because instead of addressing musical instruments they address studio recording equipment (tape decks, etc.) and theatrical control (lights, smoke machines, etc.).

MIDI is also being used for control of devices where standard messages have not been defined by MMA, such as with audio mixing console automation.

Different Kinds of MIDI Messages

A MIDI message is made up of an eight-bit status byte which is generally followed by one or two data bytes. There are a number of different types of MIDI messages. At the highest level, MIDI messages are classified as being either Channel Messages or System Messages. Channel messages are those which apply to a specific Channel, and the Channel number is included in the status byte for these messages. System messages are not Channel specific, and no Channel number is indicated in their status bytes.
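In code, splitting a status byte into its message type and Channel is a pair of bit operations, since Channel messages carry the type in the high nibble and the Channel in the low nibble. A small Python sketch (function and return values are my own convention, not from the spec):

```python
def parse_status(status: int):
    """Split a status byte into (message type, Channel).

    For Channel messages the high nibble is the type (0x8 = Note Off,
    0x9 = Note On, 0xB = Control Change, ...) and the low nibble is the
    Channel (0-15 on the wire, shown to users as 1-16). System messages
    occupy 0xF0-0xFF and carry no Channel number.
    """
    if status < 0x80:
        raise ValueError("not a status byte (bit 7 must be set)")
    if status >= 0xF0:
        return ("system", None)
    return (status >> 4, status & 0x0F)
```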

Channel Messages may be further classified as being either Channel Voice Messages, or Mode Messages. Channel Voice Messages carry musical performance data, and these messages comprise most of the traffic in a typical MIDI data stream. Channel Mode messages affect the way a receiving instrument will respond to the Channel Voice messages.

Channel Voice Messages

Channel Voice Messages are used to send musical performance information. The messages in this category are the Note On, Note Off, Polyphonic Key Pressure, Channel Pressure, Pitch Bend Change, Program Change, and the Control Change messages.

Note On / Note Off / Velocity

In MIDI systems, the activation of a particular note and the release of the same note are considered as two separate events. When a key is pressed on a MIDI keyboard instrument or MIDI keyboard controller, the keyboard sends a Note On message on the MIDI OUT port. The keyboard may be set to transmit on any one of the sixteen logical MIDI channels, and the status byte for the Note On message will indicate the selected Channel number. The Note On status byte is followed by two data bytes, which specify key number (indicating which key was pressed) and velocity (how hard the key was pressed).

The key number is used in the receiving synthesizer to select which note should be played, and the velocity is normally used to control the amplitude of the note. When the key is released, the keyboard instrument or controller will send a Note Off message. The Note Off message also includes data bytes for the key number and for the velocity with which the key was released. The Note Off velocity information is normally ignored.
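The three-byte layout described above can be sketched in a few lines of Python (an illustration using raw bytes, not a library API):

```python
def note_on(channel: int, key: int, velocity: int) -> bytes:
    """Three-byte Note On: status 0x9n, key number, velocity."""
    return bytes([0x90 | channel, key, velocity])

def note_off(channel: int, key: int, velocity: int = 64) -> bytes:
    """Three-byte Note Off; the release velocity is normally ignored."""
    return bytes([0x80 | channel, key, velocity])

# Middle C (key 60) on wire Channel 0 (displayed as Channel 1),
# struck fairly hard, then released.
strike = note_on(0, 60, 100)
release = note_off(0, 60)
```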

Aftertouch

Some MIDI keyboard instruments have the ability to sense the amount of pressure which is being applied to the keys while they are depressed. This pressure information, commonly called “aftertouch”, may be used to control some aspects of the sound produced by the synthesizer (vibrato, for example). If the keyboard has a pressure sensor for each key, then the resulting “polyphonic aftertouch” information would be sent in the form of Polyphonic Key Pressure messages. These messages include separate data bytes for key number and pressure amount. It is currently more common for keyboard instruments to sense only a single pressure level for the entire keyboard. This “Channel aftertouch” information is sent using the Channel Pressure message, which needs only one data byte to specify the pressure value.

Pitch Bend

The Pitch Bend Change message is normally sent from a keyboard instrument in response to changes in position of the pitch bend wheel. The pitch bend information is used to modify the pitch of sounds being played on a given Channel. The Pitch Bend message includes two data bytes to specify the pitch bend value. Two bytes are required to allow fine enough resolution to make pitch changes resulting from movement of the pitch bend wheel seem to occur in a continuous manner rather than in steps.
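Recombining the two 7-bit data bytes gives a 14-bit value from 0 to 16,383, with 8,192 meaning no bend. A Python sketch of the encoding and decoding (helper names are mine):

```python
def pitch_bend(channel: int, value: int) -> bytes:
    """Encode a 14-bit pitch bend value (0-16383; 8192 = centered).

    The value is split into a 7-bit LSB and a 7-bit MSB, sent LSB first.
    """
    if not 0 <= value <= 0x3FFF:
        raise ValueError("pitch bend must fit in 14 bits")
    return bytes([0xE0 | channel, value & 0x7F, (value >> 7) & 0x7F])

def decode_bend(msg: bytes) -> int:
    """Recombine the two data bytes into the 14-bit value."""
    return msg[1] | (msg[2] << 7)
```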

Program Change

The Program Change message is used to specify the type of instrument which should be used to play sounds on a given Channel. This message needs only one data byte which specifies the new program number.

Control Change

MIDI Control Change messages are used to control a wide variety of functions in a synthesizer. Control Change messages, like other MIDI Channel messages, should only affect the Channel number indicated in the status byte. The Control Change status byte is followed by one data byte indicating the “controller number”, and a second byte which specifies the “control value”. The controller number identifies which function of the synthesizer is to be controlled by the message. A complete list of assigned controllers is found in the MIDI 1.0 Detailed Specification.

– Bank Select

Controller number zero (with controller 32 as the LSB) is defined as Bank Select. The bank select function is used in some synthesizers in conjunction with the MIDI Program Change message to expand the number of different instrument sounds which may be specified (the Program Change message alone allows selection of one of 128 possible program numbers). The additional sounds are selected by preceding the Program Change message with Control Change messages which specify new values for Controller 0 and Controller 32, allowing 16,384 banks of 128 sounds each.
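The full selection sequence (Bank Select MSB on CC0, Bank Select LSB on CC32, then Program Change) can be sketched as follows in Python (an illustrative helper, not a standard API):

```python
def select_sound(channel: int, bank: int, program: int) -> bytes:
    """Bank Select (CC0 MSB + CC32 LSB) followed by Program Change.

    bank is 0-16383; its two 7-bit halves go out as Controllers 0 and 32.
    """
    cc = 0xB0 | channel
    return bytes([
        cc, 0, (bank >> 7) & 0x7F,   # CC0:  Bank Select MSB
        cc, 32, bank & 0x7F,         # CC32: Bank Select LSB
        0xC0 | channel, program,     # Program Change completes the selection
    ])
```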

Since the MIDI specification does not describe the manner in which a synthesizer’s banks are to be mapped to Bank Select messages, there is no standard way for a Bank Select message to select a specific synthesizer bank. Some manufacturers, such as Roland (with “GS”) and Yamaha (with “XG”), have adopted their own practices to assure some standardization within their own product lines.

– RPN / NRPN

Controller number 6 (Data Entry), in conjunction with Controller numbers 96 (Data Increment), 97 (Data Decrement), 98 (Non-Registered Parameter Number LSB), 99 (Non-Registered Parameter Number MSB), 100 (Registered Parameter Number LSB), and 101 (Registered Parameter Number MSB), extend the number of controllers available via MIDI. Parameter data is transferred by first selecting the parameter number to be edited using controllers 98 and 99 or 100 and 101, and then adjusting the data value for that parameter using controller number 6, 96, or 97.
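As an illustration, here is the controller sequence for the most common Registered Parameter, pitch bend sensitivity (RPN 0,0), sketched in Python. Deselecting the RPN afterwards with the 127/127 “null” value is a common precaution rather than a hard requirement:

```python
def set_pitch_bend_range(channel: int, semitones: int) -> bytes:
    """Set RPN 0,0 (pitch bend sensitivity) via the controller sequence:
    CC101/CC100 select the Registered Parameter Number, CC6 writes the
    coarse value, then CC101/CC100 = 127 deselect the parameter.
    """
    cc = 0xB0 | channel
    return bytes([
        cc, 101, 0,         # RPN MSB = 0
        cc, 100, 0,         # RPN LSB = 0 -> pitch bend sensitivity
        cc, 6, semitones,   # Data Entry (coarse): range in semitones
        cc, 101, 127,       # RPN null: deselect so that later Data
        cc, 100, 127,       #   Entry can't accidentally edit this RPN
    ])
```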

RPN and NRPN are typically used to send parameter data to a synthesizer in order to edit sound patches or other data. Registered parameters are those which have been assigned some particular function by the MIDI Manufacturers Association (MMA) and the Japan MIDI Standards Committee (JMSC). For example, there are Registered Parameter numbers assigned to control pitch bend sensitivity and master tuning for a synthesizer. Non-Registered parameters have not been assigned specific functions, and may be used for different functions by different manufacturers. Here again, Roland and Yamaha, among others, have adopted their own practices to assure some standardization.

Channel Mode Messages

Channel Mode messages (MIDI controller numbers 121 through 127) affect the way a synthesizer responds to MIDI data. Controller number 121 is used to reset all controllers. Controller number 122 is used to enable or disable Local Control (in a MIDI synthesizer which has its own keyboard, the functions of the keyboard controller and the synthesizer can be isolated by turning Local Control off). Controller numbers 124 through 127 are used to select between Omni Mode On or Off, and to select between the Mono Mode or Poly Mode of operation.

When Omni mode is On, the synthesizer will respond to incoming MIDI data on all channels. When Omni mode is Off, the synthesizer will only respond to MIDI messages on one Channel. When Poly mode is selected, incoming Note On messages are played polyphonically. This means that when multiple Note On messages are received, each note is assigned its own voice (subject to the number of voices available in the synthesizer). The result is that multiple notes are played at the same time. When Mono mode is selected, a single voice is assigned per MIDI Channel. This means that only one note can be played on a given Channel at a given time.

Most modern MIDI synthesizers will default to Omni On/Poly mode of operation. In this mode, the synthesizer will play note messages received on any MIDI Channel, and notes received on each Channel are played polyphonically. In the Omni Off/Poly mode of operation, the synthesizer will receive on a single Channel and play the notes received on this Channel polyphonically. This mode could be useful when several synthesizers are daisy-chained using MIDI THRU. In this case each synthesizer in the chain can be set to play one part (the MIDI data on one Channel), and ignore the information related to the other parts.

Note that a MIDI instrument has one MIDI Channel which is designated as its “Basic Channel”. The Basic Channel assignment may be hard-wired, or it may be selectable. Mode messages can only be received by an instrument on the Basic Channel.

System Messages

MIDI System Messages are classified as being System Common Messages, System Real Time Messages, or System Exclusive Messages. System Common messages are intended for all receivers in the system. System Real Time messages are used for synchronization between clock-based MIDI components. System Exclusive messages include a Manufacturer’s Identification (ID) code, and are used to transfer any number of data bytes in a format specified by the referenced manufacturer.

System Common Messages

The System Common Messages which are currently defined include MTC Quarter Frame, Song Select, Song Position Pointer, Tune Request, and End Of Exclusive (EOX). The MTC Quarter Frame message is part of the MIDI Time Code information used for synchronization of MIDI equipment and other equipment, such as audio or video tape machines.

The Song Select message is used with MIDI equipment, such as sequencers or drum machines, which can store and recall a number of different songs. The Song Position Pointer is used to set a sequencer to start playback of a song at some point other than at the beginning. The Song Position Pointer value is related to the number of MIDI clocks which would have elapsed between the beginning of the song and the desired point in the song. This message can only be used with equipment which recognizes MIDI System Real Time Messages (MIDI Sync).
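Since a “MIDI beat” for Song Position Pointer purposes is a sixteenth note (six Timing Clocks), building the message is simple 14-bit packing. A Python sketch (illustrative only):

```python
def song_position_pointer(sixteenths: int) -> bytes:
    """Song Position Pointer (status 0xF2) with a 14-bit position.

    The position counts MIDI beats: one beat = one sixteenth note
    = 6 Timing Clocks. Two data bytes follow the status, LSB first.
    """
    if not 0 <= sixteenths <= 0x3FFF:
        raise ValueError("position must fit in 14 bits")
    return bytes([0xF2, sixteenths & 0x7F, (sixteenths >> 7) & 0x7F])

# Start playback at bar 3 of a 4/4 song: 2 bars * 16 sixteenths each.
msg = song_position_pointer(32)
```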

The Tune Request message is generally used to request that an analog synthesizer retune its internal oscillators. This message is generally not needed with digital synthesizers.

The EOX message is used to flag the end of a System Exclusive message, which can include a variable number of data bytes.

System Real Time Messages

The MIDI System Real Time messages are used to synchronize all of the MIDI clock-based equipment within a system, such as sequencers and drum machines. Most of the System Real Time messages are normally ignored by keyboard instruments and synthesizers. To help ensure accurate timing, System Real Time messages are given priority over other messages, and these single-byte messages may occur anywhere in the data stream (a Real Time message may appear between the status byte and data byte of some other MIDI message).

The System Real Time messages are the Timing Clock, Start, Continue, Stop, Active Sensing, and the System Reset message. The Timing Clock message is the master clock which sets the tempo for playback of a sequence. The Timing Clock message is sent 24 times per quarter note. The Start, Continue, and Stop messages are used to control playback of the sequence.
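The interval between Timing Clock messages follows directly from the tempo, since the clock runs at 24 pulses per quarter note. A quick Python sketch:

```python
def clock_interval_ms(bpm: float) -> float:
    """Milliseconds between Timing Clock messages at a given tempo.

    With 24 clocks per quarter note, at 120 BPM a quarter note lasts
    500 ms, so clocks arrive roughly every 20.8 ms.
    """
    quarter_note_ms = 60_000.0 / bpm
    return quarter_note_ms / 24.0
```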

The Active Sensing signal is used to help eliminate “stuck notes” which may occur if a MIDI cable is disconnected during playback of a MIDI sequence. Without Active Sensing, if a cable is disconnected during playback, then some notes may be left playing indefinitely because they have been activated by a Note On message, but the corresponding Note Off message will never be received.

The System Reset message, as the name implies, is used to reset and initialize any equipment which receives the message. This message is generally not sent automatically by transmitting devices, and must be initiated manually by a user.

System Exclusive Messages

System Exclusive messages may be used to send data such as patch parameters or sample data between MIDI devices. Manufacturers of MIDI equipment may define their own formats for System Exclusive data. Manufacturers are granted unique identification (ID) numbers by the MMA or the JMSC, and the manufacturer ID number is included as part of the System Exclusive message. The manufacturer’s ID is followed by any number of data bytes, and the data transmission is terminated with the EOX message. Manufacturers are required to publish the details of their System Exclusive data formats, and other manufacturers may freely utilize these formats, provided that they do not alter or utilize the format in a way which conflicts with the original manufacturer’s specifications.
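The framing can be sketched in Python as follows (an illustrative helper; the payload shown is made up, though 0x41 is a real one-byte manufacturer ID, assigned to Roland):

```python
def sysex(manufacturer_id: int, payload: bytes) -> bytes:
    """Frame a System Exclusive message: 0xF0, manufacturer ID, data, 0xF7.

    Every byte between the start and the EOX must have bit 7 clear
    (0-127); anything with bit 7 set would be read as a new status byte.
    """
    body = bytes([manufacturer_id]) + payload
    if any(b > 0x7F for b in body):
        raise ValueError("SysEx data bytes must be 7-bit")
    return bytes([0xF0]) + body + bytes([0xF7])

# Hypothetical example payload addressed to manufacturer ID 0x41 (Roland).
msg = sysex(0x41, bytes([1, 2, 3]))
```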

Certain System Exclusive ID numbers are reserved for special protocols. Among these are the MIDI Sample Dump Standard, which is a System Exclusive data format defined in the MIDI specification for the transmission of sample data between MIDI devices, as well as MIDI Show Control and MIDI Machine Control.