I've looked through the Universal MIDI Packet and MIDI 2.0 Protocol specification (M2-104-UM-1.0), and a few issues came to mind:
1. In the Control Change message, the CC number ('index') field occupies a different position than in the Registered (RPN) and Assignable (NRPN) Controller messages. For the CC message, the byte order is 'index, reserved', while for RPN/NRPN messages it is 'bank, index'.
Wouldn't a byte order of 'reserved, index' let the CC message fit the controller pattern better, so that future revisions of the format could add controller banks if needed?
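To make the comparison concrete, here is a rough sketch of how the first 32-bit word of the two message types is packed. This is based on my reading of the UMP Channel Voice layout; treat the exact bit positions as my assumption, not a normative statement:

```python
# Illustrative packing of the first 32-bit word of a 64-bit MIDI 2.0
# Channel Voice packet (message type 0x4). Assumed layout: mt in bits
# 31-28, group in 27-24, opcode in 23-20, channel in 19-16, then
# byte 3 (bits 15-8) and byte 4 (bits 7-0).

def cv_word(opcode, group, channel, byte3, byte4):
    """First word of a MIDI 2.0 Channel Voice packet (assumed layout)."""
    return ((0x4 << 28) | (group << 24) | (opcode << 20)
            | (channel << 16) | (byte3 << 8) | byte4)

def control_change(group, channel, index):
    # opcode 0xB: 'index' sits in byte 3, byte 4 is reserved (0)
    return cv_word(0xB, group, channel, index, 0x00)

def registered_controller(group, channel, bank, index):
    # opcode 0x2: 'bank' sits in byte 3, 'index' in byte 4
    return cv_word(0x2, group, channel, bank, index)

# CC #7 (Volume) on group 0, channel 0:
hex(control_change(0, 0, 7))            # 0x40b00700
# RPN bank 0, index 0 (Pitch Bend Sensitivity):
hex(registered_controller(0, 0, 0, 0))  # 0x40200000
```

Note how the CC index and the RPN index land in different byte positions, which is the asymmetry described above.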
2. I understand that backward compatibility was given very high priority during the development of MIDI 2.0; however, the updated Control Change messages have essentially the same format and function as the Registered (and Assignable) Controllers.
Would it make sense to just translate current Control Changes #0-127 into some reserved bank of Registered Controllers in a future revision of the message format?
3. It would be helpful to include a list of supported Control Changes in the spec, similarly to the list of Per-Note Controllers, detailing their behaviour and translations to MIDI 1.0.
Specifically, the LSB versions (CC #32-61 and #88) are not really needed in the MIDI 2.0 Protocol anymore, so these numbers should be reserved. These 14-bit MSB/LSB combinations could also be mapped to the new 32-bit Control Changes (and vice versa).
The current spec only details processing and translation rules for the Data Entry controllers #6/38, #98/99 and #100/101, and the Bank Select controllers #0/32.
I understand that very few MIDI devices ever used these high-resolution controllers, but MIDI 2.0 Protocol devices could probably support them in MIDI 1.0 mode, especially when connected to a computer over high-speed UMP-compatible transports. That way, older MIDI 1.0 software could still take advantage of the high resolution of MIDI 2.0 devices.
I believe such forward compatibility could be essential for broader adoption of MIDI 2.0.
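As an aside, the 14-bit-to-32-bit mapping suggested in point 3 could be as simple as bit-repetition upscaling. The sketch below is only an illustration of one common approach, not the algorithm the spec prescribes (the published translation rules define their own min-center-max scaling, which additionally preserves the center value):

```python
def combine_14bit(msb, lsb):
    """Join a CC MSB/LSB pair (7 bits each) into one 14-bit value."""
    return ((msb & 0x7F) << 7) | (lsb & 0x7F)

def upscale(value, src_bits, dst_bits):
    """Upscale by bit repetition so min maps to min and max maps to max.

    Illustrative only: this is not the MIDI 2.0 default translation
    algorithm, just one simple way to widen a controller value."""
    shift = dst_bits - src_bits
    out = value << shift
    # Fill the low bits by repeating the source bit pattern
    while shift > 0:
        take = min(src_bits, shift)
        out |= (value >> (src_bits - take)) << (shift - take)
        shift -= take
    return out

upscale(combine_14bit(0x7F, 0x7F), 14, 32)  # 0xFFFFFFFF (max -> max)
upscale(0, 14, 32)                          # 0 (min -> min)
```

The same shape of function works in reverse (truncating the high bits) for the 32-bit-to-14-bit direction.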
Just for information, regarding your point 3: see the recent thread 'Separate channels for Drums'. There are other issues there, and a number of questions outstanding, but the VST for drum pads referred to appears to be using CCs in the range of #32 and up. The poster there mentions #32 only, but I'd guess that other numbers are involved, and that the device may be using MSB/LSB data for a number of settings. I'm awaiting more detail from the OP.
Geoff
> 1. In Control Change message,...
> 2. ...just translate current Control Changes #0-127 into some reserved bank of Registered Controllers...
We considered and specifically decided against both of those proposals. Control Change is a unique message area with a lot of legacy oddities (some are controllers, some are not; there are different resolutions, multi-message RPN and NRPN, mode messages, etc.). We determined that it is best to fix the set of Control Change messages and set it aside as-is. The decision is debatable, but this is where we ended up after weighing the options. Translation is simpler this way, and we don't need to save the space: the Registered Controllers and Assignable Controllers provide a huge set of unassigned controllers. Furthermore, the MIDI 2.0 Protocol has immense room for expansion. For example, we could define a 128-bit message type with 16 bits for opcode/status and 32 bits for index.
> 3. It would be helpful to include a list of supported Control Changes in the spec,..
Sure. Perhaps we can add that as a table in a future revision.
> Specifically, LSB versions (CC #32-61 and #88) are not really needed in the MIDI 2.0 protocol anymore...
You are correct, they are not needed. However, many MIDI 1.0 devices allow these to be used as singular controllers; they are rarely actually used correctly as LSB. We weighed and debated these options and settled on a design that favors treating them as singular controllers rather than as LSB. This also drove our specific choice to mention alternate translations: translators that recognize that MSB and LSB are being used correctly may opt to translate 14 bits to 32 bits.
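A hypothetical sketch of the behavior described above: a translator that treats CC #32-#63 as singular 7-bit controllers by default, and only combines MSB/LSB into a 14-bit value when it has just seen the matching MSB, i.e. when the pair is being "used correctly". The class and naming here are my own invention, not from the spec:

```python
class CCTranslator:
    """Sketch of a MIDI 1.0 -> 2.0 CC translator's pairing decision."""

    def __init__(self):
        self.pending_msb = {}   # CC number (0-31) -> last seen MSB value

    def on_cc(self, cc, value):
        if cc < 32:
            # An MSB arrives first; remember it in case its LSB follows
            self.pending_msb[cc] = value
            return ("msb-7bit", cc, value)
        if 32 <= cc < 64 and (cc - 32) in self.pending_msb:
            # Matching MSB was just seen: combine into one 14-bit value
            msb = self.pending_msb.pop(cc - 32)
            return ("14bit", cc - 32, (msb << 7) | value)
        # No matching MSB pending: treat as a singular 7-bit controller
        return ("singular-7bit", cc, value)

t = CCTranslator()
t.on_cc(7, 100)   # ('msb-7bit', 7, 100)
t.on_cc(39, 5)    # ('14bit', 7, 12805)  -- CC#7/#39 paired correctly
t.on_cc(39, 5)    # ('singular-7bit', 39, 5)  -- no MSB pending
```

A real translator would also need rules for how long an MSB stays "pending" and what intervening messages cancel it; this sketch only shows the singular-versus-LSB decision itself.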
. . . . . . . . . .
We had some difficult decisions to make for backward compatibility and translations. Most often the design had one solution that was clearly correct. Sometimes we had to make judgement calls and pick the design that the working group determined to be the best among several viable solutions. The area of Control Change messages presented challenges. We encourage ongoing use of the most-often-used standard CCs like volume, sustain, etc., but we hope future products will have user interfaces that encourage users to use the Assignable Controllers for other applications.
Mike.
Chair of MIDI 2.0 Working Group
Thank you for the details! I do understand the extent and complexity of the compatibility issues. If the intent is to preserve common Control Changes as they are in the 2.0 Protocol and to encourage the use of Registered/Assignable Controllers in future products, I suggest more guidance be included in the specification, like the aforementioned table of supported and reserved Control Changes. Explicitly defined rules for translating 14-bit controllers would also allow backward compatibility with existing MIDI 1.0 products (though they would pose significant implementation challenges on the 5-pin 31.25 kbit/s connection).
Hopefully the ongoing work on Profiles would also provide candidates for common controllers and SysEx parameters which could be redefined as Registered Controllers.
I feel that a document named "GM3" may shortly be made available?
Or not!
Hi,
Adding on to what Mike said, there are a lot of challenges in the MIDI 1.0 controller message area. There are only 128 CC messages, and some of them are not controllers at all (Mode Messages, for example). So MIDI 1.0 CC messages get overburdened, and each available CC message gets used for many different things in many different products. Also, almost all software allows users to re-assign what a controller is mapped to (MIDI Learn).
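For readers less familiar with the MIDI 1.0 CC space, here is a small illustration of that point: CC #120-#127 are Channel Mode Messages rather than controllers, and several other numbers belong to the multi-message (N)RPN/Data Entry mechanism. The grouping below is my own summary, not a table from any spec:

```python
# CC #120-#127: All Sound Off, Reset All Controllers, Local Control,
# All Notes Off, Omni Off, Omni On, Mono On, Poly On
CHANNEL_MODE = set(range(120, 128))

# Data Entry MSB/LSB, Data Increment/Decrement, (N)RPN select
RPN_MACHINERY = {6, 38, 96, 97, 98, 99, 100, 101}

def cc_kind(cc):
    """Rough classification of a MIDI 1.0 CC number."""
    if cc in CHANNEL_MODE:
        return "channel mode message"
    if cc in RPN_MACHINERY:
        return "(N)RPN / data entry machinery"
    return "controller"

cc_kind(7)    # 'controller' (Volume)
cc_kind(123)  # 'channel mode message' (All Notes Off)
cc_kind(100)  # '(N)RPN / data entry machinery' (RPN LSB select)
```

So out of 128 numbers, a noticeable chunk was never an ordinary controller to begin with, which is part of why the space feels so overburdened.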
So in MIDI 1.0, another device can never really know what the destination of a controller is (unless you are in GM mode which is very similar in concept to a Profile because it defines very specifically how a device responds to different controller messages).
But there are some ways we are looking at addressing the MIDI 1.0 CC area with other parts of MIDI 2.0. Profiles can address this because a Profile can define how a device responds to specific MIDI messages including MIDI 1.0 CC as well as defining new Registered Controller messages.
Here is some of the guidance that is included in the General Rules for Profiles.
3.1.1 Control Change, RPN and NRPN
Control Change messages should not gain new definitions within a Profile but should be reserved for legacy applications.
As a general rule (but not a hard requirement) new Profiles should define RPN messages to control parameters that have not previously been defined.
Manufacturer-Specific Profiles should use NRPN messages to control any function that does not have a Control Change or RPN defined for that function.
3.4.1 Control Change (CC) Messages
MIDI 1.0 has defined default assignment of Control Change messages to specific parameters. Profiles shall observe those legacy assignments. A list of Control Change messages that Profiles should use if they have the associated parameter is attached as Appendix A. At the time of enabling a Profile, the device shall reset these Control Change destination assignments. A Profile might also define that certain parameters should or shall be reset to their default values.
3.6 Profile Categories
There are several common categories of Profiles. The category of a Profile is used to help organize or group similar Profiles into related sets. In particular, this is used to assign RPN numbers to parameters defined by each Profile.
Profiles are organized into 3 categories:
• Feature Profiles or Ancillary Profiles (General MIDI, Show Control, MIDI Visual Control, Note Tuning, MPE, Sample Dump, Mixed Data Types)
• Instrument Profiles (Orchestral Strings, Piano, Organ, Drums)
• Effects Devices (Reverb, Delay, Chorus, Overdrive, Rotary, etc.)
By the way, ALL Registered Controllers are reserved and Profiles will be adopted by the MMA and AMEI to define these new Registered Controllers.
There is already an overall map that we have developed for Registered Controllers, organized by Profile Categories.
Great, I look forward to seeing detailed Profile Specifications!
The General Rules for Profiles has already been published, and there are a number of specific Profiles we are already working on.
What would be great is if the people on this site would suggest the kinds of Profiles they would like to see.
A Draw Bar Organ Profile was already demoed two years ago. So what other Instrument Profiles are necessary?
Piano, Strings, Drums, etc.? How about Pipe Organ/Theatre Organ Profile?
What Effects Profiles are most important to you?
Reverb, Delays, Chorus, Distortion, etc.?
What Feature Profiles are important?
Do we need a replacement for Mackie Control? How about a Split/Layer Profile, Tuning Profiles, Profiles for Pad Controllers to change the Note Assignment of the pads to be able to switch scales on the fly or something else?
Let us know.
I sure would be interested in a profile for digital mixing/audio workstation control surfaces with a physical layout that follows the Selected Channel Strip paradigm. The eponymous Mackie Control Universal surface from 20 years ago was designed in quite a different era, with different UI concepts in mind.
I can also imagine profiles that replicate the physical control surfaces of classic analog synthesizers, or profiles to help emulate real-world wind, brass and string instruments on the keyboard. Something like an OSC switch button that lets you select from sine, square, or triangle waves. Maybe some rules for velocity/aftertouch/mod wheel transitions that approximate the effects of Breath Control on virtual acoustic (VL) synthesizers.
However, general rules come best from real-world examples; it's always good to take a look at existing implementations to see how they work with the current state of technology. There are too many variables involved: even vanilla sample-based engines differ enough between manufacturers that parameters may overlap, but there is no 1:1 feature parity. And proprietary virtual analog and acoustic emulation technologies are even less likely to be bound to a common set of real-time controls...
There are a number of Profiles that are already well under way.
A prototype of a Draw Bar Organ Instrument Profile and a Rotary Speaker effect Profile were demoed in 2019.
> I sure would be interested in a profile for digital mixing/audio workstation control surfaces with the physical layout that follows the Selected Channel Strip paradigm...
> I can also imagine profiles that replicate physical control surfaces of classic analog synthesizers, or profiles to help emulate real-world wind, brass and string instruments on the keyboard.
All great candidates for Profiles, and there have already been discussions about these particular items.
Keep the ideas coming, as it helps us make the case to get people in the MMA to do the work.
If MMA companies know their customers are requesting specific Profiles, it is easier to get them to commit volunteer time to move those projects forward.
In regards to MIDI speed and CCs:
In a different thread, the speed of MIDI 2.0 is discussed, along with the suggestion that syncing to an audio rate would be possible.
When we have virtual MIDI 2.0 drivers that we can use for inter-application MIDI, will these drivers be able to sync to the same rate as the audio interface?
And when using these connections on a computer without additional MIDI 2.0 hardware or a 6MHz clock, can CC's be sample accurate?
> When we have virtual MIDI 2.0 drivers that we can use for inter-application MIDI, will these drivers be able to sync to the same rate as the audio interface?
> And when using these connections on a computer without additional MIDI 2.0 hardware or a 6MHz clock, can CC's be sample accurate?
The simple answer is no. However, it is not impossible to achieve.
The longer answer:
It depends entirely on the operating system: whether audio and MIDI streams can or will be synchronized is up to the OS. FireWire 61883-6 combined audio and MIDI into one stream, so MIDI timing was referenced against an audio clock, but only within the context of that transport. Doing the same on a system-wide basis would be very difficult. Maybe someone will prove me wrong, but I don't believe that is going to happen any time soon.
One of the problems with chasing a system where MIDI is audio sample accurate is that renderers all have different timing characteristics. So even if you can deliver with accurate timing, the system may not know what different latencies and jitter exist in each of the synthesizers that are playing the data. The problem cannot be solved by a MIDI specification alone. It requires a specification that defines both audio and MIDI timing.
MIDI 2.0 can provide reasonably good timing, especially on a high speed transport. The optional JR Timestamps provided by the Universal MIDI Packet allow a message to be tagged with accuracy down to 1/32 of a millisecond. Multiple messages can be tagged with the same time. There is room to design a more comprehensive synchronization system, probably with a central clock and accounting for latencies of all the devices in a system. But it would take a long time to develop the mechanisms and the specification so we chose not to tackle that project for the initial release of MIDI 2.0. It is possible that there might be sufficient interest to drive that project some years down the road.
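For illustration, a JR Timestamp is a 32-bit Utility message carrying a 16-bit time value. The packing below is based on my reading of the UMP spec (message type 0x0, status 0x2); treat the exact field positions as my assumption:

```python
def jr_timestamp(group, ticks):
    """Pack a JR Timestamp Utility message word (assumed layout:
    mt=0x0 in bits 31-28, group in 27-24, status=0x2 in 23-20,
    16-bit timestamp in bits 15-0)."""
    return (0x0 << 28) | (group << 24) | (0x2 << 20) | (ticks & 0xFFFF)

def seconds_to_ticks(seconds):
    # One tick is 1/31,250 s (roughly 1/32 ms), so the 16-bit
    # counter wraps about every 2.1 seconds.
    return round(seconds * 31250) & 0xFFFF

# Tag subsequent messages as occurring 4 ms into the wrap period:
word = jr_timestamp(0, seconds_to_ticks(0.004))  # ticks = 125
```

Multiple messages following the same JR Timestamp share that time value, which is how several simultaneous events can be tagged together as Mike describes.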
Mike.
Chair of MIDI 2.0 Working Group
Hi Mike,
Thank you for the detailed reply. It'll be interesting to see how the various DAWs will keep the signals aligned.
I was interested in the Audio Synchronization to try running MIDI from the Audio thread instead of the scheduler in Max/MSP.
Would there be any jitter/latency difference between a MIDI 2.0-conformant IAC driver and connecting a computer back to itself over a MIDI 2.0 Ethernet connection?
In one of the presentations, there was a mention of a WebMIDI/Electron implementation. Does Node.js perform fast and consistently enough to keep up with standard MIDI 2.0 speed?
Cheers,
Bjorn