
MIDI Forum


MIDI Polyphonic Expression

16 Posts
9 Users
0 Reactions
22.1 K Views
Paul DeRocco
Posts: 3
Active Member
Topic starter
 

I've read the thread on the MPE working group, and found a couple of draft specs online. I think the proposal is insanely awful. You begin with a MIDI spec that allows the playing of up to 128 notes on each of 16 channels, and after shoehorning a special MPE mode into it, you end up with a crippled spec that allows the playing of up to 15 notes on 1 channel, or a total of 14 notes on 2 channels. And it involves a heap of obscure complexity in controllers that need to talk to such synths.

What everyone wants could much more easily be obtained with the simple addition of three Control Change messages, allowing note pitches to be independent of their numbers, and allowing all controls to be addressed to individual notes. I propose the following language:

PITCH CONTROLS

Control Changes 102 and 103 are defined as the Pitch LSB and MSB controls. Either or both may precede a Note On on the same channel to alter its pitch. If both are specified, they may be in either order. The MSB specifies the semitone pitch, making the note number a mere label by which the note is identified when turning it off, or when addressing a control to it. The LSB provides a bipolar offset, where 0 is 50 cents flat, 64 is neutral, and 127 is almost 50 cents sharp. The Note On consumes the Pitch controls so that they have no further effect.

On output, Pitch MSB is only necessary if the pitch rounded to a semitone doesn't match the note number, and Pitch LSB is only necessary if the pitch has a fractional part.

Pitch controls may also be combined with a High Resolution Velocity Control Change 88 in any order.
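
As a rough illustration of the proposed message flow (my own sketch, not part of the proposal text; the exact values and the CC 102/103 assignments above are assumed), a quarter-tone-sharp middle C could be sent like this:

# Sketch of the proposed Pitch-control prefix (CC 102 = LSB, CC 103 = MSB).
# Illustrative values: channel 1, note number 60 as the label, pitch = semitone 60 + 25 cents.

def note_on_with_pitch(channel, label_note, semitone, cents, velocity):
    """Build the byte sequence: optional Pitch MSB/LSB, then the Note On."""
    status_cc = 0xB0 | (channel & 0x0F)
    status_on = 0x90 | (channel & 0x0F)
    msgs = []
    if semitone != label_note:
        msgs += [status_cc, 103, semitone & 0x7F]           # Pitch MSB: actual semitone
    if cents != 0:
        lsb = max(0, min(127, 64 + round(cents * 128 / 100)))
        msgs += [status_cc, 102, lsb]                        # Pitch LSB: 64 = neutral
    msgs += [status_on, label_note & 0x7F, velocity & 0x7F]  # Note On consumes the prefix
    return bytes(msgs)

print(note_on_with_pitch(0, 60, 60, 25, 100).hex(" "))
# -> b0 66 60 90 3c 64  (Pitch LSB = 96 for roughly +25 cents, then the Note On)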

NOTE CONTROLS

Control Change 104 is defined as the Note control. Its value is interpreted as a note number, and it causes the next message on the same channel to be addressed only to that note, if possible. If that message isn't a Control Change 0 to 31, it also consumes the Note control, so that it has no further effect. If it is a Control Change 0 to 31, it tags the recorded note number so that it remains effective only for the corresponding Control Change LSB; that way, an LSB that follows its MSB does not need a second Note control.

A regular message with no preceding Note control overrides any earlier messages addressed to individual notes, just as a Channel Pressure overrides any earlier Poly Pressures. Typically, a Note control will work with a limited set of other Control Changes (or MSB and LSB in that order), or a Pitch Bend. It should not work with Channel Pressure, since Poly Pressure is available for that purpose. If an NRPN is to be addressed to a note, the Note control must precede the Data, not the NRPN. If a Note control precedes a control that doesn't support it, the Note control should still be consumed, but the following control should be ignored, rather than interpreting it as a mono control.
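
Again as a sketch only (illustrative channel, note, and values, assuming the CC 104 number proposed above), addressing a Control Change or a Pitch Bend to a single note might look like this:

# Sketch of the proposed Note-control prefix (CC 104): the next message on the
# channel is addressed only to the named note, then the prefix is consumed.

CH = 0                                     # MIDI channel 1
CC, BEND = 0xB0 | CH, 0xE0 | CH

# Per-note brightness: Note control, then CC 74 (not a CC 0-31, so the prefix
# is consumed by this single Control Change).
per_note_cc74 = bytes([CC, 104, 64,        # Note control -> note 64
                       CC, 74, 90])        # CC 74 applies to note 64 only

# Per-note pitch bend: Note control, then a Pitch Bend message.
per_note_bend = bytes([CC, 104, 64,        # Note control -> note 64
                       BEND, 0x00, 0x48])  # bend applies to note 64 only

print(per_note_cc74.hex(" "), "|", per_note_bend.hex(" "))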

I believe these changes would provide a framework for everything that people want from MPE, yet allow it all to be done on a single channel, without limiting the number of notes, and without any mode switching. The software changes needed in an instrument would generally be minor:

  • Internal Note On operations would have to carry a separate pitch value instead of using the note number.

  • Controls that need to be made polyphonic would have to record 128 values. But that's already how Pressure works, so the mechanism is already understood.

  • These "prefix" controls would have to be consumed by the messages that they modify. But that's already how High Resolution Velocity works, so the mechanism is already understood (a receiver-side sketch follows after this list).

Any comments are welcome.

 
Posted : 29/12/2016 2:37 pm
Andrew Mee
Posts: 49
Admin
 

Yes, I agree: on first read the MPE spec seems convoluted, and I have yet to hear why it needs to be this way, or why any instrument/controller needs a special MPE mode.
It seems to mean that instruments/controllers are not backwards compatible with existing equipment.

In the thread you are referring to, I proposed using the almost unused RPN spec to handle the x/y/z controls proposed by the MPE spec.
I think one of the MPE ideals was 14-bit resolution for the values. My understanding of your proposal is that it is only 7-bit; is that correct?

 
Posted : 23/01/2017 11:35 pm
JohnG
Posts: 225
Estimable Member
 

Forgive me if I misunderstand your proposal but how do I then ...

Start all notes of a played chord "on pitch", then modify just one note of the chord, changing its pitch, its timbre and its dynamics until note off.
The other notes must continue, as long as held, without modification.

 
Posted : 24/01/2017 2:09 am
Andrew Mee
Posts: 49
Admin
 

Forgive me if I misunderstand your proposal but how do I then ...

Start all notes of a played chord "on pitch", then modify just one note of the chord, changing its pitch, its timbre and its dynamics until note off.
The other notes must continue, as long as held, without modification.

Fair question. I can't speak for this proposal exactly, but I was imagining the following:
Each note number is between 0 and 127.

When I hit a note, I send a Note On message for ch 1, note 64, velocity 100 - which is the current standard:
90 40 64 (in hex)
Using NRPN/RPN style you can then send control messages for a given note.

My proposal was to use RPNs.
Currently an N/RPN is selected by sending an MSB and LSB of the parameter number - at present only RPN MSB 0 is used.

So I was suggesting that MPE messages would go down RPN MSB 1-3 for the x/y/z controls on a key,
with the RPN LSB selecting the note, and the data entry controls (CC 6, 38) sending the movement at 14-bit resolution (similar to pitch bend).

So... say you move your finger left and right on note 64, it would send:
B0 65 01 <- RPN MSB = 1 (x movement; y would be 2, etc.)
B0 64 40 <- RPN LSB = 40 hex (note 64)
B0 06 xx <- the MSB of the 14-bit data
B0 26 xx <- the LSB of the 14-bit data

This way each key is truly polyphonic, including in its movement.

The other benefit of this (vs. MPE) is that you don't need special MPE modes: if the device can't cope, it will just ignore the messages and still play the note (like channel aftertouch), and we don't have to ruin the rest of the MIDI channels. For those of us who prefer DAW-less setups this is a big issue, as we need as many individual MIDI channels as we can get.

However, maybe I have missed something in the MPE spec, and maybe my ideas don't cut it either, but I would like to see the flaws in this vs. MPE. I would hate to think MPE gets rubber-stamped just because it's already out in public.

 
Posted : 26/01/2017 6:26 pm
Jeff McClintock
Posts: 4
Active Member
 

>I've read the thread on the MPE working group, and found a couple of draft specs online. I think the proposal is insanely awful.

Insanely awful is an understatement, especially considering the fact that MIDI *already* has a per-note tuning message (14-bit resolution). I am using it in my VSTi synths to support microtuning, stacked unison, MIDI guitars, etc.

https://www.midi.org/specifications/item/the-midi-1-0-specification

[SINGLE NOTE TUNING CHANGE (REAL-TIME)]
The single note tuning change message (Exclusive Real Time sub-ID#1 = 08) permits on-the-fly adjustments to any tuning
stored in the instrument's memory. These changes should take effect immediately, and should occur without audible artifacts
if any affected notes are sounding when the message is received.
F0 7F 08 02 tt ll [kk xx yy zz] F7
F0 7F Universal Real Time SysEx header
ID of target device
08 sub-ID#1 (MIDI Tuning)
02 sub-ID#2 (note change)
tt tuning program number (0 – 127)
ll number of changes (1 change = 1 set of [kk xx yy zz])
[kk] MIDI key number
[xx yy zz] frequency
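
As a worked example (my own bytes, not quoted from the spec): retuning key 60 up a quarter-tone on tuning program 0, addressed to all devices. The frequency data is the base semitone plus a 14-bit fraction of a semitone:

# Worked example of the Single Note Tuning Change (Universal Real Time SysEx).
# xx = base semitone, yy zz = 14-bit fraction of a semitone above it.

fraction = round(0.5 * (1 << 14))        # 0.5 semitone (+50 cents) -> 8192
msg = bytes([
    0xF0, 0x7F,              # Universal Real Time SysEx header
    0x7F,                    # device ID (7F = all devices)
    0x08, 0x02,              # MIDI Tuning, note change
    0x00,                    # tt: tuning program 0
    0x01,                    # ll: one change follows
    60,                      # kk: key 60
    60,                      # xx: nearest semitone at or below target
    (fraction >> 7) & 0x7F,  # yy
    fraction & 0x7F,         # zz
    0xF7,
])
print(msg.hex(" "))          # f0 7f 7f 08 02 00 01 3c 3c 40 00 f7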

Likewise MIDI *already* supports per-note controllers. I am using them to provide, for example, per-note panning (very cool).

[UNIVERSAL REAL TIME SYSTEM EXCLUSIVE]
KEY-BASED INSTRUMENT CONTROL
F0 7F 0A 01 0n kk [nn vv] .. F7
F0 7F Universal Real Time SysEx header
ID of target device (7F = all devices)
0A sub-ID#1 = "Key-Based Instrument Control"
01 sub-ID#2 = 01 Basic Message
0n MIDI Channel Number
kk Key number
[nn,vv] Controller Number and Value
F7 EOX
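
And a worked example of that per-note control (again my own bytes, for illustration): pan key 64 on MIDI channel 1 hard left, addressed to all devices:

# Worked example of Key-Based Instrument Control: per-note pan.

msg = bytes([
    0xF0, 0x7F,   # Universal Real Time SysEx header
    0x7F,         # device ID (7F = all devices)
    0x0A, 0x01,   # Key-Based Instrument Control, Basic Message
    0x00,         # 0n: MIDI channel 1
    64,           # kk: key number
    10, 0,        # [nn vv]: CC 10 (pan), value 0 (hard left)
    0xF7,         # EOX
])
print(msg.hex(" "))   # f0 7f 7f 0a 01 00 40 0a 00 f7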

The specs are here on the MIDI website. How the heck is an entire working group proceeding without knowing this???

 
Posted : 07/02/2017 3:03 pm
Miëtek Bak
Posts: 2
New Member
 

I was only able to find early drafts of the MPE specification: MPE v1.10, dated 2015-09-21, and MPE v1.25a, dated 2015-12-15.

The latter document says, "The MPEWG home page, containing the most recent specification, is available to MMA members only at members.midi.org/mpewg-home.shtml." This is strange for a number of reasons:

  • The MPE spec doesn't appear to be listed on the Specs page;
  • Accessing the above URL requires sending a username and password over an unencrypted connection;
  • My username and (throw-away) password are not accepted.

Does anyone have access to the current MPE spec?

 
Posted : 12/02/2017 10:21 pm
Clemens Ladisch
Posts: 323
Reputable Member
 

midi.org is the homepage for both The MIDI Association (TMA) and the MIDI Manufacturers Association (MMA).

You are a TMA member but not an MMA member.

 
Posted : 13/02/2017 1:13 am
Miëtek Bak
Posts: 2
New Member
 

Thanks for clearing that up. I'm not a manufacturer; I'm a programmer, and I'd like to write software that communicates with MPE-enabled hardware. It's not clear to me why I should have to reverse-engineer the current version of the spec, instead of simply being able to read it.

 
Posted : 13/02/2017 8:18 pm
JohnG
Posts: 225
Estimable Member
 

It's only my guess (like you, not being a member of the MMA), but I suspect the reason is commercial.

The MMA is a club for "manufacturers".
I suspect that the club members are expected to make a contribution toward the existence/running of the MMA.
Those club members get first access, due to their contributions, to new material.
Those of us who want it all free have to wait a while.
The MIDI specification only very recently became free to download.

Perhaps those expert members who give their time and bear the expense of participating in the creation of new standards should expect some recompense?
Manufacturers should be given the opportunity to implement the new standards in their hardware/software before those standards are released to the general public.
Or am I totally wrong?
Unlike e.g. TCP/IP or HTML, MIDI has never been an open standard. It has always belonged to the MMA.
I had to pay for my copies of the MIDI specification and for GM2.

But it's only my point of view.

 
Posted : 14/02/2017 4:46 am
Andrew Mee
Posts: 49
Admin
 

>I've read the thread on the MPE working group, and found a couple of draft specs online. I think the proposal is insanely awful.

Insanely awful is an understatement, especially considering the fact that MIDI *already* has a per-note tuning message (14-bit resolution). I am using it in my VSTi synths to support microtuning, stacked unison, MIDI guitars, etc.

....

The issue with using SysEx is that it is a lot of data, and (from my understanding) while a unit is receiving SysEx it should not be sent things like MTC, Active Sensing, or any other buffered data on other channels. It's not good for continuous streams of data, especially on a busy MIDI port. Most hardware I have seen will stop what it is doing upon receiving SysEx data and wait for completion.

 
Posted : 16/02/2017 2:55 am
MMA
Posts: 25
Admin Registered
 

Maybe I can help reduce some of the confusion in this thread...

>>> "The MPEWG home page, containing the most recent specification, is available to MMA members only at members.midi.org/mpewg-home.shtml."

FYI, that link is to the MMA's specification development site. That is where specs that are not yet adopted are stored, for MMA Members to discuss. When they are adopted they become available to the public on this website (www.midi.org). Only companies (hardware and software) that are paid members of the MMA have access to those files.

>>> It's not clear to me why I should have to reverse-engineer the current version of the spec, instead of simply being able to read it.

MMA does not publish specs until they have been adopted by a vote of MMA Members (and our partners in Japan, AMEI). The "current version" referred to in that link is the current draft for MMA Members to discuss... it is not an adopted spec and is not intended for use in any commercial products.

Roli published some early drafts on their own before coming to MMA with the proposal, and some companies may have released products based on those drafts, but there are no products that comply with the MMA MPE Specification yet because the MMA MPE Specification has not been finished yet. When it is finished it will be published here (www.midi.org) and then anyone can read it.

>>> while a unit is receiving SysEx it should not be sent things like MTC, Active Sensing, or any other buffered data on other channels.

There are "Real-time" and Non-real-time" SysEx Messages, and the Real-time ones are intended to used in a performance, i.e. without stopping the handling of any other MIDI messages.

- Forum Admin

 
Posted : 16/02/2017 5:19 pm
Geert Bevin
Posts: 1
Admin
 

MPE has been designed to be as minimal a leap as possible for existing product developers and existing MIDI users.

If you play an expression source that doesn't have per-note capabilities (like most traditional keyboards), all the messages are backwards compatible.

In short:
Each finger becomes a traditional single channel MIDI device. Existing messages of the MIDI 1.0 spec (note on/off, pitchbend, channel pressure, CC74) all function the same way and become assigned to a particular finger for the duration of a note on a particular MIDI channel.

(most controllers capable of generating the appropriate real-time expression data naturally gravitated to this model before the MPE spec but with slight variations: Continuum, Eigenharp, Seaboard, LinnStrument, ...)
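
To make the "each finger is a channel" idea concrete, a single note gesture under this model might look like the following sketch (channel and values are purely illustrative, not normative):

# One MPE-style note gesture: every message for this finger goes out on its
# own member channel (channel 2 here, chosen only for illustration).

CH = 1                                     # 0-based -> MIDI channel 2
note_on  = bytes([0x90 | CH, 60, 100])     # finger lands: note 60, velocity 100
bend     = bytes([0xE0 | CH, 0x00, 0x48])  # finger slides: per-note pitch bend
timbre   = bytes([0xB0 | CH, 74, 90])      # finger moves forward: per-note CC74
pressure = bytes([0xD0 | CH, 80])          # finger presses: channel pressure
note_off = bytes([0x80 | CH, 60, 0])       # finger lifts

for m in (note_on, bend, timbre, pressure, note_off):
    print(m.hex(" "))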

This leaves two issues to be solved:

* global messages like for a pitchbend wheel, sustain pedal, sostenuto pedal, breath, ... need to apply to all fingers and are put on a separate main channel so that their intent is clear and can be properly handled in DAWs (Logic Pro X has already been supporting this for years in their MIDI mono mode with common base channel)
* devices need to agree upon a number of things: which channels to listen on, what the pitchbend range is and whether MPE is active or not. This is planned to be captured in an RPN for convenience. It's still completely possible though for users to configure this themselves.

We've received a lot of feedback from product manufacturers and this approach does indeed seem to make it very easy to add MPE support to existing products, and to integrate MPE into an existing MIDI setup without having to change anything. This can be seen by the number of products already claiming MPE support even before a final version of the spec has been released, just by working from old drafts that existed before the MPE Working Group.

As an example, you can very well craft your own synth that responds to MPE controllers by taking a series of analog mono synths. Set them to the same patch and pitchbend range, have each one respond to a dedicated MIDI channel, and you're set.

I hope that clarifies the approach that we've taken with MPE.

 
Posted : 16/02/2017 11:27 pm
Andrew Mee
Posts: 49
Admin
 

MMA does not publish specs until they have been adopted by a vote of MMA Members (and our partners in Japan, AMEI). The "current version" referred to in that link is the current draft for MMA Members to discuss... it is not an adopted spec and is not intended for use in any commercial products.

This is my biggest issue with MIDI specs and the MMA. While this may have been acceptable 30 years ago, most other organisations have moved away from working behind closed doors and shutting out anyone who doesn't have the cash to have a say.

Roli published some early drafts on their own before coming to MMA with the proposal, and some companies may have released products based on those drafts, but there are no products that comply with the MMA MPE Specification yet because the MMA MPE Specification has not been finished yet. When it is finished it will be published here ( http://www.midi.org) and then anyone can read it.

However, there is such a push by Roli and others that it's likely to go through with only minor changes, and as suggested in another post quite a few manufacturers are taking this draft protocol on - just like what happened with the BLE spec. It seems that with the MMA/AMEI it's better to ask for forgiveness than to ask for permission 🙂

In short:
Each finger becomes a traditional single channel MIDI device. Existing messages of the MIDI 1.0 spec (note on/off, pitchbend, channel pressure, CC74) all function the same way and become assigned to a particular finger for the duration of a note on a particular MIDI channel.

This is the part I don't like! I mean that the sending and receiving equipment must be MPE->MPE: no traditional MIDI->MPE or MPE->traditional MIDI. Sure, the equipment may have different modes, but effectively it's a hijack of the existing spec, shoe-horning in a solution. Also, since each MIDI device only has 16 channels, this means that each MPE controller is likely to take up a whole device. So for those of us in the hardware, non-USB world this means no channel sharing between synths... not everything is on a computer!

Don't get me wrong, I like the concept of MPE - just not its current (?) implementation.

This leaves two issues to be solved:

* global messages like for a pitchbend wheel, sustain pedal, sostenuto pedal, breath, ... need to apply to all fingers and are put on a separate main channel so that their intent is clear and can be properly handled in DAWs (Logic Pro X has already been supporting this for years in their MIDI mono mode with common base channel)

Does this mean you now have to send the data x times, once per channel used for MPE? This seems wasteful.

As an example, you can very well craft your own synth that responds to MPE controllers by taking a series of analog mono synths. Set them to the same patch and pitchbend range, have each one respond to a dedicated MIDI channel, and you're set.

While this may be financially practical using VSTs, I can't see this working really well using hardware ;P

I hope that clarifies the approach that we've taken with MPE.

Thanks for the clarification it's appreciated.

 
Posted : 23/02/2017 2:07 pm
Brian Willoughby
Posts: 2
New Member
 

MPE has been designed to be as minimal a leap as possible for existing product developers and existing MIDI users.

If you play an expression source that doesn't have per-note capabilities (like most traditional keyboards), all the messages are backwards compatible.

Except that MPE violates two or three aspects of the MIDI Specifications, for no good reason: use of 14-bit CC messages outside the range set aside for them, defeat of several MIDI Channel Mode 4 conventions, etc. The MPE spec needs a bit of refining and a few alterations before it truly represents a minimal leap that is actually backwards compatible.

In short:
Each finger becomes a traditional single channel MIDI device. Existing messages of the MIDI 1.0 spec (note on/off, pitchbend, channel pressure, CC74) all function the same way and become assigned to a particular finger for the duration of a note on a particular MIDI channel.

Except that CC74 is being used as a 14-bit control, despite the fact that the MIDI spec marks it as being limited to 7 bits. There are still 16 undefined CC numbers available. Why not use CC3, CC9, CC14, CC15, or CC20 through CC31? Embedded and other software systems that use bit masks to distinguish potential 14-bit Control numbers from 7-bit ones will be confused. This is the reason that the original MIDI specification set aside a range for 14-bit MSB+LSB pairs. I see no reason for MPE to deviate from this part of the existing specification when there are still 16 Control numbers available that are currently undefined. Perhaps some of those 16 have been taken by recent additions to MIDI that I missed, but there must be at least two undefined ones that could be used by MPE.
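
To spell out the pairing convention being referred to (a quick sketch of the MIDI 1.0 rule, not of MPE itself): controllers 0-31 are defined as 14-bit MSBs, with controllers 32-63 as the matching LSBs, so CC74 has no LSB partner under the original spec.

# MIDI 1.0 convention: CC 0-31 are 14-bit MSBs, CC 32-63 the matching LSBs
# (LSB number = MSB number + 32).  CC 74 falls outside that range, so a
# 1.0-style parser treats it as a plain 7-bit control.

def lsb_partner(cc):
    """Return the LSB controller paired with a 14-bit MSB, else None."""
    return cc + 32 if 0 <= cc <= 31 else None

for cc in (1, 6, 74):
    print(cc, "->", lsb_partner(cc))   # 1 -> 33, 6 -> 38, 74 -> None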

(most controllers capable of generating the appropriate real-time expression data naturally gravitated to this model before the MPE spec but with slight variations: Continuum, Eigenharp, Seaboard, LinnStrument, ...)

You left off the Soundplane, which demonstrated multi-touch performance expression using MIDI Channel Mode 4. The Soundplane is able to control pitch, volume, and timbre for up to 16 touches, as demonstrated with an Oberheim Matrix-12 synthesizer with no firmware changes since 1984! The best way to encourage MPE is to adopt all aspects of the 1984 MIDI Spec that work!

This leaves two issues to be solved:

* global messages like for a pitchbend wheel, sustain pedal, sostenuto pedal, breath, ... need to apply to all fingers and are put on a separate main channel so that their intent is clear and can be properly handled in DAWs (Logic Pro X has already been supporting this for years in their MIDI mono mode with common base channel)

This is not an issue to be solved - it was solved by the original MIDI spec when Mode 4 defined a global control channel as 1 less than the receiving MIDI device's Basic Channel. If there is an issue, then it is to educate MIDI manufacturers about this part of the spec that has existed since the beginning.

* devices need to agree upon a number of things: which channels to listen on, what the pitchbend range is and whether MPE is active or not. This is planned to be captured in an RPN for convenience. It's still completely possible though for users to configure this themselves.

This part of MPE is the one addition to MIDI that I think is useful. It's even useful enough that it should be separated from MPE and proposed as its own change. Since the beginning of MIDI, there has not been a way for a MIDI transmitter to set the Basic Channel of a MIDI receiver. That's partly a necessary restriction, because if there are multiple devices on a MIDI bus then any command to set the Basic Channel would necessarily affect all devices - that would be a mess. From the beginning, MIDI assumed that users would manually interact with MIDI receivers to set their Basic Channel so that such confusion could be avoided. That said, there might be limited situations where remote abilities to set the Basic Channel could be useful. Perhaps the MPE-suggested RPN could be expanded in some way so that it could address a specific device on a MIDI bus instead of all of them.

We've received a lot of feedback from product manufacturers and this approach does indeed seem to make it very easy to add MPE support to existing products, and to integrate MPE into an existing MIDI setup without having to change anything. This can be seen by the number of products already claiming MPE support even before a final version of the spec has been released, just by working from old drafts that existed before the MPE Working Group.

As an example, you can very well craft your own synth that responds to MPE controllers by taking a series of analog mono synths. Set them to the same patch and pitchbend range, have each one respond to a dedicated MIDI channel, and you're set.

How does this differ from how easy it is for product manufacturers to implement MIDI Channel Mode 4, with its per-note expression as well as defined global controls? As mentioned above, the Soundplane can control a Matrix-12 with full expression. Although Oberheim is back (yay!) as a MIDI Manufacturer, they probably do not have the means to go back and rewrite the 1984 firmware in the Matrix-12 to implement MPE, but that's a moot point because it already works.

I hope that clarifies the approach that we've taken with MPE.

It is unclear what MPE adds to MIDI beyond what has already been available since 1984 in MIDI Channel Mode 4. Granted, MPE suggests a way to set the Basic Channel remotely, and adds polyphonic abilities to each channel to go beyond the 16-note limitation. However, MPE admits that polyphonic notes sacrifice the ability to support per-note expression, and so I do not see the point of that feature. Besides, who has more than 16 fingers, much less 10? Remote changing of the Basic Channel is perhaps interesting, but is fraught with its own problems. The ability of MPE to layer multiple touch channel zones is interesting, and that might be something to clarify - however, I believe that MIDI Channel Mode 4 also supports this same zone concept by having multiple Basic Channels in a device with virtual instruments or voice clusters.

 
Posted : 15/03/2017 2:38 pm