
MIDI Forum


What will the MIDI 2.0 speed be?

19 Posts
8 Users
0 Reactions
30 K Views
Dmitry
Posts: 27
Eminent Member
 

MPE seems to be a bodge to get around the limits of only one Poly Aftertouch channel, and it destroys multitimbrality.

It's not as dramatic: the MPE protocol can use whatever number of channels the end user configures with the Registered Parameter Number (RPN) 6 message, the 'MPE Configuration Message' (MCM).
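
For reference, here is a minimal sketch of what that configuration looks like on the wire, assuming the usual MPE convention of selecting RPN 6 on the zone's manager channel and then sending the member-channel count as Data Entry. Function and buffer names are purely illustrative, not from any particular library:

```c
/* Sketch: building the MPE Configuration Message (MCM) as raw MIDI 1.0 bytes.
 * Assumption: RPN 6 is selected via CC 101/100 on the zone's manager channel,
 * then the member-channel count is sent as Data Entry MSB (CC 6). */
#include <stdio.h>
#include <stdint.h>
#include <stddef.h>

/* Fill buf with the MCM for the given manager channel (0-15) and number of
 * member channels (0 disables the zone). Returns the number of bytes written. */
static size_t build_mpe_mcm(uint8_t *buf, uint8_t manager_channel,
                            uint8_t member_channels)
{
    uint8_t status = 0xB0 | (manager_channel & 0x0F);   /* Control Change */
    size_t n = 0;
    buf[n++] = status; buf[n++] = 101; buf[n++] = 0;     /* RPN MSB = 0    */
    buf[n++] = status; buf[n++] = 100; buf[n++] = 6;     /* RPN LSB = 6    */
    buf[n++] = status; buf[n++] = 6;   buf[n++] = member_channels & 0x7F; /* Data Entry MSB */
    return n;
}

int main(void)
{
    uint8_t msg[9];
    /* Lower zone: manager on channel 1 (index 0), 7 member channels. */
    size_t len = build_mpe_mcm(msg, 0, 7);
    for (size_t i = 0; i < len; i++)
        printf("%02X ", msg[i]);
    printf("\n");
    return 0;
}
```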

By having it as part of MIDI 2.0, does that mean no more multitimbral synths?
Shouldn't it be left as a hack/bodge and not implemented as part of the standard?

Absolutely not.

The new MIDI 2.0 protocol includes note-on events with 16-bit velocity and 16-bit per-note articulation (aftertouch, plucking, glide etc.) or other attributes (tuning etc.), as well as per-note pitch bend and per-note continuous controllers with even finer 32-bit resolution, plus a total of 256 MIDI channels (in 16 groups of 16 channels). That's a lot of multitimbrality, if you ask me.
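
To make the per-note resolution concrete, here is a rough sketch of packing a MIDI 2.0 Note On into a 64-bit Universal MIDI Packet, with a 16-bit velocity and an optional 16-bit attribute. The bit layout (message type 0x4, opcode 0x9) reflects my reading of the published UMP format, so verify the field positions against the spec before relying on them:

```c
/* Sketch: packing a MIDI 2.0 Note On into a 64-bit Universal MIDI Packet.
 * Assumed layout: word 0 = type/group/opcode/channel/note/attribute-type,
 * word 1 = 16-bit velocity + 16-bit attribute data. Check against the spec. */
#include <stdio.h>
#include <stdint.h>

typedef struct { uint32_t word[2]; } ump64;   /* illustrative type */

static ump64 ump_note_on(uint8_t group, uint8_t channel, uint8_t note,
                         uint16_t velocity, uint8_t attr_type, uint16_t attr_data)
{
    ump64 p;
    p.word[0] = (0x4u << 28)                          /* message type: MIDI 2.0 Channel Voice */
              | ((uint32_t)(group & 0xF) << 24)
              | (0x9u << 20)                          /* opcode: Note On */
              | ((uint32_t)(channel & 0xF) << 16)
              | ((uint32_t)(note & 0x7F) << 8)
              | attr_type;                            /* 0 = no attribute */
    p.word[1] = ((uint32_t)velocity << 16) | attr_data;
    return p;
}

int main(void)
{
    /* Middle C on group 0 / channel 0, full-scale 16-bit velocity, no attribute. */
    ump64 p = ump_note_on(0, 0, 60, 0xFFFF, 0, 0);
    printf("%08X %08X\n", p.word[0], p.word[1]);
    return 0;
}
```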

On the other hand, MIDI 2.0 is also designed to fully support the existing ecosystem. It uses "Universal MIDI Packets" which can encapsulate new MIDI 2.0 events or the old 1.0 events within each group of 16 channels. This makes it easy for 2.0 devices to maintain compatibility with 1.0 devices on the legacy 5-pin DIN port, which is limited to 16 channels. And since the new protocol is a superset of the old, translation from MIDI 2.0 Protocol events to 1.0/MPE events (and vice versa) should be pretty straightforward, even though some loss of resolution will occur.
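
As a toy example of the resolution gap such a translation has to bridge, the sketch below widens a 7-bit MIDI 1.0 value to 16 bits by bit replication, so 0 maps to 0x0000 and 127 maps to 0xFFFF. The official MIDI 2.0 translation rules define their own up-scaling (in particular around the centre value 64), so treat this purely as an illustration:

```c
/* Sketch: widening a 7-bit MIDI 1.0 value to 16 bits by bit replication.
 * The official MIDI 2.0 translation spec uses its own up-scaling algorithm
 * (notably for centre-sensitive values); this only illustrates the idea. */
#include <stdio.h>
#include <stdint.h>

static uint16_t widen_7_to_16(uint8_t v7)
{
    v7 &= 0x7F;
    return (uint16_t)((v7 << 9) | (v7 << 2) | (v7 >> 5));
}

int main(void)
{
    printf("1.0 velocity 0   -> 2.0 velocity 0x%04X\n", widen_7_to_16(0));
    printf("1.0 velocity 64  -> 2.0 velocity 0x%04X\n", widen_7_to_16(64));
    printf("1.0 velocity 127 -> 2.0 velocity 0x%04X\n", widen_7_to_16(127));
    return 0;
}
```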

More than that, MIDI 2.0 devices are explicitly allowed to keep using the old 1.0 events for their internal synth engine state while implementing the new physical transports, MIDI-CI and the MIDI 2.0 packet-based protocol - that provides manufacturers with an easy adoption path by enhancing existing MIDI 1.0 hardware solutions with support for additional MIDI channels and better timing precision.

In simpler terms, MIDI 1.0 sources can use up to 256 channels for MIDI Polyphonic Expression (MPE) if they choose to implement the Universal MIDI Packet format over USB or Ethernet/AVB connections, and MIDI 2.0 sources will use native per-note events for expression control.

https://en.wikipedia.org/wiki/MIDI#MIDI_2.0
https://www.midi.org/articles-old/details-about-midi-2-0-midi-ci-profiles-and-property-exchange

Seems like some people want to have their cake and eat it.
MIDI 2.0 should be an all-new standard and make a clean break from the old MIDI 1.0.
There is no point in trying to shoehorn 1.0 compatibility into 2.0; it will just make 2.0 worse.

Tell that to the hundreds of millions of professional users who still use MIDI 1.0. There were attempts to specify non-compatible replacement protocols like Zeta ZIPI and CNMAT Open Sound Control, and they never took off with the musical instrument industry or softsynth developers, simply because of the enormous installed base of MIDI 1.0 devices and software.
Full backward compatibility has been a clearly stated goal for the whole 15 years of 'High-Definition Protocol' development.

 
Posted : 25/07/2019 5:12 pm
Dmitry
Posts: 27
Eminent Member
 

As for connections, it all needs to be done through Ethernet; it's much more reliable than USB when you start scaling your setup.

Fully agree here. USB is fine for point-to-point connections like controller-to-computer and instrument-to-computer, but not so much for instrument-to-instrument, controller-to-instrument, or multiple controller-instrument-computer connections in a production network.

The backbone of the transmission interface for the new MIDI 2.0 Protocol shall be:

1) Physical layer - Ethernet over twisted pair with Audio Video Bridging (AVB) / Time-Sensitive Networking (TSN), a low-latency tight-sync switching extension of the Ethernet protocol, based on international standard IEEE 802.1Q.

2) Transport layer - IEEE 1722 Layer 2 AV Transport Protocol (AVTP) and IEEE 1733 Layer 3 Transport Protocol extensions for AVB/TSN, based on the RFC 3550 Real-time Transport Protocol (RTP). MIDI support is already there as the RFC 6295 RTP-MIDI transport (implemented as AppleMIDI in OS X), and the MMA is currently working to update the AVTP MIDI specifications.

3) Configuration - IEEE 1722.1 AVDECC Protocol (Discovery, Enumeration, Connection management and Control)

The AVB suite of standards is marketed as MILAN by the Avnu Alliance, and well-known players like MOTU and PreSonus already offer products for AVB/TSN audio networks - they will no doubt follow with MIDI 2.0 interfaces as soon as the AVTP-MIDI and RTP-MIDI specifications are updated.

 
Posted : 25/07/2019 5:48 pm
Love
Posts: 1
New Member
 

Both Ethernet (AVB) and USB transports provide the bandwidth necessary to handle MIDI 2.0’s denser messages.
Both are on the roadmap for future MIDI expansion, with USB being the priority. Also, Jitter Reduction Timestamps will help with timing.

When 5G wireless networks are available, people will be surprised by the speed. Even MIDI 2.0 is tiny compared to streaming video data.

The reason some people complain about USB speed is that some applications throttle what they send out to MIDI 1.0 speeds, because currently the computer does not know whether the external MIDI Out connection is going to a device connected via USB or 5-pin DIN.
That issue could be addressed in designing a new USB MIDI 2.0 spec. Remember, the current USB MIDI 1.0 spec is 20 years old.

As for 5-pin DIN, Elektron had products several years ago that ran at 10 times the speed of MIDI 1.0 over the same 5-pin DIN connector, with only a slight modification of the UART part. MIDI-CI provides a method for negotiation, so it would be possible for two devices to negotiate a higher bandwidth over 5-pin DIN. MIDI-CI makes a lot of things possible, and we just need to prioritize the work that will get done next.

The forum on MIDI.org is absolutely the right place for people who care about these MIDI issues to provide their input.

Although we can’t actually reveal details of future MMA plans, the people who help prioritize the work in the MMA do read the forums.

Hi, so latency will be improved? I was just trying to improve my setup's sync, and it seems this is where I will have to concede to imperfection in the current standard... which is about 5 ms, up to 7 or 8 ms because of jitter.

My MIDI USB devices are about 1-3 ms.

I suppose if jitter is reduced enough, then a compensation protocol can be effectively implemented on the DAW side to correct the late recording... such as with ASIO latency and plugin latency compensation?

Would sample-accurate latency compensation be possible with MIDI 2.0?

 
Posted : 14/11/2019 10:13 am
Dmitry
Posts: 27
Eminent Member
 

JR Timestamp messages are intended as a prefix to Channel Voice messages (i.e. notes/controllers/pitch bend/pressure), so that several events in a series can be scheduled to sound at a specified time.
They have a resolution of 32 μs (microseconds), i.e. a 1 MHz / 32 = 31.25 kHz clock, similar to the MIDI 1.0 bit rate - see the ADC19 MIDI 2.0 presentation starting from 24:50.
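
As a rough illustration, here is how a sender might convert a scheduling offset in microseconds into the 16-bit JR Timestamp clock value and pack it into a 32-bit utility packet. The 32 μs tick and the 16-bit field are as described above; the exact bit positions are my reading of the UMP utility-message layout (type 0x0, status 0x2), so treat them as an assumption and check the published spec:

```c
/* Sketch: microseconds -> JR Timestamp ticks (1 tick = 32 us, 16-bit wrap)
 * packed into a 32-bit utility UMP. Bit positions are an assumption. */
#include <stdio.h>
#include <stdint.h>

static uint16_t jr_ticks_from_us(uint64_t offset_us)
{
    return (uint16_t)((offset_us / 32u) & 0xFFFFu);   /* wraps every ~2.1 s */
}

static uint32_t ump_jr_timestamp(uint8_t group, uint16_t ticks)
{
    return (0x0u << 28)                          /* message type: utility        */
         | ((uint32_t)(group & 0xF) << 24)
         | (0x2u << 20)                          /* status: JR Timestamp         */
         | ticks;                                /* 16-bit clock, 4 bits reserved */
}

int main(void)
{
    /* Schedule an event 1.5 ms after the sender's current tick. */
    uint32_t word = ump_jr_timestamp(0, jr_ticks_from_us(1500));
    printf("%08X  (%u ticks)\n", word, jr_ticks_from_us(1500));
    return 0;
}
```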

This is vastly better than the MIDI 1.0 beat clock of 24 pulses per quarter note (PPQ), and better than Standard MIDI Files (SMF) which typically use up to 960 PPQ resolution (though the SMF format itself is precise down to either 16384 PPQ or 256 ticks per SMPTE frame, which resolves to either about 30.5 μs (a 32.768 kHz tick rate) at 120 beats per minute (BPM) or 130 μs = 0.13 ms (a 7.68 kHz tick rate) at 30 fps).

It's still not "sample-accurate" though - this IMHO would require either a clock that is an exact multiple of 48 kHz (a 20.8 μs tick), or a much higher clock of 6 MHz (the least common multiple of 48000 and 31250).
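
For anyone who wants to check these resolution figures, here is a small sketch (plain C, nothing MIDI-specific) that prints the tick durations quoted above and the 48 kHz / 31.25 kHz least common multiple:

```c
/* Sketch: checking the resolution arithmetic above - tick durations for
 * various PPQ values at 120 BPM, SMPTE ticks at 30 fps, the JR timestamp
 * clock, and lcm(48000, 31250). */
#include <stdio.h>
#include <stdint.h>

static uint64_t gcd(uint64_t a, uint64_t b)
{
    while (b) { uint64_t t = a % b; a = b; b = t; }
    return a;
}

static uint64_t lcm(uint64_t a, uint64_t b)
{
    return a / gcd(a, b) * b;
}

int main(void)
{
    const double quarter_s = 60.0 / 120.0;   /* quarter note at 120 BPM = 0.5 s */
    const int ppq[] = { 24, 960, 16384 };
    for (int i = 0; i < 3; i++)
        printf("%5d PPQ @ 120 BPM       -> %8.1f us per tick\n",
               ppq[i], 1e6 * quarter_s / ppq[i]);

    printf("256 ticks/frame @ 30 fps   -> %8.1f us per tick\n", 1e6 / (256.0 * 30.0));
    printf("JR timestamp clock 31.25 kHz -> %6.1f us per tick\n", 1e6 / 31250.0);
    printf("lcm(48000, 31250) = %llu Hz\n", (unsigned long long)lcm(48000, 31250));
    return 0;
}
```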

The clock field uses 16 bits while another 4 bits are reserved in the JR Timestamp message, so hopefully higher clocks of 62.5/125/250 kHz - or alternate base clocks like 48/96/192 kHz - could be supported in a future revision.

As for latency, this depends on the transmission medium. The JR Timestamp message only includes the source clock in order to re-arrange events at the sink - but if messages do not arrive within the specified time interval (defined by the receive buffer size), timestamps are useless.
USB should have the best latency when used for a direct-to-PC connection. For a networked setup (AVTP and RTP-MIDI), you would need to use a high-speed connection like 2.5/5/10/25/40G Ethernet and employ switching equipment that supports AVB/TSN protocols in order to minimize congestion delays.

 
Posted : 20/02/2020 3:57 pm