
MIDI Forum


Musical Pitch Mapping: Continuous pitch approach for monophonic playing

Jorge
Posts: 1
New Member
Topic starter
 

I would like to discuss the way the MIDI protocol maps musical pitch.
For thousands of years societies have discretized (divided) the musical octave to establish different musical systems, so we have come to assume discretization is the natural way of representing music. But at the same time we have been constantly using the oldest instrument in the world, our voice, which is not discretized in any way.

So my question is: What reasons might prevent us from representing tones merely with their fundamental frequencies? Since we don't really need a 'fretted' voice or a fretted violin to play in tune, one might just prefer to send frequency/pitch MIDI messages from a MIDI controller instead of an arbitrary tone number that must be remapped each time another musical system is used.
Monophonic virtual instruments would benefit from this approach, since there would be no need to send MIDI note-off messages for each tone that is played. There would also be no need for additional pitch-modulation messages, since the performer would modulate pitch the same way vibrato or glissando is played on a violin or when singing. Some new commercial controllers go in this direction.
I don't mean the current MIDI protocol should be modified, but I think a simplified protocol should be added for monophonic playing, so performers are no longer constrained by preset parameters like portamento time, pitch range, vibrato amplitude and frequency...
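To illustrate what I mean by remapping, here is a rough Python sketch of my own (nothing standard) of the usual 12-TET conversion from a raw frequency to a note number plus a pitch-bend offset, assuming a ±2 semitone bend range:

```python
import math

def freq_to_note_and_bend(freq_hz, bend_range=2.0):
    # Continuous pitch in semitones relative to A4 = 440 Hz = MIDI note 69.
    semitones = 69.0 + 12.0 * math.log2(freq_hz / 440.0)
    note = int(round(semitones))              # nearest 12-TET note number
    remainder = semitones - note              # leftover, in semitones
    # 14-bit pitch-bend value, center 8192, assuming a +/- bend_range semitone span.
    bend = int(round(8192 + remainder / bend_range * 8192))
    return note, max(0, min(16383, bend))

# 445 Hz is a slightly sharp A4: about note 69 with bend ~8993.
print(freq_to_note_and_bend(445.0))
```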

 
Posted : 23/03/2023 11:59 am
Jeff McClintock
Posts: 4
Active Member
 

MIDI supports sending pitch information to a playing note.

You can do so for an entire channel via Pitch Bend messages.
You can do so for an individual note/key via MIDI Real-Time Tuning Change messages (a SysEx message).
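Roughly what those two mechanisms look like on the wire, as a quick Python sketch of my own (the SysEx field order is from memory, so double-check it against the MIDI Tuning Standard):

```python
def pitch_bend(channel, value):
    # Channel Pitch Bend: status 0xE0 | channel, then the 14-bit value as LSB, MSB (center = 8192).
    return bytes([0xE0 | (channel & 0x0F), value & 0x7F, (value >> 7) & 0x7F])

def single_note_tuning(key, target_note, fraction, device_id=0x7F, program=0):
    # Real-time single-note tuning change (MIDI Tuning Standard).
    # Retunes one key to target_note plus fraction/16384 of a semitone.
    # Field order is from memory; verify against the MIDI Tuning spec before use.
    return bytes([0xF0, 0x7F, device_id,
                  0x08, 0x02,                 # sub-IDs: MIDI Tuning, single-note change
                  program, 0x01,              # tuning program, number of changes
                  key & 0x7F, target_note & 0x7F,
                  (fraction >> 7) & 0x7F, fraction & 0x7F,
                  0xF7])

# Bend channel 0 up a little, and retune key 60 a quarter-tone sharp.
print(pitch_bend(0, 8192 + 1024).hex(' '))
print(single_note_tuning(60, 60, 8192).hex(' '))
```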

More recently, the MPE standard provides for bending the pitch of any individual note.
Even more recently, MIDI 2.0 allows direct control of a note's pitch right in the note-on message, and also while the note plays.
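For a sense of what that looks like in MIDI 2.0's Universal MIDI Packet, here is a rough sketch of my own bit-packing (treat the field positions as assumptions to verify against the spec):

```python
def midi2_note_on_with_pitch(group, channel, note, velocity, pitch_semitones):
    # 64-bit UMP Note On (message type 4, opcode 9) carrying the Pitch 7.9 attribute
    # (attribute type 3): the note's actual pitch as 7.9 fixed-point semitones.
    # Bit positions follow my reading of the MIDI 2.0 / UMP spec; check before relying on them.
    pitch_7_9 = int(round(pitch_semitones * 512)) & 0xFFFF
    word1 = (0x4 << 28) | ((group & 0xF) << 24) | (0x9 << 20) | ((channel & 0xF) << 16) \
            | ((note & 0x7F) << 8) | 0x03
    word2 = ((velocity & 0xFFFF) << 16) | pitch_7_9
    return word1, word2

# Note 69 played 35 cents sharp, full 16-bit velocity.
w1, w2 = midi2_note_on_with_pitch(0, 0, 69, 0xFFFF, 69.35)
print(f"{w1:08X} {w2:08X}")
```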

MIDI 2.0 is available on Android, macOS and iOS right now.

Microsoft seems to be dragging their feet (I guess a 1 TRILLION dollar valuation does not give them the resources to write a basic device driver in a timely manner).

 
Posted : 23/03/2023 9:31 pm
Clemens Ladisch
Posts: 323
Reputable Member
 

What reasons might prevent us from representing tones merely with their fundamental frequencies?

Interoperability. Both controllers and synthesizers must support it.

For most manufacturers, implementing such a new protocol would be worth it only if it unlocked new capabilities, not if it merely represented the same information in a better way.

 
Posted : 23/03/2023 11:07 pm
Pedro Lopez-Cabanillas
Posts: 154
Estimable Member
 

our voice, which is not discretized in any way

I don't agree. Singers produce discrete frequencies (notes) like any other instrument. Vibrato, glissando, and portamento are ornamentations or effects.

Your proposal is similar to OSC, which for some people was going to be the MIDI killer 😀

The fact is that MIDI is more about music than a mere protocol for machines to talk among themselves. It is a codification of music, a dialect of musical language, so musical symbols (like notes) are at its core.

 
Posted : 27/03/2023 10:34 am