
Solve Problems with MIDI Plug-Ins

A DAW’s MIDI Plug-Ins Can Provide Solutions to Common Problems


In a world obsessed with audio plug-ins, MIDI plug-ins may not seem sexy—but with MIDI’s continued vitality, they remain very useful problem solvers. For an introduction to MIDI plug-ins, please check out the article Why MIDI Effects Are Totally Cool: The Basics.

Although MIDI data processing has been around since at least the heyday of the Commodore 64, the modern MIDI plug-in debuted when Cakewalk introduced the MFX open specification for Windows MIDI plug-ins. Steinberg introduced a wrapper for MFX plug-ins, and also developed a cross-platform VST format. MIDI plug-ins run the gamut from helpful utilities that supplement a program like MOTU Digital Performer, to beat-twisting effects for Ableton Live. After Apple Logic Pro X added Audio Units-based MIDI plug-ins, interest continued to grow. Typically, MIDI plug-ins insert into MIDI tracks similarly to how audio plug-ins insert into audio tracks (Fig. 1).

Figure 1: In Cakewalk by BandLab, you can drag MIDI plug-ins from the browser into a MIDI track’s effects inserts.

Unfortunately, most companies lock their MIDI plug-ins to their own programs. This article therefore takes a general approach that describes typical problems you can solve with MIDI plug-ins. Note that not all programs include plug-ins that provide these functions, nor do all hosts support MIDI plug-ins.

Instant Quantization for Faster Songwriting

MIDI plug-ins are generally real-time and non-destructive (some can work offline as well). If you’re writing a song and craft a great drum groove that suffers from shaky timing, don’t dig into the quantization menu and start editing—insert a MIDI quantizing plug-in, set it for eighth or 16th notes, and keep grooving. You can always do the “real” edits later.

Create Harmonies, Map Drums, and Do Arpeggiations

If your host has a Transpose MIDI plug-in, it might do a lot more than audio transposition plug-ins—like transpose by intervals or diatonically, change scales in the process of transposing from one key to another, or create custom transposition maps that can map notes to drums. The image above shows a variety of MIDI plug-ins; clockwise from upper left are the Digital Performer arpeggiator, Live arpeggiator, Cubase microtuner, Live randomizer, Cubase step sequencer, Live scale constrainer, Digital Performer transposer, and Cubase MIDI echo.

Filter Data

You’re driving two instruments from a MIDI controller, and want one to respond to sustain but not the other…or filter out pitch bend before it gets to one of the instruments. Data filtering plug-ins can implement these applications, but many can also create splits and layers. If the plug-in can save presets, you can instantly call up oft-used functions (like remove aftertouch data).

Re-Map Controllers

Feed your footpedal through a re-mapping plug-in to control breath control parameters, mod wheel, volume, aftertouch, and the like. There may also be an option to thin or randomize control data, or map data to a custom curve.

Process MIDI Data Dynamically

You can compress, expand, and limit MIDI data (clamping low values, high values, or both). For example, a plug-in could raise every velocity below a certain threshold up to that threshold, or compress velocity dynamics by a ratio, such as 2:1. While you don’t need a MIDI plug-in to do these things (you can usually scale velocities, then add or subtract a constant, using traditional MIDI processing functions), a plug-in is more convenient.
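
To make the math concrete, here’s a minimal C++ sketch of the arithmetic such a processor might apply to each incoming velocity. The threshold, ratio, floor, and ceiling values are made up for the example; a real plug-in exposes them as controls.

#include <algorithm>
#include <cstdint>

// Minimal sketch of velocity compression plus limiting; all parameter
// values below are purely illustrative.
uint8_t processVelocity(uint8_t velocity)
{
    const float   threshold = 64.0f;  // compression starts above this velocity
    const float   ratio     = 2.0f;   // 2:1 compression
    const uint8_t floorVal  = 10;     // limit low values up to this
    const uint8_t ceilVal   = 120;    // limit high values down to this

    float v = static_cast<float>(velocity);
    if (v > threshold)
        v = threshold + (v - threshold) / ratio;   // compress the excess over the threshold
    v = std::clamp(v, static_cast<float>(floorVal), static_cast<float>(ceilVal));
    return static_cast<uint8_t>(v + 0.5f);         // round and return a 0-127 value
}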

MIDI Arpeggiation Plug-Ins

Although arpeggiation isn’t as front and center in today’s music as it was when Duran Duran was tearing up the charts, it’s still valid for background fills and ear candy. With MIDI plug-in arpeggiator options like multiple octaves, different patterns, and rhythmic sync, arpeggiation is well worth re-visiting if you haven’t done so lately. Arpeggiators can also produce interesting patterns when fed into percussion tracks.

“Humanize” MIDI Parts so They Sound Less Metronomic

“Humanizer” plug-ins usually randomize parameters, like start times and/or velocities, so the MIDI timing isn’t quite so rigid. Personally, I think they’re more accurately called “how many drinks did the player have” plug-ins, because musicians tend not to create totally random changes. But taking a cue from that, consider teaming humanization with an event filter. For example, if you have a string of 16th-note hi-hat triggers, use an event filter to increase velocities that fall on the first note of a beat, and perhaps add a slight increase to the third 16th note in each series of four. Then if you humanize velocity slightly, you’ll have a part that combines conscious change with an overlay of randomness.
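
As a rough illustration of that idea, here’s a hedged C++ sketch: the accent amounts and jitter range are invented for the example, not taken from any particular plug-in.

#include <algorithm>
#include <cstdint>
#include <random>
#include <vector>

// Hypothetical hi-hat example: accent the first 16th of each beat, lightly
// accent the third 16th, then overlay a small random ("humanize") offset.
struct HatHit { int stepIn16ths; uint8_t velocity; };

void accentAndHumanize(std::vector<HatHit>& hats)
{
    std::mt19937 rng{std::random_device{}()};
    std::uniform_int_distribution<int> jitter(-6, 6);   // illustrative randomization range

    for (auto& h : hats) {
        int posInBeat = h.stepIn16ths % 4;   // 0 = on the beat, 2 = third 16th
        int v = h.velocity;
        if (posInBeat == 0)      v += 20;    // strong accent on the beat
        else if (posInBeat == 2) v += 8;     // lighter accent on the third 16th
        v += jitter(rng);                    // humanize overlay
        h.velocity = static_cast<uint8_t>(std::clamp(v, 1, 127));
    }
}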

Go Beyond Traditional Echo

Compared to audio echo, MIDI echo can be far more flexible. Fig. 2 shows, among other MIDI plug-ins, Cakewalk’s MIDI Echo plug-in.

Figure 2: Clockwise from upper left, Logic Pro X Randomizer and Chord Trigger, Cakewalk Data Filter, Echo, and Velocity processor.

Much depends on a plug-in’s individual capabilities, but many allow variations on the echoes—change pitch as notes echo, do transposition, add swing (try that with your audio plug-in equivalent), and more. But if those options aren’t present, there’s still DIY potential because you can render the track with a MIDI plug-in, then tweak the echoes manually. MIDI echo makes it particularly easy to generate staccato, “dugga-dugga-dugga” synth parts that provide rhythmic underpinnings to many dance tracks; the only downside is that long, languid echoes with lots of repeats eat up synth voices.
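
If you do end up building echoes by hand, the underlying arithmetic is simple. Here’s a hedged C++ sketch of a transposing, decaying echo; the delay spacing, transposition amount, and decay factor are just example values.

#include <cstdint>
#include <vector>

// Illustrative MIDI echo: each repeat is delayed, transposed, and quieter.
struct EchoNote { double timeInBeats; uint8_t note; uint8_t velocity; };

std::vector<EchoNote> makeEchoes(const EchoNote& src)
{
    const int    repeats        = 4;
    const double delayBeats     = 0.5;    // echo spacing in beats
    const int    transposeSemis = -2;     // pitch shift per repeat
    const float  decay          = 0.7f;   // velocity falloff per repeat

    std::vector<EchoNote> out{src};
    float vel   = static_cast<float>(src.velocity);
    int   pitch = src.note;
    for (int i = 1; i <= repeats; ++i) {
        vel   *= decay;
        pitch += transposeSemis;
        if (pitch < 0 || pitch > 127 || vel < 1.0f)
            break;   // stop when the echo falls off the keyboard or fades out
        out.push_back({src.timeInBeats + i * delayBeats,
                       static_cast<uint8_t>(pitch),
                       static_cast<uint8_t>(vel)});
    }
    return out;
}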

Experiment with Adding Human “Feel”

A Shift MIDI plug-in shifts note start times forward or backward. This benefits greatly from MIDI plug-ins’ real-time operation, because you can listen to the changes in “feel” as you move, for example, a snare hit slightly ahead of or behind the beat.

Remove Glitches

“De-glitcher” plug-ins remove duplicate events that hit on the same beat, filter out notes below a specific duration or velocity, “de-flam” notes by moving the start times of multiple out-of-sync notes to their average start time, or offer other options that help clean up a polluted MIDI data stream.

Constrain Notes to a Scale, and Nuke Wrong Notes

Plug-ins that can snap to scale pull errant notes into a defined scale—just bash away at a keyboard (or have a cat walk across it), and there won’t be any “wrong” notes. Placing this after a randomizer can be very interesting, because you get the benefits of randomness while the notes stay constrained to a particular scale.
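
Under the hood, snapping is just a pitch-class lookup. Here’s a minimal C++ sketch that snaps any incoming note down to C major; a real plug-in would let you choose the root and scale type.

#include <array>
#include <cstdint>

// Snap a MIDI note down to the nearest note of C major (a purely
// illustrative choice of scale).
uint8_t snapToCMajor(uint8_t note)
{
    // For each pitch class 0-11, the nearest C major pitch class at or below it
    static const std::array<uint8_t, 12> snapTable = {
        0, 0, 2, 2, 4, 5, 5, 7, 7, 9, 9, 11
    };
    return static_cast<uint8_t>((note / 12) * 12 + snapTable[note % 12]);
}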

Analyze Chords

Put this plug-in on a track, and it will read out the kind of chord made by the track’s notes. With ambiguous chords, the analyzer may display all the voicings it recognizes. Aside from identifying exactly what you played during a spurt of inspiration, an analyzer also simplifies figuring out the chord progressions of MIDI backing tracks.

Add an LFO to Just About Anything

Being able to change MIDI parameters rhythmically can add considerable interest and animation to synth modules and MIDI-controllable signal processors. Although some DAWs let you draw in periodic waveforms (and you can always take the time to create a library of MIDI continuous controller signals suitable for pasting into programs), a Continuous Controller generator provides these same functions in a much more convenient package.

The above functions are fairly common—but scratch beneath the surface, and you’ll find all kinds of interesting MIDI plug-ins, either bundled with hosts or available from third parties. Midiplugins.com lists MIDI plug-ins from various companies. Some of the links have disappeared into internet oblivion and some belong to zombie sites, but there are still plenty of potentially useful MIDI effects. More resources are available at midi-plugins.de (the most current of the sites) and tencrazy.com. Happy data diving!

Mixing with Virtual Instruments: The Basics

DAW software, like Ableton Live, Logic, Pro Tools, Studio One, etc. isn’t just about audio. Virtual instruments that are driven by MIDI data produce sounds in real time, in sync with the rest of your tracks. It’s as if you had a keyboard player in your studio who played along with your tracks, and could play the same part, over and over again, without ever making a mistake or getting tired.

MIDI-compatible controllers (keyboards, drum pads, mixers, control surfaces, and the like) generate data that represents performance gestures, such as playing notes, moving controls, changing levels, and adding vibrato (Fig. 1). The computer then uses this data to control virtual instruments and effects.

Figure 1: Native Instruments’ Komplete keyboards generate MIDI data, but can also edit the parameters of virtual instruments.

Virtual Instrument Basics

Virtual instrument “tracks” are not traditional digital audio tracks, but instrument plug-ins triggered by MIDI data. The instruments exist in software. You can play a virtual instrument in real time, record what you play as data, edit it if desired, and then convert the virtual instrument’s sound to a standard audio track—or let it continue to play back in real time.

Virtual instruments are based on computer algorithms that model or reproduce particular sounds, from ancient analog synthesizers to sounds that never existed before. The instrument outputs appear in your DAW’s mixer, as if they were audio tracks.

Why MIDI Tracks Are More Editable than Audio Tracks

Because virtual instruments are driven by MIDI data, editing the data that drives an instrument changes the part. This editing can be as simple as transposing to a different key, or as complex as changing an arrangement by cutting, pasting, and processing MIDI data in various ways (fig. 2).

Figure 2: MIDI data in Ableton Live. The rectangles indicate notes, while the lines along the bottom show the dynamics of the various notes. All of this data is completely editable.

Because MIDI data can be modified so extensively after being recorded, tracks triggered by MIDI data are far more flexible than audio tracks. For example, if you record a standard electric bass part and decide you should have played the part with a synthesizer bass instead, or used the neck pickup instead of the bridge pickup, you can’t make those changes. But the same MIDI data that drives a virtual bass can just as easily drive a synthesizer, and the virtual bass instrument itself will likely offer the sounds of different pickups.

How DAWs Handle Virtual Instruments

Programs handle virtual instrument plug-ins in two main ways:

  • The instrument inserts in one track, and a separate MIDI track sends its data to the instrument track.
  • More commonly, a single track incorporates both the instrument and its MIDI data. The track itself consists of MIDI data. The track output sends audio from the virtual instrument into a mixer channel.

Compared to audio tracks, there are three major differences when mixing with virtual instruments:

  • The virtual instrument’s audio is typically not recorded as a track, at least initially. Instead, it’s generated by the computer, in real time.
  • The MIDI data in the track tells the instrument what notes to play, the dynamics, additional articulations, and any other aspects of a musical performance.
  • In a mixer, a virtual instrument track acts like a regular audio track, because it’s generating audio. You can insert effects in a virtual instrument’s channel, use sends, do panning, automate levels, and so on.

However, after doing all needed editing, it’s a good idea to render (transform) the MIDI part into a standard audio track. This lightens the load on your CPU (virtual instruments often consume a lot of CPU power), and “future-proofs” the part by preserving it as audio. Rendering is also helpful in case the instrument you used to create the part becomes incompatible with newer operating systems or program versions. (With most programs, you can retain the original, non-rendered version if you need to edit it later.)

The Most Important MIDI Data for Virtual Instruments

The two most important parts of the MIDI “language” for mixing with virtual instruments are note data and controller data.

  • Note data specifies a note’s pitch and dynamics.
  • Controller data creates modulation signals that vary parameter values. These variations can be periodic, like vibrato that modulates pitch, or arbitrary variations generated by moving a control, like a physical knob or footpedal.

Just as you can vary a channel’s fader to change the channel level, MIDI data can create changes—automated or human-controlled—in signal processors and virtual instruments. These changes add interest to a mix by introducing variations.

Instruments with Multiple Outputs

Many virtual instruments offer multiple outputs, especially if they’re multitimbral (i.e., they can play back different instruments, which receive their data over different MIDI channels). For example, if you’ve loaded bass, piano, and ukulele sounds, each one can have its own output, on its own mixer channel (which will likely be stereo).

However, multitimbral instruments generally have internal mixers as well, where you can set the various instruments’ levels and panning (fig. 3). The mix of the internal sounds appears as a stereo channel in your DAW’s mixer. The instrument will likely incorporate effects, too.

Figure 3: IK Multimedia’s SampleTank can host up to 16 instruments (8 are shown), mix them down to a stereo output, and add effects.

Using a stereo, mixed instrument output has pros and cons.

  • There’s less clutter in your software mixer, because each instrument sound doesn’t need its own mixer channel.
  • If you load the instrument preset into a different DAW, the mix settings travel with it.
  • To adjust levels, the instrument’s user interface has to be open. This takes up screen space.
  • If the instrument doesn’t include the effects plug-ins needed to create a particular sound, then use the instrument’s individual outputs, and insert effects in your DAW’s mixer channels. (For example, using separate outputs for drum instruments allows adding individual effects to each drum sound.)

Are Virtual Instruments as Good as Physical Instruments?

This is a question that keeps cropping up, and the answer is…it depends. A virtual piano won’t have the resonating wood of a physical piano, but paradoxically, it might sound better in a mix because it was recorded with tremendous care, using the best possible microphones. Also, some virtual instruments would be difficult, or even impossible, to create as physical instruments.

One possible complaint about virtual instruments is that their controls don’t respond as smoothly as, for example, an analog synthesizer’s, because each control movement has to be converted into digital data that is divided into steps. However, the MIDI 2.0 specification increases control resolution dramatically, so that the steps are so minuscule that rotating a control feels just like rotating the control on an analog synthesizer.

MIDI 2.0 also makes it easier to integrate physical instruments with DAWs, so they can be treated more like virtual instruments, and offer some of the same advantages. So the bottom line is that the line between physical and virtual instruments continues to blur—and both are essential elements in today’s recordings.

Developing MIDI applications on Android

About Android MIDI

In 2015, Android introduced MIDI support. Android has billions of users, which means there are a whole lot of people with MIDI-compatible devices in their pockets!

In this article I’ll explore the most popular types of MIDI applications on Android and how to choose the right MIDI API when developing your own app.

Common types of MIDI app

 The two most popular types of MIDI app are:

1) Apps controlled by MIDI. These are apps which primarily receive MIDI data, such as synthesizers, drum machines and DJing apps.

2) MIDI controller apps. These apps are used to control other apps, external hardware or external software (such as a desktop audio workstation).

Apps controlled by MIDI

These apps receive MIDI data from hardware connected over Bluetooth or USB, or from a virtual MIDI service running on the phone itself.

The virtual analog synthesizer app DRC can be played using an external MIDI keyboard.

 MIDI controller apps

These types of apps are designed to control other apps by sending MIDI to them. Examples include:

  • Using the phone’s touch screen as an X-Y pad which sends MIDI Control Change messages
  • Using the phone’s accelerometer to send MIDI pitch bend information
  • Step sequencing and piano roll apps

Super MIDI Box is a MIDI drum machine and step sequencer 

These types of apps are especially useful on Android because of how easy it is to use Android as a USB-MIDI device.

Android as a USB-MIDI device

A little-known feature of Android is that when connected to a USB host (like a computer) it can become a class compliant USB-MIDI device. This is done in the USB settings.  

Once the USB mode is changed to MIDI, it will appear on the host as a MIDI device with one input channel and one output channel. A MIDI controller app can then be used to control software, such as a Digital Audio Workstation (DAW), on the host.

Inside the app the MIDI device will show up as “Android USB Peripheral” with a single input port and a single output port.

DRC app connected to the Android USB Peripheral MIDI device

Android’s MIDI APIs 

 In order to develop a MIDI application on Android you’ll need to use one or both of the available MIDI APIs. Here’s a summary of the available APIs and their features:

JVM Midi

In Android Marshmallow (API 23) the JVM Midi API was released. It can be accessed using Kotlin or the Java Programming Language. It enables you to enumerate, connect, and communicate with MIDI devices over USB and Bluetooth LE, and with other apps advertising as virtual MIDI services.

Pros

  • Available on over 60% of all Android devices (correct as of July 10th 2019)

Cons

  • Sending MIDI messages to C/C++ code requires use of JNI which can introduce complexity due to multi-threading
  • MIDI data only available through callback mechanism

You must use this API for enumerating and connecting to MIDI devices; however, for reading and writing you can choose between this API and the Native Midi API described below.

Native Midi

In Android Q the Native MIDI API was released. It’s specifically designed for communicating with MIDI devices inside an audio callback which makes it easy to control audio rendering objects, for example a software synthesizer.

This API only handles sending and receiving MIDI data, not MIDI device enumeration or connection. That still needs to be done using the JVM API. (A minimal native-code sketch appears after the pros and cons below.)

Pros

  • Easy to integrate with existing C and C++ code
  • Best possible latency
  • MIDI data can be received and sent inside an audio callback without blocking

Cons

  • Still need to enumerate and connect to MIDI devices in Kotlin or the Java Programming Language using the JVM MIDI API
  • Only available on Android Q and above 
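
Here’s a minimal sketch of how the two APIs fit together: the MidiDevice is opened with the JVM API in Kotlin or Java, handed down through JNI, and then read with the Native MIDI (AMidi) functions. The package, class, and function names are hypothetical, and error handling is omitted for brevity.

#include <amidi/AMidi.h>   // Native MIDI API (Android Q / API 29 and above)
#include <jni.h>
#include <cstdint>

// Hypothetical JNI entry point: the android.media.midi.MidiDevice opened with
// the JVM API is passed down so native code can read from it.
static AMidiDevice*     sDevice     = nullptr;
static AMidiOutputPort* sOutputPort = nullptr;   // an "output port" carries data out of the device

extern "C" JNIEXPORT void JNICALL
Java_com_example_midi_MidiEngine_startReadingMidi(JNIEnv* env, jobject /*thiz*/,
                                                  jobject javaMidiDevice, jint portNumber)
{
    AMidiDevice_fromJava(env, javaMidiDevice, &sDevice);
    AMidiOutputPort_open(sDevice, portNumber, &sOutputPort);
}

// Call this from the audio callback; AMidiOutputPort_receive never blocks.
void drainMidi()
{
    uint8_t buffer[128];
    int32_t opcode;
    size_t  numBytes;
    int64_t timestamp;
    while (AMidiOutputPort_receive(sOutputPort, &opcode, buffer,
                                   sizeof(buffer), &numBytes, &timestamp) > 0) {
        if (opcode == AMIDI_OPCODE_DATA) {
            // buffer[0..numBytes-1] holds raw MIDI bytes, e.g. 0x90 0x3C 0x64
            // ...feed them to a synth engine here...
        }
    }
}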

Getting Started 

You can start developing a MIDI app using Android Studio. You’ll want to check out the following resources:

You should also take a look at the existing MIDI apps on the Google Play Store.

If you are using audio in your app you might want to check out the Oboe library for developing high performance audio apps.

If you have questions about developing MIDI apps on Android feel free to ask them on the android-midi group. Good luck and have fun!

Audiopedia 109 MIDI from Ask Audio

AudioPedia 109: MIDI

Ask Audio’s AudioPedia series is a comprehensive video dictionary of audio terminology. Created by audio expert Joe Albano, this encyclopedia of technical terms is the ultimate audio reference tool. Here are the topics covered and defined in the ninth installment of this authoritative series:

MIDI:

  • MIDI | Musical Instrument Digital Interface (preview below)
  • DIN (Cable) | USB (MIDI)
  • MIDI In | Out | Thru
  • Channel Voice Messages (preview below)
  • Velocity (preview below)
  • Pitch Bend
  • Aftertouch | Pressure
  • Control Change | CC
  • System Exclusive | SysEx

MIDI – Operation | Applications:

  • Quantization (MIDI)
  • Sound (Drum) Replacement
  • MIDI Controller
  • Touch | Velocity Sensitivity
  • Weighted | Unweighted Keyboard
  • Keyboard Split | Layering
  • Alternative MIDI Controller
  • MIDI Drum Kit
  • General MIDI | GM | SMF
  • MIDI Sequencer | Step Sequencer
  • Arpeggiation, Arpeggiator

Almost everyone who works with music encounters MIDI at some point, and many people use it every day. But aside from plugging in your USB keyboard, how much do you really know about what MIDI is and how it works? In the course AudioPedia 109: MIDI, Joe Albano gets to the heart of the MIDI protocol and explains it in a way that’s friendly and enlightening.

 


by Ask Audio

MIDI Channel messages

 

There are 7 MIDI channel voice messages.

  • Note on
  • Note Off 
  • Mono Pressure 
  • Poly Pressure
  • Program Change 
  • Pitch Bend 
  • Control Change 
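
For reference, each of these messages is identified by the high nibble of its status byte, with the low nibble carrying the channel number. Here is a quick sketch (the status-byte values come from the MIDI 1.0 specification; the enum name itself is just for illustration):

#include <cstdint>

// MIDI 1.0 channel voice message status bytes (high nibble); the low nibble
// selects one of 16 channels.
enum ChannelVoiceStatus : uint8_t {
    NOTE_OFF         = 0x80,  // note number, release velocity
    NOTE_ON          = 0x90,  // note number, velocity (velocity 0 doubles as note off)
    POLY_PRESSURE    = 0xA0,  // note number, pressure
    CONTROL_CHANGE   = 0xB0,  // controller number, controller value
    PROGRAM_CHANGE   = 0xC0,  // program number (single data byte)
    CHANNEL_PRESSURE = 0xD0,  // pressure (single data byte, "mono pressure")
    PITCH_BEND       = 0xE0   // LSB, MSB (14-bit value)
};

// Example: Note On, middle C (60), velocity 100, on channel 1 = { 0x90, 60, 100 }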


Velocity

 

Velocity is the force with which a note is played, and it is essential to making MIDI expressive.



AudioPedia Course: MIDI by Joe Albano : AskVideo

The NLE AudioPedia series, our video-based audio encyclopedia, is an invaluable resource for sound engineers, musicians, students, educators and all audio enthusiasts. This ninth installment is about MIDI terminology.

 

MIDI Enhancements in Windows 10

Recently Pete Brown from Microsoft posted a very informative blog post about MIDI Enhancements in Windows 10. 

The blog post covers a number of topics including 

  • UWP MIDI Basics – using MIDI in Windows Store apps
  • New Bluetooth LE MIDI support in Windows 10 Anniversary Update
  • The Win32 wrapper for UWP MIDI (making the API accessible to desktop apps)
  • MIDI Helper libraries for C# and PowerShell


“We’re happy to see Microsoft supporting the Bluetooth MIDI spec and exposing it to Windows developers through a simplified API. Using the new Win32 wrapper for the UWP MIDI API, we were able to prototype Bluetooth MIDI support very quickly.   At Cakewalk we’re looking ahead to support wireless peripherals, so this is a very welcome addition from Microsoft.”

by Noel Borthwick, CTO, Cakewalk

There is also a nice explanation in Pete’s article of how RPNs and NRPNs work and their data structure. 

With the release of Windows 10, all three of the major operating system companies (Apple, Google, and Microsoft) have developed support for the MIDI Manufacturers Association standard for MIDI over Bluetooth Low Energy, which can be found in the specs section of this site and is available for download by MIDI Association members at no charge.

Arduino MIDI Output Basics

Introduction

The Arduino UNO is a popular open-source microcontroller that, in many respects, is a perfect complement to the extensible nature of the Musical Instrument Digital Interface (MIDI) protocol. Microcontroller platforms such as Arduino, Teensy, and others make it relatively easy to develop custom MIDI controllers that respond to light, pressure, sound, and many other forms of input. In this way, microcontrollers provide a unique way to expand the possibilities of MIDI into the physical realm. MIDI musicians can now envision and create an astounding array of custom hardware devices, from custom controllers to algorithmic composers and beyond.

Note that this article focuses on the basics of MIDI output on an Arduino UNO. Future articles will cover MIDI input on the Arduino and Teensy platforms as well as the use of potentiometers, switches, and other components for real-time MIDI control.

Transmitter Circuit

It is necessary to utilize a MIDI interface in order to send MIDI messages from an Arduino to a synthesizer, Digital Audio Workstation, or other MIDI device. Fortunately, it is easy (and inexpensive) to create a simple circuit that can handle MIDI output. The circuit can be mocked up on a solderless breadboard for experimentation or a permanent version can be soldered to solderboard. Users who are new to electronics might want to consider a commercial version such as the SparkFun MIDI Shield, offered at a nominal cost by SparkFun Electronics and other vendors.

As is evident in Figure 1 (a circuit documented in David Miles Huber’s The MIDI Manual), the circuit for MIDI output is relatively simple and consists of:

  • a connection from the 5V pin of an Arduino through a 220-ohm resistor to pin 4 of a standard MIDI DIN jack,
  • a connection from the GND pin of an Arduino to pin 2 of a MIDI DIN jack,
  • a connection from the TX pin of an Arduino through a 220-ohm resistor and 7404 hex inverter to pin 5 of a MIDI DIN jack. 

Figure 2 demonstrates one way that the transmitter circuit could be configured on a solderless breadboard. Note that the top rail of the solderless breadboard is connected to the 5V pin on the Arduino and the bottom rail is connected to the Arduino GND pin.

MIDI Output Sketch: “Old School” Approach

While it is generally more convenient to use a MIDI library to program MIDI sketches on an Arduino, we will start with a low-level “pure” sketch in order to demonstrate how MIDI bytes are handled. If you have ever programmed MIDI applications for Windows, OS X, or Linux you are in for a pleasant surprise because MIDI output can be achieved with just a few lines of code on an Arduino. If you haven’t done so already, be sure to download the Arduino Software (Integrated Development Environment) from https://www.arduino.cc/en/Main/Software. Next, run the Arduino software and select File…New and enter the code that is described in the following paragraphs.

Boilerplate Code

While the basics of C and C++ programming are beyond the scope of this article (and covered in detail in my own Arduino for Musicians as well as numerous other books and online resources), rest assured that the basics of coding a simple MIDI sketch are not unduly difficult. Start by typing the functions shown in Listing 1 which form the basis for all Arduino sketches. Note that the term function is used to describe a block of “functional” code denoted by the function name and opening and closing braces:

Listing 1 Boilerplate functions


void setup()
{

}

void loop()
{

}

The setup() function is called once when your sketch is first run on an Arduino. You will use that block of code (between the opening and closing braces) to establish the serial transmission rate and any other initialization required by the sketch. The loop() function is where the action happens. As the name of the function implies, the loop() function continues to loop throughout the duration of your sketch unless you pause it with a delay() function or some other blocking activity.

Establishing a serial connection

To establish a serial MIDI connection between the Arduino and a MIDI receiver, add the code shown in Listing 2 to the setup() function. The Serial object represents a class (an object or pre-programmed chunk of code) that handles all of the low-level details of establishing and maintaining a serial connection. Note that the Serial class provides a function (typically called a method in the context of a class) titled begin() that takes the baud rate as a parameter. In this example, serial transmission is set to 31250 baud, the expected MIDI transmission rate as per The Complete MIDI 1.0 Detailed Specification (available from the MIDI Association at midi.org).

Listing 2 Setting up a serial connection


void setup()
{
Serial.begin(31250);
}

Writing a MIDI output function

Although there is nothing wrong with writing code for sending MIDI data in the loop() function, custom functions can help to produce code that is extensible and easier to read and maintain. Listing 3 demonstrates one approach to sending Note-On messages. Notice how the function takes three bytes that correspond to the MIDI channel, note, and velocity. The only tricky part of the code is the first line which translates the expected MIDI channel range of 1-16 to the range of Note-On status bytes starting at 0x90 (hexadecimal). The Serial.write() method is used to transmit the status byte and data bytes that form a Note-On message:

Listing 3 Custom function for outputting MIDI Note-On messages


void playMIDINote(byte channel, byte note, byte velocity)
{
//MIDI channels 1-16 are really 0-15
byte noteOnStatus = 0x90 + (channel-1);

//Transmit a Note-On message
Serial.write(noteOnStatus);
Serial.write(note);
Serial.write(velocity);
}

Outputting Notes

Now that a convenience function is available to handle the transmission of Note-On messages, it is easy to fill in some simple code in the loop() function to output a series of notes. Note that this example uses a blocking delay—generally a bad idea for more robust applications—but the use of timers is beyond the scope of this article and would only serve to obfuscate the underlying concept of sending MIDI data via a serial connection. In Listing 4, a “for loop” is used to output MIDI Note-On messages in the range of 60 to 72. The function delays and then transmits the same note with a velocity of zero—which is functionally equivalent to sending a corresponding Note-Off message.

Listing 4 Outputting a chromatic scale


void loop()
{
//Play a chromatic scale starting on middle C (60)
for(int note = 60; note < 72; note++)
{
//Play a note
playMIDINote(1, note, 100);
//Hold the note for 60 ms (delay() used for simplicity)
delay(60);

//Turn note off (velocity = 0)
playMIDINote(1, note, 0);
//Pause for 60 ms
delay(60);
}
}

Uploading a Sketch to the Arduino

The complete sketch is shown in Listing 5. Once you have typed or copied the code into the Arduino Integrated Development Environment (IDE), click the leftmost check button to ensure that the sketch is free from errors. If you are relatively new to programming it might be helpful to remember that C code is case sensitive. It is also easy to omit an opening or closing brace or semicolon which can create any number of error messages. A final step is to connect the Arduino to your computer via a USB cable and select the upload button to upload the code to the Arduino. Assuming you have connected a valid MIDI output circuit, the chromatic scale should be received by any MIDI receiver device that is connected to the circuit via a MIDI cable.

Listing 5 Complete listing


void setup()
{
//Set up serial output with standard MIDI baud rate
Serial.begin(31250);

}

void loop()
{
//Play a chromatic scale starting on middle C (60)
for(int note = 60; note < 72; note++)
{
//Play a note
playMIDINote(1, note, 100);
//Hold note for 60 ms (delay() used for simplicity)
delay(60);

//Turn note off (velocity = 0)
playMIDINote(1, note, 0);
//Pause for 60 ms
delay(60);
}
}

void playMIDINote(byte channel, byte note, byte velocity)
{
//MIDI channels 1-16 are really 0-15
byte noteOnStatus=0x90 + (channel-1);

//Send notes to MIDI output:
Serial.write(noteOnStatus);
Serial.write(note);
Serial.write(velocity);
}

Coding Challenge

That’s it—Arduino MIDI output can be achieved with just a few lines of code! Consider how you might use the boilerplate code in this example to develop a simple algorithmic generator (perhaps using the Arduino random() function) or a sketch that outputs chords, exotic scales, or drum beats.
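
As one possible starting point for the challenge, the sketch below picks random notes from a C major scale using the random() function. It assumes the playMIDINote() function from Listing 5 is included in the same sketch, with Listing 5’s setup() and loop() replaced by the versions shown here; the scale, velocities, and timings are just example values.

//One possible response to the coding challenge: random notes from a C major scale
//Assumes playMIDINote() from Listing 5 is defined elsewhere in the sketch

void setup()
{
//Set up serial output with standard MIDI baud rate
Serial.begin(31250);
//Seed the random number generator from an unconnected analog pin
randomSeed(analogRead(0));
}

void loop()
{
//C major scale starting on middle C (60)
int scale[] = {60, 62, 64, 65, 67, 69, 71, 72};

int note = scale[random(8)];        //Pick a random scale step
int velocity = random(60, 110);     //Vary the dynamics

playMIDINote(1, note, velocity);    //Note on
delay(random(100, 400));            //Random note length
playMIDINote(1, note, 0);           //Note off (velocity = 0)
delay(random(50, 200));             //Random gap before the next note
}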

Next Steps

Although this introduction is necessarily limited, it forms the basis for many exciting possibilities including algorithmic composition, automation, and real-time control. As you will see in future articles, a basic MIDI output circuit can also be used for useful applications such as using a potentiometer to send continuous controller messages to a DAW, or a two-axis joystick as an expressive real-time controller. As noted in the introduction, detailed coverage of Arduino MIDI and audio concepts is provided in Arduino for Musicians: A Complete Guide to Arduino and Teensy Microcontrollers as well as other books and online resources.

Happy coding!

Ask.Audio Article on MIDI Messages

Ask.Audio and Non Linear Educating

Ask.Audio is one of our favorite technology websites and has been a great partner to The MIDI Association. We have worked with Ask.Audio’s parent company, Non Linear Educating, to provide extensive video training right here on the MIDI.org website in our courses section. Here’s a brief description of what NonLinear Educating is all about. 

Nonlinear Educating is an adaptive technology company dedicated to improving the way the world learns. The combination of our powerful and modular video-courseware production & distribution platform and our extensive library of industry leading training courses, has granted us the opportunity to empower a variety of partners from a multitude of industries. The foundationally modular approach to our application infrastructure enables us to rapidly customize instances of our platform to meet the specific needs of our partners. We are agile, adaptive, and committed to developing the most efficient and robust video-learning platform on the internet.

by  Non Linear Educating

 The MIDI Association collaborates with many of the top technology websites including Ask.Audio, Electronic Musician, Harmony Central, Hispasonic, Keyboard Magazine, SonicState, Sound On Sound and more by mutually sharing information and stories about MIDI. 

Joe Albano, a well known author on Ask.Audio recently put together a great article on MIDI messages. We have Ask.Audio’s permission to summarize the content of their articles and then include a link to the full article.

Fig 1 The wonderful world of MIDI

M.I.D.I.—Musical Instrument Digital Interface—shook up the industry when it was introduced in 1983, by separating the player’s performance from the sound of the instrument, and this powerful digital communication protocol has been going strong ever since.

by Joe Albano

Joe’s article covers the basics of the most common MIDI messages.

Channel Voice Messages

The bulk of the musical performance data of a MIDI recording falls into the message category of “Channel Voice Messages” (I’m going to ignore the old-school “Channel” designation here). The 7 Voice Messages are:

  • Note-On
  • Note-Off
  • Monophonic (Channel) Pressure/Aftertouch
  • Polyphonic (Key) Pressure/Aftertouch
  • PitchBend
  • Program Change
  • Control Change (or Continuous Controller) messages, a.k.a. CC messages, of which there are 127

Fig 3 Strings of continuous (streaming) MIDI messages

Below is a link to the full article on Ask.Audio’s website. 



Everything You Need To Know About MIDI Messages But Were Afraid To Ask : Ask.Audio

MIDI. There’s a lot of musicians and producers who don’t know how to use this protocol to improve their musical performances and add more expression to their in

Here is a link to our collection of MIDI and Audio curriculums developed in cooperation with Nonlinear Educating. 



Massive Online Courseware Library : The MIDI Manufacturers Association : NonLinear Educating


Check out our MIDI & Audio courses. Sign up at NLE to get access to hours of free preview videos or take it to the next level and get MIDI Certified.

“What is MIDI” Guide by Paul Lehrman

Join the MIDI Association and download the 21 page booklet “What is MIDI” excerpted from “MIDI for the Professional” by Paul Lehrman and Tim Tully. 


Paul Lehrman showing Herbie Hancock the joy of MIDI.

Paul D. Lehrman, PhD, composer, author, consultant, and educator, is one of the world’s leading experts on MIDI and computer music.

He was the creator of the first all-MIDI album, The Celtic Macintosh (1986), and has had compositions commissioned by Newcomp, the Boston Computer Society, the Society for Small Computers in the Arts, the Audio Engineering Society, and UMass Lowell.

He has written over 500 articles on music technology for publications including Wired, New Media, Keyboard, Electronic Musician, EQ, Piano & Keyboard, Sound on Sound, the Boston Globe, the Boston Phoenix, Technology Illustrated, Studio Sound, Oui, High Times, Millimeter, and Recording Engineer/Producer, and from 1996 to 2008 was the “Insider Audio” columnist for Mix magazine. He was also the web editor for the Technical Excellence and Creativity (TEC) Awards from 1997 to 2015.

He served three terms as executive director of the MIDI Manufacturers Association during which time he contributed to the development of MIDI Time Code, MIDI Machine Control, and General MIDI. 

Paul Lehrman is the Director of the program in Music Engineering at Tufts University.

Paul worked with Eric Singer of the League of Electronic Musical Urban Robots (LEMUR) on Antheil’s Ballet Mécanique, which was performed at Carnegie Hall and at the National Gallery of Art, among many other performances around the world. 


Links to more information on Paul Lehrman on the web



Nintendo’s Wii Remote As A MIDI Controller

Nintendo’s ultra-affordable Wii Remote controller can sense movement in every direction, and even knows at what angle you’re holding it. Its potential as a music controller is unlimited, and you don’t even need a Wii console to use it!



Metal Machine Music

George Antheil’s Ballet Mécanique is all but impossible for human musicians to play. So for his latest presentation of it, Paul D. Lehrman built an orchestra of robots…

MIDI Tutorials

Here is a collection of tutorials on MIDI which is a great place to get started. 

For quick access to a list of all MIDI Messages, see our Reference Tables section.

MIDI Basics

  • Why MIDI Matters – MIDI is like air: it’s all around you, and most of the time you can take it for granted, but if you are a digital musician you probably couldn’t live without it. MIDI is inside musical instruments, computers, tablets, smartphones, stage lighting…
  • An Intro to MIDI – What MIDI does, how to use it, how devices are connected, and more (pdf).
  • “What is MIDI” Guide by Paul Lehrman – Join the MIDI Association and download the 21-page booklet “What is MIDI,” excerpted from MIDI for the Professional by Paul Lehrman and Tim Tully.
  • Tutorial: Benefits of MIDI – Unlike audio formats like MP3 files and CDs, MIDI files contain individual instructions for playing each individual note of each individual instrument. So with MIDI it is actually possible to change just one note in a song, or to re-orchestrate an entire song…
  • Tutorial: MIDI and Music Synthesis – An explanation of music synthesis technology and how MIDI is used to generate and control sounds (pdf).

The History of MIDI

  • The History Of MIDI – We put together a series of articles about the history of electronic music and MIDI.

About MIDI tutorial series

  • About MIDI-Part 1: Overview – MIDI (pronounced mid-e) is a technology that makes creating, playing, or just learning about music easier and more rewarding. Playing a musical instrument can provide a lifetime of enjoyment and friendship…
  • About MIDI-Part 2: MIDI Cables & Connectors – Many different transports can be used for MIDI messages. The speed of the transport determines how much MIDI data can be carried, and how quickly it will be received.
  • About MIDI-Part 3: MIDI Messages – The MIDI Message specification (MIDI Protocol) is probably the most important part of MIDI. MIDI is a music description language in digital (binary) form.
  • About MIDI-Part 4: MIDI Files – Standard MIDI Files (“SMF” or *.mid files) are a popular source of music on the web, and for musicians performing in clubs who need a little extra accompaniment. The files contain all the MIDI instructions for notes, volumes, sounds, and even effects…
  • Basics of USB-MIDI – USB MIDI 2.0 adopted: MIDI 2.0 progress continues with an updated USB specification. As computers have become central components in many MIDI systems, USB has become the most widely used protocol for transporting MIDI data.
  • Audio Wiki MIDI Message diagram – Our friends over at http://en.wikiaudio.org created this simplified diagram of MIDI messages and kindly let us put it up here.
  • Ask.Audio Article on MIDI Messages – Ask.Audio is one of our favorite technology websites and has been a great partner to The MIDI Association.
  • Mixing with Virtual Instruments: The Basics – DAW software, like Ableton Live, Logic, Pro Tools, Studio One, etc. isn’t just about audio. Virtual instruments that are driven by MIDI data produce sounds in real time, in sync with the rest of your tracks.
  • Sound on Sound MIDI Basics, Part 1 – Paul White, editor of Sound on Sound, wrote a series of articles in 1995 for newcomers to MIDI. You might imagine that most SOS readers already have a pretty firm grasp of MIDI, but new readers are joining us every month…
  • Sound on Sound MIDI Basics, Part 2: Sequencing – Paul White’s beginners’ guide to MIDI continues. This month, he explains the concept of MIDI sequencing.

Tutorial: Benefits of MIDI

Unlike audio file formats like MP3 files and CDs, MIDI files contain individual instructions for playing each individual note of each individual instrument. So with MIDI it is actually possible to change just one note in a song, or to re-orchestrate an entire song with entirely different instruments. And since each instrument in a MIDI performance is separate from the rest, it’s easy to isolate individual instruments and study them for educational purposes, or to mute individual instruments in a song so that you can play that part yourself. Here are just some of the tangible advantages of using MIDI for active music-making.

Play with “a band”

Learning to play a musical instrument is one of the most rewarding things people can do. But why play by yourself when you can play along with a band? Standard MIDI Files are available for many popular songs, and when used with a personal computer or digital piano they make it possible to have an entire backing band play along with you at whatever speed (tempo) and in any pitch (key) you desire. MIDI files are perfect for practicing with, as well as for performing when additional musicians are not available.

Correct Your Performance

A MIDI sequencer or a ‘Digital Audio Workstation’ can record your performances for listening at a later time, and even save your performance in Standard MIDI File format for playback on other MIDI systems. This is a great way to evaluate your own progress, or even to study how someone else plays. Better yet, because all MIDI data is editable, you can edit out any imperfections! If you play a wrong note, you can just change it using the sequencer’s editing tools. And if you find you just can’t play fast enough to keep up with the tempo, you can slow it down for recording and speed it back up for playback — without the “Mickey Mouse” effect that normally comes from speeding up a song.

Play Any Instrument

When you use MIDI to make music, you aren’t limited to playing just one instrument. No matter what sort of MIDI Controller (keyboard, guitar, wind, drums, etc.) you actually use, you can make it sound like just about any instrument you can imagine (and some that are only in your imagination). Most digital pianos and other MIDI instruments come with hundreds of different sounds (pianos, trumpets, violins, guitars, basses and more) which you can play yourself or play via a MIDI sequencer to create fully orchestrated music. Now software synthesizers add incredible possibilities and most contemporary music is produced on computers and tablets using softsynths controlled by MIDI controllers. 

Arrange and Orchestrate

Many people enjoy arranging and orchestrating music as much as performing it. There are MIDI files available for songs from every style of music — as well as software programs that generate the basic rhythm and chord patterns that define specific styles — that you can use to create your own arrangements and orchestrations. Just change the instrumentation, add a verse or chorus here or there, even put in your own original phrase or section — all of this is easy to do with MIDI. You can also share your arrangements with other people, who can then rearrange them to fit their own needs — in fact, many people download MIDI files from the Internet for exactly this purpose.

Print Sheet Music

When you are done creating your own performance or arrangement, if you have a personal computer, you can convert MIDI information into musical notation and print out actual sheet music. Even if you can’t play a note, MIDI notation programs often make it possible to place notes on a musical staff using your mouse or computer keyboard. There are notation programs available for every level and pocketbook — from professional engraving to casual use.

Compose Music

If you’ve ever had an original song idea in your head and wished you could have it performed, MIDI is the way to do it. All you need is a MIDI Sequencer plus a MIDI instrument to enter notes with. (You can also use MIDI Notation software to place notes on a musical staff without playing them at all.) You can start with just a melody and then add backing chords, bass, and rhythm later, or add instruments in any order you like. If you make a mistake, you can change it without having to play the part all over again. You can also make entire sections repeat without playing them again. And you can rearrange and re-orchestrate your song as many times as you like.

Sound on Sound MIDI Basics, Part 2: Sequencing

Paul White’s beginners’ guide to MIDI continues.

This month, he explains the concept of MIDI sequencing. To anyone used to playing and recording using traditional methods and skills, the MIDI sequencer is sometimes viewed as little short of cheating, but to the sequencer user, MIDI and sequencing are seen as practical tools that make complex multi-part composition and performance a reality. Before MIDI appeared, few people could compose a symphony (or pop song, for that matter) and ever expect to hear it performed; now almost anyone can turn their musical ideas into a performance using affordable technology. Before exploring the mechanics of sequencing, however, I’d like to tackle the idea that sequencing is somehow ‘cheating’ by looking at how things were done before the introduction of MIDI.

WHAT DID WE DO BEFORE MIDI?

Having never personally written a symphony, I can’t detail the exact stages involved, but I expect it goes something like this… 

The composer sits at his or her chosen instrument testing musical ideas, and the ones that make it are then written down on manuscript paper for the various sections of the orchestra to play. The composer visualises (or should it be auralises?) the parts already written down while adding new sections, harmonies and so forth. Then, when the music is nominally finished, the score will be scrutinised and any required alterations or adjustments will be made. 

Once the score is complete, an orchestra will be hired and given copies of the score, and the music will be played back as written by the composer. The composer, who may not even be able to perform to an acceptable standard on even one instrument, has conceived a piece of music and then written a list of instructions in the form of a musical score in order that a musically proficient orchestra can perform it. But has anyone ever accused Stravinsky or Beethoven of being cheats, because they couldn’t play all the orchestral instruments themselves? I think not. 

In contrast, let’s see how the MIDI composer writes. As with the orchestral composer, the work usually starts at the keyboard, but this time the keyboard is a MIDI instrument connected to a MIDI sequencer. Instead of writing down a score, the composer will record sections of the music into the sequencer against an electronic metronome set to the desired tempo. Instead of scanning a score to verify what’s been done, it’s a simple matter to play back the MIDI sequence to hear exactly what has been recorded. Best of all, you don’t have to hire in an orchestra – a relatively inexpensive multitimbral synthesizer will provide all the sounds for you; each ‘part’ of the multitimbral synth plays one line of your electronic score. 

In some ways, the sequencer is better than the written score, because it can play back a part exactly as you played it in the first place – it doesn’t necessarily have to ‘quantise’ everything to equal subdivisions of a musical bar, as the written score does. And, just like the written score, if you’re unhappy with something you’ve done, you don’t have to start from scratch; you just erase the unwanted notes and ‘write’ in new ones. 

When you summarise the way a musician composes using a sequencer, it isn’t really too different from the way a traditional composer works. Both types of composer are likely to edit their compositions to some degree before they’re entirely happy with them, and both bring in performers to play the finished composition. It doesn’t really matter whether the finished piece is played by a bank of synths or by a hired orchestra whose role is simply to reproduce the composer’s original work as faithfully as possible. My verdict, then is that electronic composition is as legitimate as any other form of composition. Note that I have no intention of fuelling the ‘synths versus real instruments’ debate at this point. If you have the talent to write a major symphonic work using synths, you can always get your computer to print out the score and have a real orchestra play it for you later! 

Having covered the philosophical groundwork, it’s now time to look more closely at the MIDI sequencer. 

What exactly is a sequencer? It’s often convenient to visualise a sequencer as being analogous to a multitrack tape recorder, and indeed, the ‘layers’ or parts of a sequence are recorded onto tracks, but it is vitally important to understand that what is being recorded is not the sound itself, but the electronic equivalent of a musical score. Just as a musical score is a series of instructions to the musicians, a MIDI sequence holds a series of instructions which tell your synths what to play. In some ways, a better analogy might be the player piano or pianola, where a punched paper roll holds the instructions that make the piano play, except in the case of MIDI, you have a multitrack, a virtual ‘paper roll’ capable of controlling many instruments at the same time. 

In a typical setup, a MIDI instrument (usually, but not invariably, a keyboard) is connected to a sequencer via a MIDI cable, and when the sequencer is set to record, any notes played on the keyboard are recorded as MIDI data into whichever sequencer track has been selected for recording. In a simple system, you might have 16 MIDI tracks set up so that each is on a different MIDI channel, and if you feed the MIDI output of the sequencer to a 16-part multitimbral sound module, you can play back all 16 tracks at once. If you only have an 8-part multitimbral module, then you can only play back eight different sounds at once, in the same way that a real-life string quartet can only play four lines of music at the same time. 

To avoid having to switch the MIDI send channel on your keyboard every time you want to record onto a new sequencer track, modern sequencers convert the incoming MIDI data to the appropriate channel for the track you’re recording on. This makes life very easy, because once you’ve completed one track, all you need do is select the next one and carry on playing. 

The remaining capabilities of a MIDI sequencer bear more resemblance to a word processor than anything else. Like a word processor, you can delete or replace wrong characters (in this case, musical notes) and if you want to use the same phrase more than once, you can copy it and paste copies into new locations to save having to do the same thing lots of times. For example, if a song has the same structure for each chorus, you only need play the chorus once, then copy it to any place in the sequence where you’d like another chorus to appear. 

Of course, there’s more to MIDI data than notes, and a sequencer will record just about any MIDI data you throw at it, with the exception of MIDI clock – a sequencer has its own timing clock. Nevertheless, you can synchronise a sequencer to an external source if you wish, such as a tape machine (via a suitable sync box) or to a MIDI drum machine. 

Unless you deliberately filter out certain types of MIDI data, you’ll find that your sequencer captures Note On/Off, Pitch, Velocity, Aftertouch and Controller information as well as MIDI Program Changes and even System Exclusive (SysEx) data. If these terms are unfamiliar, fear not – we’ll be looking at Controllers and Program Change information next month. There’s even less need to worry about the concept of SysEx data at this point, but it is useful to know that it is possible to record a SysEx dump of all your synth sounds at the start of a song, so that when you first play the sequence, your synths are automatically loaded up with the appropriate set of new sounds to play that particular musical sequence. 

A MIDI Program Change command recorded during the count-in period of a track will ensure that the connected synth switches to the correct sound patch before playback commences, but you can also insert Program Changes part way through a track (as many times as you like) if you want the sound to change for, say, a solo. This is the orchestral equivalent of writing a note on the score at a certain bar number to tell a violin player to put down his violin and play the next part on a flute! This isn’t something you’d usually do in real life, but a MIDI sound module is equally proficient on all instruments and, as yet, MIDI modules don’t have trade unions! 

When your sequence is played back, the sequencer transmits the MIDI information to the receiving synth(s) – or sampler, drum machine, and so on – in exactly the same order, and with the same timing as you originally played it. If you so wish, you can change the tempo after recording without affecting the pitch (unlike a tape recorder, where you’re dealing with sound rather than MIDI data). If you’re still not sure why the pitch doesn’t increase as the tempo goes up, think back to the orchestra and score analogy; if the conductor asks for a piece to be played faster, the orchestral instruments don’t change in pitch. Similarly, if you pedal a pianola faster, the paper roll will be played faster but the piano’s tuning will remain the same. 

In reality, MIDI does have a finite timing resolution, because the sequencer or computer sending the MIDI information has to work to an internal timing routine based on an electronic clock. However, in practice, MIDI is far more accurate than a typical human performer, and is capable of resolving a bar of music into at least 960 time divisions, and frequently more.
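
To put that resolution in perspective, here’s a small C++ sketch of the arithmetic involved, using the article’s figure of 960 divisions per bar; the tempo and meter chosen here are illustrative.

#include <cstdio>

// The arithmetic behind the timing-resolution claim above: a 4/4 bar split
// into 960 divisions at 120 bpm.
int main()
{
    const double bpm             = 120.0;
    const double beatsPerBar     = 4.0;
    const double divisionsPerBar = 960.0;

    const double msPerBar      = beatsPerBar * 60000.0 / bpm;   // 2000 ms per bar
    const double msPerDivision = msPerBar / divisionsPerBar;    // about 2.08 ms

    std::printf("Each division lasts %.2f ms at %.0f bpm\n", msPerDivision, bpm);
    return 0;
}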

SEQUENCER TYPES

All MIDI sequencers are based on computer technology, but you have a choice of buying a sequencer system that runs on an existing computer (such as an Atari ST, Apple Mac, Apple Power Mac, IBM-compatible PC or Commodore Amiga) or opting for a piece of dedicated hardware where everything you need is built into one box. The two types work in a similar manner – what tends to vary is the way in which the recorded information is displayed, and how easily it can be edited.

For those who are relatively accomplished players, hardware sequencers (like the Roland MC500) offer the benefits of simplicity and convenience, but they rarely have the information display capability of a full-size computer screen. And because there’s no computer mouse, editing is generally less comprehensive and more time-consuming than it would be on a computer-based system. However, the recording process is usually just a matter of hitting the Record button and playing. A significant benefit of hardware sequencers is that they are more practical in live performance situations; they are more compact and more rugged than a computer-plus-monitor, and you have fewer things to plug in.

“Has anyone ever accused Stravinsky or Beethoven of being
cheats, because they couldn’t play all the orchestral instruments
themselves? I think not.”

by Paul White, Sound on Sound

Some MIDI sequencers (including all the computer-based ones) lose all information stored in their memory (RAM) when they are switched off, so it is vital to save your work to disk at regular intervals. The MIDI information which makes up a completed song can normally be saved onto either floppy or hard disk in the form of a song data file, and a single floppy disk will hold several songs of average complexity. Many hardware sequencers also have a built-in disk drive, allowing songs to be saved as files on floppy disk, though some of the less expensive models (such as the Alesis MMT8) use battery-backed-up memory instead of disks. Once the memory is full, however, you either have to save your work to a MIDI data filer (which has a built-in disk drive) or throw away your old project before you can start a new one. Usually, this kind of sequencer can only store a few songs at a time.

The computer-based sequencer is capable of more sophistication than most hardware models, which means there may be a steeper learning curve. However, this is more than made up for in my opinion by the amount of visual feedback available, especially when it comes to tasks like song arrangement.

Unless you’re using an Atari ST, which has built-in MIDI connections, you’ll have to buy an external MIDI interface box, though some of the newer GM synth modules (such as Yamaha’s MU50) come with PC and Mac MIDI interfaces built in. MIDI interfaces for Apple Macintosh computers usually plug into the modem or printer ports on the machine, while PC users need an interface card which goes inside the computer. A basic sequencing setup is shown in Figure 1, and to keep things simple, I’ve depicted a ‘dumb’ master MIDI keyboard; if you have a MIDI keyboard that includes a sound generation section, simply select Local Off and connect it up like any other synth module.

The majority of the leading software sequencing packages have adopted the style of user interface pioneered by Steinberg in their Cubase software. This typically comprises a main screen page, which handles the basic ‘recording’ and arranging, plus a number of further pages which address various aspects of editing and, where applicable, scoring. The record and playback controls are designed to look something like a tape recorder’s transport control, and the edit pages usually allow you to examine (and change) the recorded data in several ways: (i) as a list of MIDI events; (ii) graphically, in the guise of a ‘piano roll’ display; or (iii) in the case of ‘score’ versions, in the form of a conventional musical score.

Some software sequencers include sophisticated scorewriting facilities which enable you to print out sheet music for your compositions, in which case you’ll need a printer that is compatible both with your computer and the software package. However, some musical literacy is useful, because the computer doesn’t always interpret what you play in the same way that a trained scorewriter would.

Editing

From the editing pages of a typical sequencer, you can change the value, timing, and velocity of any of the notes you’ve played. Alternatively, you can build up compositions manually, by placing new notes onto the quantise grid in non-real time, rather like writing out a manuscript. The non-real-time entry of note information may also be referred to as step-time entry.

A number of related non-destructive (i.e. the operation is not permanent and can be reversed) editing options are sometimes available, including the ability to transpose your music, either as you play or after recording. You also usually have the ability to make the music louder or softer by adjusting the overall velocity. On some systems, you can even compress the dynamic range of your MIDI data to even out the difference between your loudest notes and the quietest ones, as well as delay or advance tracks relative to each other (to make timing adjustments). This is frequently achieved by recalculating the note data during playback, but the real data isn’t changed, so you can always revert to your original performance data.
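A rough sketch of that non-destructive idea (not any particular sequencer’s code): the stored notes are never touched; a transform is applied only when the track is played back. The data structure and parameter names here are assumptions for illustration.

```python
# Non-destructive editing: the recorded notes stay intact; transposition,
# velocity scaling and track delay are applied only to the playback "view".

from dataclasses import dataclass

@dataclass(frozen=True)
class Note:
    tick: int       # start time in sequencer ticks
    pitch: int      # MIDI note number, 0-127
    velocity: int   # 1-127

def play_view(notes, transpose=0, velocity_scale=1.0, delay_ticks=0):
    """Return the notes as they would be *heard*, leaving the originals intact."""
    for n in notes:
        yield Note(
            tick=n.tick + delay_ticks,
            pitch=min(127, max(0, n.pitch + transpose)),
            velocity=min(127, max(1, round(n.velocity * velocity_scale))),
        )

recorded = [Note(0, 60, 90), Note(240, 64, 40)]
for heard in play_view(recorded, transpose=5, velocity_scale=0.8, delay_ticks=10):
    print(heard)
# 'recorded' is unchanged -- reverting simply means dropping the transform.
```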

MIDI DRUMS

It is possible to sequence the sounds from your drum machine just as you can any other type of MIDI sound module, but remember to turn off the drum machine’s external MIDI sync first, otherwise every time you start your sequencer, the drum machine’s internal patterns will start to play. Unlike a conventional instrument, where each note on the keyboard plays a different pitch of the same sound, drum machines place different sounds on different keys, allowing access to many varied drum sounds. Because it’s difficult to play a complete drum part in one go via a keyboard, it is common practice to spread the drum part over several sequencer tracks -- enabling you to record, say, your bass and snare first, your hi-hats next and finally your fills. This method of working makes it easy to edit your drum tracks later, without having to work out what note corresponds to what drum sound. And once the drum part is completed, of course, you can always merge the drum tracks into one for convenience. Most sequencers offer a suitable track merge function these days.
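To illustrate the note-to-drum mapping and the track merge idea, here is a short sketch. The note names follow the General MIDI drum map, which is an assumption -- individual drum machines may map their sounds differently.

```python
# Each MIDI note number triggers a different drum sound. Names below follow
# the General MIDI drum map (36 = kick, 38 = snare, 42 = closed hi-hat).

GM_DRUMS = {36: "Bass Drum 1", 38: "Acoustic Snare", 42: "Closed Hi-Hat"}

# Drum parts recorded on separate tracks as (tick, note) pairs ...
kick_snare = [(0, 36), (480, 38), (960, 36), (1440, 38)]
hihats     = [(0, 42), (240, 42), (480, 42), (720, 42)]

# ... and merged into one track afterwards, sorted by time.
merged = sorted(kick_snare + hihats)
for tick, note in merged:
    print(tick, GM_DRUMS.get(note, f"note {note}"))
```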

SUMMARY

MIDI sequencers are very powerful tools both for music composition and recording, and because they have grown so sophisticated, there are still a great many features that I haven’t discussed. For example, MIDI allows you to remotely control the volume of your instruments, so by recording MIDI Volume information (Controller 7) in your sequences, you can create automated mixes.
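As a small illustration of that last point (not from the original article), here is roughly what a recorded volume fade looks like as raw Control Change data; the function name and step count are just for the sketch.

```python
# An automated fade-out using MIDI Volume (Controller 7): a ramp of Control
# Change events recorded into the sequence. Byte layout is
# 0xB0 | channel, controller number, value.

def volume_fade(channel: int, start: int, end: int, steps: int):
    """Yield (step, raw CC7 message) pairs ramping from start to end (0-127)."""
    status = 0xB0 | (channel - 1)           # Control Change on channel 1-16
    for i in range(steps):
        value = round(start + (end - start) * i / (steps - 1))
        yield i, bytes([status, 7, value])  # controller 7 = channel volume

for step, msg in volume_fade(channel=1, start=100, end=0, steps=5):
    print(step, msg.hex())
```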

Wonderful though sequencers are, they are still far from perfect. Aside from the inevitable software bugs that creep in, they tend to force you to work in a way that you probably wouldn’t adopt if you were playing and composing conventionally. Most insidious is the metronome or tempo click that you have to play along to; although you can turn it off and record ‘freestyle’, regardless of bar positions, you then won’t be able to quantise your data (for an explanation of quantisation, see the box elsewhere in this article) or print out a meaningful score. It also means that tempo changes have to be planned ahead rather than made intuitively. Although software designers are now including features to help you in this area (such as re-barring), it takes a lot of determination to move away from the fixed-tempo, four-to-the-bar music that we’ve all become so accustomed to.

Despite the pitfalls mentioned, MIDI sequencing still offers far more advantages than disadvantages, and used creatively, it makes many things possible that would have been far too impractical or expensive in the pre-MIDI era. And finally, don’t think that sequencing is difficult -- once you’ve made a start and seen how easy it is to handle the basics, you’ll wonder why the manuals ever needed to be so thick!

MIDI AND SYNCHRONISATION

MIDI sync was covered in some depth in Part 1 of this series, but it is useful to recap here on the main points.

Sequencers with integral hard disk recording facilities offer a great way of combining audio with MIDI, but they still tend to be expensive and there’s also the problem of backing up very large audio data files. Because of this, most people still use multitrack tape, but there’s no advantage in recording your sequenced material to tape if you can find a way of making the sequencer run in sync with your multitrack.

The easy answer is to record some form of MIDI sync code onto tape. This means you give up one tape track to record the necessary sync code, but you gain as many ‘virtual tracks’ as your sequencer and synth/sound module collection can provide. The simplest way to achieve this is to use a ‘Smart FSK’ MIDI-to-tape sync box, which you can buy for as little as £100. These use both MIDI Clock and MIDI Song Position Pointers to ensure that your sequencer starts at the right time and remains in perfect synchronisation with your multitrack, regardless of whether you play the tape from the start of the song or from halfway through. For more on MIDI sync, FSK, and Song Position Pointers, take a look at the article ‘Synchronisation Explained’, starting on page 186 of July ’94’s SOS.
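For the curious, here is a sketch (not from the article) of the Song Position Pointer message such a sync box relies on. The bar-to-position arithmetic assumes 4/4 time; the message format itself is from the MIDI specification.

```python
# Song position is counted in MIDI beats (16th notes, i.e. 6 MIDI clocks each)
# and sent as a 14-bit value, least significant 7 bits first, after status 0xF2.

def song_position_pointer(bar: int, beats_per_bar: int = 4) -> bytes:
    """Raw SPP message for the start of a given bar (bar 1 = position 0)."""
    sixteenths = (bar - 1) * beats_per_bar * 4
    lsb = sixteenths & 0x7F
    msb = (sixteenths >> 7) & 0x7F
    return bytes([0xF2, lsb, msb])

print(song_position_pointer(33).hex())   # start playback from bar 33
```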

SEQUENCER FEATURES

What basic features can you expect from a MIDI sequencer? Obviously every sequencer is different, but all should be capable of the following core functions.

  • REAL-TIME RECORDING: You play in your MIDI data from a keyboard and record just as you would with a tape recorder. Unlike a tape recorder, you can transpose, change tempo and quantise your data after recording. If you want to use the quantise feature, you have to play to the internal metronome track when recording.
  • STEP-TIME RECORDING: Notes are played in one at a time -- it’s rather like typing a letter with one finger! You decide where the notes go and how long they’re going to be, after which you can play back your work at any tempo. Most people mainly use real-time recording, with occasional recourse to step-time when the going gets tough. With a piano-roll type of editing screen, you can also ‘draw’ your notes directly onto the quantise grid and then use the mouse pointer to ‘stretch’ them to the desired note length.
  • EDIT: A typical sequencer will have a variety of editing tools to enable you to change your composition once you have recorded it. These range from the ability to change individual notes to the ability to change entire arrangements and swap instruments.
  • QUANTISATION: This is the ability to move your notes to the nearest accurate subdivision of a bar (for example, 16ths of a bar). See ‘Quantisation’ box elsewhere in this article for more details. 
  • TRANSPOSITION: Notes can be transposed by any amount without altering their lengths, while entire compositions or sections of compositions can easily be shifted to a different key. 
  • COPY/CUT/PASTE: Any section of music can be copied to different tracks or to different locations in the song. This is useful for duplicating repeated sections, such as a chorus, or for doubling up a line of music on two tracks by assigning them to different instrument sounds. Cut allows unwanted material to be removed. 
  • ARRANGING: This may be implemented in a separate part of the program, as in C-Lab’s Notator/Creator, or it may be handled using Cut/Copy/Paste as in the case of Steinberg’s Cubase, EMAGIC’s Logic and Opcode’s Vision.
  • MUTE: Most sequencers allow you to mute tracks, or solo a track, so that you can hear it in isolation.
  • CYCLE: Simply allows you to continually loop around a specific section of music while you record or edit.
  • UNDO: Computer-based sequencers, and some hardware sequencers, have an Undo function which, as the name implies, lets you undo the last seemingly permanent thing you did.

QUANTISATION

One important feature common to both hardware and software sequencers is the ability to quantise data after recording -- a useful feature for those musicians not possessed of a perfect sense of timing. Essentially, when you choose to quantise something, the timing is changed so as to push each note you’ve recorded to the nearest exact subdivision of a bar. For example, if you are working in 4/4 time, and you select 16 as your quantise value, all the notes become locked to an invisible grid which in effect divides the bar into 16 equal time slots. Quantise must be used carefully as it can strip all the feel from some types of music; however, if you’re doing dance music where precise timing is essential, the quantise feature is indispensable.

The most recent computer-based sequencers allow you to un-quantise data as well as to quantise it, but be aware that some less advanced software sequencers and a number of hardware sequencers perform what is known as destructive quantise. So if you think you might need to go back to the original version, it’s vital that you keep a copy of the sequence.

Another feature which I find really valuable is what is usually referred to as percentage quantise. Using this, you don’t have to make all your notes snap to the quantise grid; instead, by setting a quantise value of say 50%, you can have your notes moved to a position that’s halfway between where you originally played them and the nearest time slot in the quantise grid. This is great for tightening up your playing without losing all the feel.

Yet another quantise-related function is swing, where the quantise grid is moved away from regular slots to alternating longer and shorter slots. This can be used subtly to add feel, or more aggressively to give a straight 4/4 track a pronounced shuffle feel. It’s now even possible to load in third-party groove templates (such as DNA Grooves) created from the timing of real players.
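To make the ideas in this box concrete, here is a rough Python sketch of grid quantisation with a strength percentage and a swing amount. The tick resolution, grid size and exact arithmetic are assumptions for illustration, not any particular sequencer’s algorithm.

```python
# Grid quantisation with 'percentage quantise' (strength) and swing,
# assuming 240 ticks per quarter note and a 16th-note grid.

TICKS_PER_16TH = 60   # 240 PPQN / 4

def quantise(tick: int, strength: float = 1.0, swing: float = 0.0) -> int:
    """Move a note start time towards the nearest 16th-note slot.
    strength: 0.0 = leave alone, 1.0 = snap fully to the grid.
    swing: 0.0-1.0, delays every second (off-beat) slot by up to half a slot."""
    slot = round(tick / TICKS_PER_16TH)
    target = slot * TICKS_PER_16TH
    if slot % 2 == 1:                        # off-beat slots get pushed late
        target += round(swing * TICKS_PER_16TH / 2)
    return round(tick + (target - tick) * strength)

print(quantise(214))                 # 240: snapped hard to the grid
print(quantise(214, strength=0.5))   # 227: halfway there ('50% quantise')
print(quantise(180, swing=0.5))      # 195: off-beat slot, swung late
```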

HARD OR SOFT?

Software sequencers have several obvious advantages over hardware sequencers, but that doesn’t mean that they’re better -- it all depends on what facilities you need and whether you want your sequencer to be portable.

SOFT OPTION

The main pros of software sequencers are as follows:

  •  A good visual interface. 
  • More comprehensive editing facilities. 
  • You can still use the computer for other purposes. 
  • You’re not tied to one manufacturer for software upgrades -- if somebody comes out with a better program, you can always move over to it.
  • Most computer sequencers support multiple MIDI output ports via a special multi-port MIDI interface. This means you are not restricted to 16 MIDI channels; a typical system will provide six output ports, giving you up to 96 MIDI channels to work with.
  • The most popular sequencer software packages now allow you to transfer song data from one computer platform to another and, in some cases, even from one manufacturer’s software sequencer to another’s.
  • Professional-standard score printing is available from many sequencing packages, using either an inexpensive ink-jet printer or a laser printer.

HARD OPTION

Hardware sequencers have their advantages too, the main ones being listed below: 

  • One-box solution to sequencing.
  • Generally more reliable than computers in live situations or when being moved from studio to studio. 
  • Although they may have fewer editing options than a software sequencer, they also tend to be easier to use.
  • You don’t have to learn to use a computer before you can begin to learn your sequencer software.

A basic MIDI sequencing setup starts at your keyboard -- it’s here that the MIDI information to be recorded originates. The master keyboard is connected via its MIDI Out socket to the MIDI In of your MIDI interface, or directly to the MIDI In of your hardware sequencer or Atari ST. As mentioned earlier, if your keyboard includes a synth section (in other words, if it makes sounds), then turn Local Off and patch a MIDI cable from the sequencer’s MIDI Out to the keyboard’s MIDI In.

If you have other MIDI modules in the system, you can daisy-chain them in any order by feeding the MIDI Thru of one piece of gear to the MIDI In of the next, as described last month. Up to three modules can normally be chained in this way without problems, but longer chains may cause stuck or missed notes (due to corruption of the MIDI signal), in which case you should use a multiple-output MIDI Thru box connected to the output of your sequencer, and then feed each module (or short chain of two or three modules) from separate outputs on this Thru box. MIDI Thru boxes were explained in Part 1 of this series, last month.

If you’ve connected up your system as described but no sound comes out, here are a few things you might want to check before you get into serious panic mode.

  • Check that everything is switched on and that your synth modules are set to Multi or Sequencer mode (assuming you want to use them multi-timbrally).
  • Check your MIDI cable connections, and don’t rule out the possibility of a faulty MIDI lead. Some modules have a combined MIDI Out/Thru socket; if so, ensure MIDI Thru is enabled (see the handbook for that piece of equipment). To help narrow the problem down, most sequencers have some form of indication that they’re receiving MIDI data, and many modules have a MIDI light or other indicator that blinks when data is being received.
  • Check that you’ve set the MIDI channels correctly on your modules and that Omni mode is switched off on all modules. If two or more instruments try to play the same part, the chances are you’ve either got more than one module set to the same MIDI channel or something’s been left set to Omni. If your master keyboard plays its own sounds when you’re trying to record using the sound of another module, check that Local Off is really set to Off.
  • If playing a single note results in a burst of sound, rather like machine gun fire, or if you get stuck notes or apparently reduced polyphony, the chances are you have a MIDI loop. In a MIDI loop, MIDI data generated by the master keyboard passes through the sequencer and somehow finds its way back to the input of the master keyboard, where it starts its round trip all over again, rather like acoustic feedback. This usually happens when you are using a keyboard synth as your master keyboard and have forgotten to select Local Off.

If you have one of those rare instruments with no Local Off facility, you’ll probably find that your sequencer allows you to disable the MIDI Thru on whatever channel your master keyboard is sending on (most people leave it set to channel 1).

If you are unfortunate enough to have neither facility, then all you can do is record with the MIDI In disconnected from your master keyboard and use the sounds from external modules. When you’ve finished recording, you can reconnect the master keyboard’s MIDI In, if you wish, and use it to play back one of the recorded parts or to layer with an existing synth voice.

Sound on Sound MIDI Basics, Part 1

Paul White, editor of Sound on Sound, wrote a series of articles in 1995 for newcomers to MIDI.

You might imagine that most SOS readers already have a pretty firm grasp of MIDI, but new readers are joining us every month. Furthermore, there are those amongst our existing readership who mainly record using traditional multitrack methods, and they too could benefit from a refresher course in MIDI practices. One of the problems is that many of the musicians who could reap the benefits of MIDI are frightened off by the jargon -- and there’s also the underlying suspicion that MIDI has something to do with computerising and dehumanising music. Furthermore, it’s not always clear what MIDI can actually help you achieve. But before looking at all the great things you can do with MIDI, is it true that MIDI is complicated? Technically, MIDI is quite complicated -- but then the same is true of TV, telephones, cars, and the insides of your hi-fi system. Even so, most of us take these things for granted and use them without giving a second thought to what really makes them tick. The ease of use of something doesn’t necessarily relate to the complexity of the technology that makes it work, and that’s certainly true of MIDI, because although it does provide the scope for you to do complicated things if you wish to, you can choose to approach it on your own terms and make use only of the facilities that you need.

WHAT IS MIDI?

MIDI is essentially a communications protocol or common language that enables any MIDI-equipped electronic instruments to be linked together in a musically useful way. The data that makes this possible is in digital form, hence the acronym MIDI (Musical Instrument Digital Interface). Don’t worry if you don’t know how digital data works -- it doesn’t know how you work either, but that doesn’t mean you can’t work together! MIDI-compatible instruments and other MIDI devices are connected to each other via standard MIDI cables, with 5-pin DIN plugs on either end. There are a few simple rules determining what should be plugged where, but what would really help at this stage would be to talk more about this mysterious ‘musically useful’ information that MIDI instruments send to each other. In reality, there are many types of MIDI message, and to try to grasp them all at once would probably give you a headache, so what I’m going to do is cover the essentials first (and if I have to bend the truth occasionally to keep things simple, it won’t do you any lasting damage!).

Electronic keyboard instruments are, by definition, electronic, which means that the sound is created by circuitry, not by something being hit, bowed, or blown. Whereas a piano key activates a mechanism which hits the string, the keys on an electronic MIDI instrument generate electronic signals to tell the internal circuitry what note to play and how loud to play it. When a key is depressed, a signal known as a Note On message is transmitted, and when the key is released, a Note Off message is sent. The actual key that you depress dictates which musical note will be played, and the loudness of the note depends on how hard the key is hit -- which is really the same thing as saying how fast the key is pushed down. This speed, or velocity, is read by circuitry within the keyboard and used to control the volume of the sound being played. The term ‘velocity’ is one piece of MIDI jargon that crops up quite regularly.

To recap so far, the main parameters of a musical note played from a keyboard are: which note it is, when it starts, when it stops, and how loud it is. There are other things that you can do to a note, such as bending the pitch or adding vibrato, but they’ll keep for now… If pitch, Note On, Note Off and Velocity information all exists in the form of electronic signals, it must be possible to send these signals along a piece of wire and use them to control the sound generating circuitry in another electronic instrument, and it’s precisely that concept which is at the heart of MIDI. (It might occur to you at this stage that you could send the same signals directly from a computer and cut out the middle man -- but that avenue of exploration comes later, when we look at sequencing.)

The main point to get across to new users is that MIDI is not a means of transmitting audible sounds -- it is a means of transmitting instructions or messages. A good analogy might be to compare MIDI data with a written musical score; the score only tells the performer what to play, it doesn’t have any influence over the sound of the instrument. What’s more, you could read a score written for violin and choose to play it on a piano.
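For readers who like to see the nuts and bolts, here is a byte-level sketch of the Note On and Note Off messages just described; the helper functions are invented for this illustration, but the byte layout follows the MIDI specification.

```python
# Note On and Note Off at the byte level. Middle C is note number 60;
# velocity reflects how fast the key was pushed down (1-127). Channel
# numbers 1-16 live in the low nibble of the status byte as 0-15.

def note_on(channel: int, note: int, velocity: int) -> bytes:
    return bytes([0x90 | (channel - 1), note, velocity])

def note_off(channel: int, note: int) -> bytes:
    return bytes([0x80 | (channel - 1), note, 0])

print(note_on(1, 60, 100).hex())   # '903c64' -- middle C down, fairly loud
print(note_off(1, 60).hex())       # '803c00' -- middle C released
```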

MIDI allows us to play any piece of music using any sound at our disposal. If we plug the MIDI Out of the keyboard we are physically playing (the Master keyboard) into the MIDI In socket of another MIDI instrument (the Slave), then the slave is able to play the notes as performed on the master keyboard. This simple arrangement is shown in Figure 1. Great -- but why would I want to do that? Well, when playing live, the ability to link a second instrument via MIDI means that the sounds of both instruments can be played without changing keyboards. Not so flash as wearing a gold cape and standing in front of tiered banks of Moog synths, perhaps, but far more practical. Indeed, only the master synth needs to have a keyboard at all -- the other MIDI devices can simply be sound modules, which certainly saves on space if you have to drive to a gig in a Metro, and it saves money. To understand how the control of multiple modules is possible without them all playing at once, all of the time, the concept of MIDI channels has to be introduced.

MIDI CHANNELS

In a basic MIDI system, the way the instruments are linked means they all receive the same MIDI information. In order to allow the master instrument to communicate with the slaves on a more selective basis, the MIDI Channel system was devised. There are 16 MIDI channels available, numbered 1 to 16, and they work in a very similar way to TV channels. Most people in the UK receive four TV channels (forget Sky just for now), yet all four channels arrive at the same aerial and reach the set down the same piece of wire. Which one we actually watch depends on which TV channel we select on the set. With MIDI, the information sent down the MIDI cable can be transmitted on any one of 16 channels selected on the master keyboard; similarly, the sound modules may be set to receive on any of the 16 channels. If we, for example, set the master keyboard to transmit on MIDI channel 1 and connect up three different MIDI modules set to receive on channels 1, 2, and 3, only the first module set to channel 1 will respond. The others still receive the information, but the MIDI data tells them that the information is not on their channel and so they ignore it. Of course, you can set all your modules on the same MIDI channel and have them all playing at once, if you want to. Putting it briefly, by switching channels at the master keyboard end, up to 16 different modules (set to the 16 different MIDI channels) can be addressed/played individually, even though they are all wired into the same system.
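As a concrete illustration (not from the article), here is how a receiving module decides whether a message is “on its channel”: the channel number is carried in the low four bits of the status byte, so a module simply ignores anything addressed elsewhere. The function names are invented for this sketch.

```python
# Channel voice messages (status bytes 0x80-0xEF) carry the channel in the
# low nibble; a module compares it with its own setting and ignores the rest.

def message_channel(message: bytes) -> int:
    """Return the 1-16 channel of a channel voice message."""
    return (message[0] & 0x0F) + 1

def responds(module_channel: int, message: bytes) -> bool:
    return message_channel(message) == module_channel

note_on_ch1 = bytes([0x90, 60, 100])   # Note On, channel 1
print(responds(1, note_on_ch1))        # True  -- this module plays the note
print(responds(2, note_on_ch1))        # False -- received, but ignored
```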

MULTITIMBRAL APPROACH

We’ve already discovered that a MIDI sound module is essentially a MIDI instrument without a keyboard, but many current MIDI modules actually contain the equivalent of several MIDI instruments, each of which can be addressed on a different MIDI channel. These are known as multitimbral modules, but the instruments inside are not usually quite as independent as they appear; for example, some parameters may affect all the voices globally, or the sounds may all be mixed to a single stereo pair of audio output sockets. Even so, it is always possible to change the relative volume levels of the different instrument voices and to change their left/right pan positions. Why should you want a multitimbral module when you only have one pair of hands? If you’re playing live, then you probably can’t take full advantage of multitimbral modules (though you could use them to assign different sounds to different regions of your keyboard), but if you want to add a sequencer to your setup to allow you to make multitrack MIDI recordings, just one multitimbral module can provide you with a complete backing band or orchestra, including the drums. Before multitimbral sound modules appeared, you needed a different MIDI instrument for each of the parts you wanted to sequence.

On top of that, all MIDI sound modules have what is known as a ‘maximum polyphony’ -- the maximum number of notes that they can play at any one time. This being the case, if some of the MIDI channels are already playing very busy parts, you might find that trying to play yet another part on top causes some of the notes to drop out or be cut short. The bottom line here is that the more polyphony you have (64-note polyphony is typically the maximum for modern modules), the better -- especially if you’re in the habit of writing complex pieces of music where lots of sustained notes overlap.

Drum machines may also be used as MIDI modules, even though they have their own built-in rhythm sequencers. It is possible to access their sounds externally over MIDI, each drum sound being ‘mapped’ to a different note on the keyboard. Some MIDI drum modules, such as the Alesis D4, are specifically designed with no internal sequencing capability, just sounds.

MIDI CONNECTIONS

Most MIDI instruments have three MIDI sockets, labelled In, Out, and Thru, though some older models may not have all three. The master instrument always sends information from its MIDI Out socket, which must be connected to the MIDI In socket of one of the slaves. The MIDI Thru of the slave is then connected to the MIDI In of the next slave and its Thru connected to the In of the next one, and so on… What we end up with is a daisy-chain and, in theory, this can be indefinitely long. Not so in practice, however, because the MIDI signal deteriorates slightly as it passes through each successive instrument. After passing through three or four instruments, the MIDI messages may start to become unreliable, resulting in notes which stick on or refuse to play at all. A better way to interconnect multiple instruments, in anything other than the smallest MIDI system, is to use a so-called MIDI Thru box. This takes the Out from the master keyboard and splits it into several Thru connections, which then feed the individual modules directly. Figure 2 shows the standard method of daisy-chaining, followed by the same system using a MIDI Thru box instead. In practice, many people use a combination of MIDI Thru boxes and short daisy-chains of instruments. The MIDI Outs of the slave units are normally unused during performance, but they are useful when you want to hook up your keyboard to a MIDI sound editor or librarian program, running on a MIDI-equipped computer.

PROGRAM CHANGE

So far, I’ve explained that MIDI operates on 16 channels and can be used to send note information from a MIDI-compatible master instrument to a MIDI-compatible slave, but there’s a lot more useful information that you can send over MIDI. Today’s synthesizers are programmable, and they have memory banks full of sounds (often called ‘patches’) from which you can choose. MIDI provides direct access for up to 128 patches, sometimes numbered from 0 to 127 and sometimes from 1 to 128. The buttons that are used to select the patches on the master keyboard also enable Program Change information to be transmitted to the slave synthesizer modules, so now we can play the modules remotely and we can select the sound or patch that they will play. These Program Change messages may also be used to switch to different effects patches on a MIDI effects unit that responds to MIDI Program Changes (most units do). In the case of a MIDI instrument that offers more than 128 sounds, the likelihood is that these sounds will be organised into banks, each bank containing no more than 128 sounds. The MIDI protocol now includes the facility to switch from one bank to another, though some older instruments have non-standard bank change systems which are usually explained in their respective operation manuals.
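For illustration (not from the article), here is how the standard bank-switching mechanism looks on the wire: a Bank Select controller pair followed by a Program Change. As the paragraph above notes, how (or whether) a given instrument honours the bank bytes varies, so treat this as a sketch.

```python
# Selecting a sound beyond the first 128 patches: Bank Select MSB (CC#0) and
# LSB (CC#32), followed by a Program Change with the patch number 0-127.

def select_patch(channel: int, bank: int, program: int) -> bytes:
    ch = channel - 1
    msb, lsb = (bank >> 7) & 0x7F, bank & 0x7F
    return bytes([
        0xB0 | ch, 0,  msb,      # Bank Select MSB
        0xB0 | ch, 32, lsb,      # Bank Select LSB
        0xC0 | ch, program,      # Program Change (0-127)
    ])

print(select_patch(channel=10, bank=1, program=25).hex())
```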

PERFORMANCE CONTROL

A typical MIDI synthesizer has two control wheels mounted to the left of the keyboard, and though these are often assignable to allow them to control various different effects, one is generally used to control pitch bend while the other controls vibrato depth. These controls work by generating electronic signals which, in turn, control the circuitry that creates the sound. And, like note information, control information may also be transmitted down a MIDI cable -- simply move the control wheel on the master, and the slave instrument will respond.

Time to introduce a possible pitfall. MIDI instruments can often be ‘scaled’ so that, for example, the maximum travel of the pitch bend wheel might cause a pitch shift of as little as one semitone or as much as an octave. It is important to ensure that any instruments likely to play at the same time are set with the same scaling values, especially for pitch bend, otherwise when you try to bend a note on the master keyboard, the sound coming from the master keyboard might go up by a third and the sound from the slave by a fourth -- clearly not desirable. Similarly, when you’re working with a sequencer, it makes sense to set up your instruments so that they all have the same pitch bend range.

Another useful MIDI controller is master volume -- most modern instruments respond to it, while some older ones do not. On an instrument that transmits master volume information, turning up the master volume slider will send the appropriate control information over MIDI, and the receiving slave instrument (providing it understands master volume) will respond to it. In fact, when you get into MIDI you’ll find that there’s a whole list of controllers that can be used to add expression to your performance, including sustain pedals, vector joysticks, sostenuto, and so on. You’ll find a list of the controllers to which your MIDI instruments can respond in their respective manuals, and you’ll notice that the controllers are divided into two types: switch controllers, which are either on or off, and continuous controllers, which allow something to be varied. For example, a sustain pedal is a simple on/off switch, but a volume control is a continuous controller. Because of the structure of MIDI data, you’ll find that the range of any MIDI parameter or controller usually runs from 0 to 127 -- in other words, a continuous controller really provides you with 128 small (but separate) steps.
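A byte-level sketch (not from the article) of the difference between a switch controller, a continuous controller, and pitch bend, which uses a 14-bit value for extra resolution; the helper names are invented for this illustration.

```python
# Pitch bend carries a 14-bit value (0-16383, centre 8192) split across two
# data bytes; controllers carry a single 0-127 value.

def pitch_bend(channel: int, value: int) -> bytes:
    """value: 0-16383, where 8192 means 'no bend'."""
    return bytes([0xE0 | (channel - 1), value & 0x7F, (value >> 7) & 0x7F])

def sustain_pedal(channel: int, down: bool) -> bytes:
    """CC#64 is a switch: 0-63 = off, 64-127 = on."""
    return bytes([0xB0 | (channel - 1), 64, 127 if down else 0])

def channel_volume(channel: int, level: int) -> bytes:
    """CC#7 is continuous: 128 separate steps, 0-127."""
    return bytes([0xB0 | (channel - 1), 7, level])

print(pitch_bend(1, 8192).hex())      # centred -- no bend
print(sustain_pedal(1, True).hex())   # pedal down
print(channel_volume(1, 100).hex())
```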

MORE ON MIDI

So far I’ve only touched on the basics of MIDI, and much of what MIDI can do has been left unsaid for the time being. Even so, with what you’ve learned so far, you should be able to start putting MIDI into practice. However, there is time to introduce just one more concept, and that’s the idea of MIDI Clock. Some MIDI instruments, like drum machines, have a built-in sequencer which allows drum patterns to be set up and played back at different tempos. Such instruments both send and receive MIDI Clock data, a series of electronic timing markers which go down the MIDI lead along with the other data. Think of it as the electronic equivalent of the sprocket holes at the edge of a cine film and you’ll soon grasp the idea. MIDI Clock is very useful as it allows us to synchronise two or more MIDI devices together. For example, a drum machine could be slaved to a second drum machine so that both play together, allowing you to use sounds from both machines. And as we shall see later, MIDI Clock is what allows us to synchronise sequencers and drum machines together or to sync sequencers to tape recorders. Also associated with MIDI synchronisation are commands for starting and stopping things like drum machines and sequencers, and these are known as MIDI Real Time messages (see box). There’ll be more about MIDI Clock when we delve into the basics of sequencing next month.
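The arithmetic behind MIDI Clock is simple enough to show directly. This is a sketch for illustration; the 24-clocks-per-quarter-note figure comes from the MIDI specification, not from anything specific to the gear discussed above.

```python
# 24 clock bytes (0xF8) are sent per quarter note, so the interval between
# clocks follows directly from the master's tempo, and a slave can derive
# its tempo from that interval.

CLOCKS_PER_QUARTER = 24

def clock_interval_seconds(bpm: float) -> float:
    return 60.0 / (bpm * CLOCKS_PER_QUARTER)

def bpm_from_interval(seconds: float) -> float:
    return 60.0 / (seconds * CLOCKS_PER_QUARTER)

print(round(clock_interval_seconds(120) * 1000, 2))  # 20.83 ms between clocks
print(round(bpm_from_interval(0.025), 1))            # 100.0 BPM
```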

MIDI MODES

Most MIDI instruments can be set to receive on any of the 16 MIDI channels, but there is also a setting called Omni mode, which allows the unit to respond to all incoming data, regardless of its channel. Some MIDI equipment, especially older models, tends to default to Omni mode when first switched on. Although this is a trifle tedious, it isn’t really a problem so long as you remember to reset the instrument to the desired MIDI channel before you continue. If the instrument is set to receive on separate MIDI channels, then it is said to be in Omni Off mode. It is also possible to tell an instrument whether to play polyphonically or monophonically, and while polyphonic operation is by far the most common requirement, mono operation has certain advantages, not least for guitar synth users. In Mono mode, a single polyphonic synth can be made to behave as several single-voice synths, each voice being on a different MIDI channel. If you have a MIDI guitar, it makes sense to set up the system so that each guitar string controls its own single synthesizer voice on its own MIDI channel. Not only does that make the note allocation mirror that of the guitar (where each string can only be played monophonically), but it also allows independent amounts of pitch bend to be added to each string. The four possible combinations of Omni On/Off and Poly/Mono operation form the four modes of MIDI operation and are defined as follows:

Mode 1: Omni On/Poly 

Mode 2: Omni On/Mono 

Mode 3: Omni Off/Poly 

Mode 4: Omni Off/Mono 

Most of the time, players using keyboards will use Mode 3, which is the default mode for the majority of MIDI instruments. In Mode 3, the instrument works polyphonically and responds only to notes sent on its chosen MIDI channel (or channels, in the case of a multitimbral instrument). Mode 2 is the least useful mode -- indeed, I’ve never met anyone who’s found any use for it at all! Stories abound that it crept into the MIDI specification as the result of a misunderstanding, so if your synth doesn’t support Mode 2, don’t feel you’re missing out.

MIDI JARGON BUSTER

  • MIDI -- Musical Instrument Digital Interface.
  • MIDI Clock -- Series of tempo-related electronic timing markers embedded in the MIDI data stream.
  • Note On -- MIDI message sent when a note is played (key pressed).
  • Note Off -- MIDI message sent when a key is released.
  • MIDI Module -- Sound-generating device with no integral keyboard.
  • Multitimbral Module -- MIDI sound source capable of producing several different sounds at the same time, controlled on different MIDI channels.
  • MIDI Channels -- The 16 channels over which MIDI information can be sent.
  • MIDI Mode -- MIDI information can be interpreted by the receiving MIDI instrument in a number of ways, the most common being polyphonically on a single MIDI channel (Poly/Omni Off mode). Omni mode enables a MIDI instrument to play all incoming data regardless of channel setting.
  • MIDI Program Change -- Type of MIDI message used to change sound patches on a remote module or the effects patch on a MIDI effects unit.
  • MIDI Controller -- MIDI message sent in response to movement of certain physical controls on the master keyboard (or other master MIDI instrument).
  • MIDI Thru Box -- Device which splits the MIDI Out signal of a master instrument or sequencer to avoid daisy-chaining.
  • MIDI In -- Socket used to receive information from a master controller or from the MIDI Thru socket of a slave unit.
  • MIDI Out -- Socket on a master controller or sequencer used to send MIDI information to the slave units.
  • MIDI Thru -- Socket on a slave unit used to feed the MIDI In socket of the next unit in line.

REAL TIME MESSAGES

Before MIDI arrived on the scene in 1982/83, attempts were made to provide tempo-related clock systems to allow devices from different manufacturers to be synchronised together, but quite often they used different numbers of clocks per bar, which meant some form of convertor box had to be employed. MIDI uses 96 clock pulses per 4-beat bar (or ‘whole note’, as the Americans like to call it), so any piece of MIDI gear that can send or read tempo information will sync to any other. If the tempo of the master machine is speeded up, its MIDI Clock rate speeds up accordingly, so the slave tempo is forced to follow. Even when the master machine is not playing, it is still sending out MIDI Clock data at the current tempo, which means that any connected slave device knows exactly what tempo to take off at when it receives a Start command. The Stop command will cause both the master and slave machines to stop running, and a further command, Continue, allows the machines to continue playing from wherever in the song they were stopped. Start always causes the master and slave to start from the beginning of the song. If you’re wondering how the machines know whether they’re supposed to be the master or a slave, it’s because they can all be switched for internal sync (master) or external MIDI sync (slave) operation. Any machine switched to external MIDI sync can be used as a slave. As with MIDI note information, the MIDI connection runs from the master’s MIDI Out to the slave’s MIDI In.
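Here is a small sketch (not from the article) of how a slaved device might react to those Real Time bytes; the class and its internals are invented for illustration, but the byte values and the Start/Continue/Stop behaviour follow the description above.

```python
# System Real Time bytes: Start (0xFA) rewinds and runs, Continue (0xFB)
# resumes from wherever the song stopped, Stop (0xFC) halts, and each
# Clock (0xF8) advances time (24 clocks per quarter note).

START, CONTINUE, STOP, CLOCK = 0xFA, 0xFB, 0xFC, 0xF8

class SlaveTransport:
    def __init__(self):
        self.running = False
        self.clock_count = 0

    def receive(self, byte: int):
        if byte == START:
            self.clock_count = 0      # Start always means 'from the top'
            self.running = True
        elif byte == CONTINUE:
            self.running = True       # resume from the current position
        elif byte == STOP:
            self.running = False
        elif byte == CLOCK and self.running:
            self.clock_count += 1

slave = SlaveTransport()
for b in [START, CLOCK, CLOCK, STOP, CONTINUE, CLOCK]:
    slave.receive(b)
print(slave.running, slave.clock_count)   # True 3
```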

Basics of USB-MIDI


USB and MIDI

 

MIDI has stayed relevant for over 30 years by adapting to the different ways that computers send information to and from external devices. MIDI can now be sent over 5 Pin DIN, Serial Ports, USB, Firewire, Ethernet, Bluetooth and more. But currently the most prevalent way to connect to computers, tablets and smartphones is USB. This article will cover the basics of USB-MIDI.

 

Why USB came about

 

In the early 1990s, there were far too many types of connectors on computers. There were separate serial ports, parallel ports, keyboard and mouse connections, and joystick ports. It was hard for people to tell whether the peripheral they were buying would actually work with their computer. So Compaq, Intel, Microsoft and NEC (joined later by Hewlett-Packard, Lucent and Philips) formed the USB Implementers Forum, Inc., a non-profit corporation, to publish the specifications and organise further development of USB. Similar to the MIDI Manufacturers Association, the USB-IF makes sure that there is interoperability between USB devices.

 

Goals of USB

 

The USB-IF had some clear goals when first developing the USB specification:

  • Standardize connector types: There are now several different types of USB connectors, but they are all standardized by the USB-IF.
  • Hot-swappable: USB devices can be safely plugged and unplugged as needed while the computer is running, so there is no need to reboot.
  • Plug and Play: USB devices are divided into functional types (Audio, Image, Human User Interface, Mass Storage), and operating system software can automatically identify, configure, and load the appropriate device driver when a user connects a USB device.
  • High performance: USB offers low speed (1.5 Mbit/s), full speed (12 Mbit/s) and high speed (up to 480 Mbit/s) transfer rates that can support a variety of USB peripherals. USB 3.0 (SuperSpeed USB) achieves throughput of up to 5.0 Gbit/s.
  • Expandability: Up to 127 different peripheral devices may theoretically be connected to a single bus at one time.

 

USB System Architecture

 

The basic USB system architecture is actually pretty simple and consists of the following main components:

  • A Host Computer, Smartphone or Tablet
  • One or more USB Devices
  • A physical bus represented by the USB Cable that links the devices with the host 

The Universal Serial Bus is a host-controlled bus. All data transfers are initiated and controlled by the host, and USB peripherals are slaves responding to host commands. So for USB MIDI peripheral devices, you need a computer, smartphone or tablet in the system to control and initiate USB communication.

 

USB Device Classes

 

USB devices are divided into specific functional classes -- for example image, human interface devices (keyboard, mouse, joystick), mass storage, and audio. The operating system can then know what the device is designed to do and automatically load what is called a class-compliant driver for that type of device. In 1999, the USB specification for MIDI devices was developed by the USB-IF in cooperation with the MIDI Manufacturers Association, and MIDI was included in the Audio class of devices. That is why, when you connect a USB-MIDI peripheral, the OS will sometimes display a message saying that a USB Audio device has been connected. As far as USB is concerned, MIDI is an Audio Class Compliant device.

 

Class Compliant Drivers versus Manufacturer Specific Drivers

 

Class-compliant drivers are convenient because you don’t have to download any external software, but manufacturer-specific drivers often provide added functionality. Let’s use Yamaha as an example. Because data transfer on USB is much faster than 5-pin DIN, it is possible to have multiple ports of MIDI (a port is a group of 16 MIDI channels) on a single USB cable. The dedicated Yamaha USB driver provides for 8 ports of MIDI over high-speed USB, includes the names of all the devices that are compatible with the driver, and has some routing capabilities. These features are only available if you download the driver from Yamaha’s website. Also, many audio interfaces are also MIDI interfaces, with both audio and MIDI travelling over the same USB cable. So if you purchase a MIDI or audio interface, you should always check the product manual and manufacturer’s website to see if there is a dedicated USB driver for your product that provides added functionality. Even if a manufacturer-specific driver exists, when the product is connected to a device that doesn’t allow driver installation (for example, an iOS device), it will usually still work as a class-compliant USB device.

 

Types of USB MIDI connectors

 

Over the years, USB has developed and there are now a number of different cable types and USB specifications. Let’s take a look at the different connectors.

Originally, most desktop and laptop computers had the standard-sized Type A USB connector. A standard USB cable has a Type A connector on one end to connect to the host and a Type B connector on the other end to connect to the peripheral device. This is still the most common cable to connect a MIDI instrument to a computer.

USB Type A host connector

Type B USB peripheral connector

The Type A connector has a pin that supplies power to external peripherals, so you need to be careful about trying to connect two hosts via a Type A to Type A cable. This can cause serious damage to your gear, so consult the manufacturer and manual before attempting this.

 

The Type A connector is for host controllers (computers, smartphones, tablets and some digital musical instruments that act as hosts) and USB hubs. A USB hub is a device that expands a single USB port into several so that there are more ports available to connect devices to a host system. USB hubs are often built into equipment such as computers, computer keyboards, monitors, or printers. When a device has many USB ports, they all usually stem from one or two internal USB hubs rather than each port having independent USB circuitry. If you need more USB ports, there are also external hubs that you can buy. You need to check whether your USB peripherals need to be powered by USB; if they do, you may need a powered USB hub.

On many digital musical instruments you find two USB connectors -- one Type A connector labeled To Device and one Type B labeled To Host. The To Host connector is usually used to send MIDI, Audio, or both Audio and MIDI to a computer, smartphone or tablet. If your digital music product sends both MIDI and Audio over USB, you will almost certainly need a manufacturer-specific driver.

The To Device is usually used for USB Storage devices like Flash Thumb drives, but it can be used for other things depending on what the Host music product supports for device classes.

USB A-Type

 

Considered the standard and most common type of connector, A-style connectors are found on the PC or charger side of most cables. This flat, rectangular interface is held in place through friction. Durable enough for continuous connection but easy enough for users to connect and disconnect, this connector type is also available in micro variations.

 

USB B-Type

 

Type-B USBs were traditionally used with printer cables, but they’re now found on many popular models of Android smartphones and external hard drives. These USBs feature a square interface and are available as Micro-USB B, USB Mini-b (5-pin), and USB Mini-b (4-pin) variants.

 

USB C-Type

 

The newest type of connector on the market, Type-C is a one-size-fits-all solution, developed to support devices with a smaller, thinner and lighter form factor. Type-C is slim enough for a smartphone or tablet, yet robust enough for a desktop computer. It also has the advantage of a reversible plug orientation and cable direction, eliminating the guesswork about which direction the connection goes.

 

The future of USB Connectivity

 

USB Type-C is designed as a one-size-fits-all solution for data transfer and power supply on any device. Featuring a smaller connector, Type-C fits into one multi-use port to simultaneously charge devices and transfer data and also offers backward compatibility to support previous USB standards (2.0, 3.0, and 3.1).

Type-C is quickly becoming the new standard for operating systems and hardware providers; Intel’s Thunderbolt recently switched to USB Type-C ports while enabling cross compatibility with USB 3.1. The new Apple MacBooks feature a Type-C port.

The USB-IF predicts that by 2019, all laptops, tablets, mobile phones, and other consumer electronics will be equipped with USB Type-C.

In the meantime, if you have a newer computer, you may need an adapter to connect your MIDI gear to your computer.

About MIDI - Part 1: Overview

MIDI (pronounced “mid-e”) is a technology that makes creating, playing, or just learning about music easier and more rewarding. Playing a musical instrument can provide a lifetime of enjoyment and friendship. Whether your goal is to play in a band, or you just want to perform privately in your home, or you want to develop your skills as a music composer or arranger, MIDI can help.

How Does MIDI Work?

There are many different kinds of devices that use MIDI, from cell phones to digital music instruments to personal computers. The one thing all MIDI devices have in common is that they speak the “language” of MIDI. This language describes the process of playing music in much the same manner as sheet music: there are MIDI Messages that describe what notes are to be played and for how long, as well as the tempo, which instruments are to be played, and at what relative volumes.

MIDI is not audio.

MIDI is not audio. So if someone says MIDI sounds bad, they really don’t understand how MIDI works. Imagine if you took sheet music of a work by Beethoven and handed it to someone who can read music, but has never played the violin. Then you put a very cheap violin in their hands. The music would probably sound bad. Now take that same piece of sheet music and hand it to the first chair of a symphony orchestra playing a Stradivarius, and it will sound wonderful. So MIDI depends on the quality of the playback device and also on how well the description of the music fits that player.

MIDI is flexible

The fact that MIDI is a descriptive language provides tremendous flexibility. Because MIDI data is only performance instructions and not a digital version of a sound recording, it is actually possible to change the performance, whether that means changing just one note played incorrectly, or changing all of them to perform the song in an entirely new key or at a different tempo, or on different instruments.

MIDI data can be transmitted between MIDI-compatible musical instruments, or stored in a Standard MIDI File for later playback. In either case, the resulting performance will depend on how the receiving device interprets the performance instructions, just as it would in the case of a human performer reading sheet music. The ability to fix, change, add, remove, speed up or slow down any part of a musical performance is exactly why MIDI is so valuable for creating, playing and learning about music.

The Three Parts of MIDI

The original Musical Instrument Digital Interface (MIDI) specification defined a physical connector and message format for connecting devices and controlling them in “real time”. A few years later, Standard MIDI Files were developed as a storage format so performance information could be recalled at a later date. The three parts of MIDI are often just referred to as “MIDI”, even though they are distinctly different parts with different characteristics.

1. The MIDI Messages – the software protocol

The MIDI Messages specification (or “MIDI Protocol”) is the most important part of MIDI. The protocol is made up of the MIDI messages that describe the music. There are note messages that tell the MIDI devices what note to play, there are velocity messages that tell the MIDI device how loud to play the note, and there are messages to define how bright, long or short a note will be. There are Program Change messages that tell the MIDI device what instrument to play. So by studying and understanding MIDI messages, you can learn how to completely describe a piece of music digitally. Look for information about MIDI messages in the “Control” section of Resources.

2. The physical transports for MIDI

Though originally intended just for use with the MIDI DIN transport as a means to connect two keyboards, MIDI messages are now used inside computers and cell phones to generate music, and transported over any number of professional and consumer interfaces (USB, Bluetooth, FireWire, etc.) to a wide variety of MIDI-equipped devices.

There are many different Cables & Connectors that are used to transport MIDI data between devices. Look for specific information in the “Connect” section of Resources.

MIDI is not slow

The “MIDI DIN” transport causes some confusion because it has specific characteristics which some people associate as characteristics of “MIDI” — forgetting that the MIDI-DIN characteristics go away when using MIDI over other transports (and inside a computer). With computers, a High Speed Serial, USB or FireWire connection is more common. USB MIDI is significantly faster than 5-pin DIN. Each transport has its own performance characteristics that might make some difference in specific applications, but in general the transport is the least important part of MIDI, as long as it allows you to connect all the devices you want to use!

3. The file formats for MIDI files

The final part of MIDI is made up of the Standard MIDI Files (and variants), which are used to distribute music playable on MIDI players of both the hardware and software variety. All popular computer platforms can play MIDI files (*.mid) and there are thousands of web sites offering files for sale or even for free. Anyone can make a MIDI file using commercial (or free) software that is readily available, and many people do. Whether or not you like a specific MIDI file can depend on how well it was created, and how accurately your synthesizer plays the file… not all synthesizers are the same, and unless yours is similar to that of the file composer, what you hear may not be at all what he or she intended. Look in the “Create” section of Resources for information about how to create and use different MIDI file formats.
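As a concrete illustration (not part of the original text), here is a minimal sketch of creating a one-note Standard MIDI File in Python. The use of the third-party mido library, the tick resolution, and the file name are all choices made for this sketch, not anything the article prescribes.

```python
# Write a tiny Standard MIDI File: set a tempo, pick a patch, play middle C
# for one beat. Timings are delta times in ticks; 480 per quarter note here.

import mido

mid = mido.MidiFile(ticks_per_beat=480)
track = mido.MidiTrack()
mid.tracks.append(track)

track.append(mido.MetaMessage('set_tempo', tempo=mido.bpm2tempo(100), time=0))
track.append(mido.Message('program_change', program=0, time=0))        # piano
track.append(mido.Message('note_on', note=60, velocity=90, time=0))    # middle C
track.append(mido.Message('note_off', note=60, velocity=0, time=480))  # one beat later

mid.save('example.mid')   # playable in any SMF-compatible player
```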

Even More MIDI

Many people today see MIDI as a way to accomplish something, rather than as a protocol, cable, or file format. For example, many musicians will say they “use MIDI”, “compose in MIDI” or “create MIDI parts”, which means they are sequencing MIDI events for playback via a synthesizer, rather than recording the audio that the synthesizer creates.

About MIDI - Part 2: MIDI Cables & Connectors


Many different “transports” can be used for MIDI messages. The speed of the transport determines how much MIDI data can be carried, and how quickly it will be received.

5-Pin MIDI DIN

Using a 5-pin “DIN” connector, the MIDI DIN transport was developed back in 1983, so it is slow compared to common high-speed digital transports available today, like USB, FireWire, and Ethernet. But MIDI-DIN is almost always still used on a lot of MIDI-equipped devices because it adequately handles communication speed for one device. Also, if you want to connect one MIDI device to another (without a computer), MIDI cables are usually needed.
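To put “slow” in perspective, here is a back-of-the-envelope calculation. The 31,250 bits-per-second rate and 10-bits-per-byte framing come from the MIDI 1.0 electrical specification rather than from the text above, so treat the numbers as background rather than something the article states.

```python
# Rough throughput of the original 5-pin MIDI DIN transport.

BAUD = 31250            # bits per second on the wire (MIDI 1.0 spec)
BITS_PER_BYTE = 10      # start bit + 8 data bits + stop bit

bytes_per_second = BAUD / BITS_PER_BYTE       # 3125 bytes/s
note_on_per_second = bytes_per_second / 3     # a Note On message is 3 bytes
print(bytes_per_second, round(note_on_per_second))            # 3125.0 1042
print(round(3 * BITS_PER_BYTE / BAUD * 1000, 2), "ms per 3-byte message")
```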

USB and FireWire

Computers are most often equipped with USB and possibly FireWire connectors, and these are now the most common means of connecting MIDI devices to computers (using appropriate adapters). Adapters can be as simple as a short cable with USB or FireWire connectors on one end and MIDI DIN connectors on the other, or as complex as a 19 inch rack mountable processor with dozens of MIDI and Audio In and Out ports. The best part is that USB and FireWire are “plug-and-play” interfaces which means they generally configure themselves. In most cases, all you need to do is plug in your USB or FireWire MIDI interface and boot up some MIDI software and off you go.

With USB technology, devices must connect to a host (PC), so it is not possible to connect two USB MIDI devices to each other as it is with two MIDI DIN devices. (This could change sometime in the future with new versions of USB). USB-MIDI devices require a “driver” on the PC that knows how the device sends/receives MIDI messages over USB. Most devices follow a specification (“class”) that was defined by the USB-IF; Windows and Mac PCs already come with “class compliant” drivers for devices that follow the USB-IF MIDI specification. For more details, see the article on USB in the Connection area of Resources. 

Most FireWire MIDI devices also connect directly to a PC with a host device driver, and the host handles communication between FireWire MIDI devices even if they use different drivers. But FireWire also supports “peer-to-peer” connections, so MMA (along with the 1394TA) produced a specification for transport of MIDI over IEEE-1394 (FireWire), which is available for download on this site (and also part of the IEC-61883 international standard).

Ethernet & WiFi (LAN)

Many people have multiple MIDI instruments and one or more computers (or a desktop computer and a mobile device like an iPad), and would like to connect them all over a local area network (LAN). However, Ethernet and WiFi LANs do not always guarantee on-time delivery of MIDI messages, so MMA has been reluctant to endorse LANs as a recommended alternative to MIDI DIN, USB, and FireWire. That said, there are many LAN-based solutions for MIDI, the most popular being the RTP-MIDI specification which was developed at the IETF in cooperation with MMA Members and the MMA Technical Standards Board. In anticipation of increased use of LANs for audio/video in the future, MMA is also working on recommendations for transporting MIDI using new solutions like the IEEE-1722 Transport Protocol for Time-Sensitive Streams.

Bluetooth

Everything is becoming “mobile”, and music creation is no exception. There are hundreds of music-making software applications for tablets and smart phones, many of which are equipped with Bluetooth “LE” (aka “Smart”) wireless connections. Bluetooth is similar to WiFi in that it cannot always guarantee timely delivery of MIDI data, but in some devices Bluetooth takes less battery power to operate than WiFi, and in most cases it is less likely to encounter interference from other devices (because Bluetooth is designed for short-distance communication). In 2015 the MMA evaluated the performance of MIDI over Bluetooth LE and adopted a recommended practice (specification) for MIDI over Bluetooth.

Deprecated Solutions

Sound Cards

It used to be that connecting a MIDI device to a computer meant installing a “sound card” or “MIDI interface” in order to have a MIDI DIN connector on the computer. Because of space limitations, most such cards did not have actual 5-pin DIN connectors on the card, but provided a special cable with 5-pin DINs (In and Out) on one end (often connected to the “joystick port”). All such cards needed “driver” software to make the MIDI connection work, but there were a few standards that companies followed, including “MPU-401” and “SoundBlaster”. Even with those standards, however, making MIDI work could be a major task. Over a number of years the components of the typical sound card and MIDI interface (including the joystick port) became standard on the motherboard of most PCs, but this did not make configuring them any easier.

Serial, Parallel, and Joystick Ports

Before USB and FireWire, personal computers were generally equipped with serial, parallel, and (possibly) joystick ports, all of which have been used for connecting MIDI-equipped instruments (through special adapters). Though not always faster than MIDI DIN, these connectors were already available on computers, which made them an economical alternative to add-on cards, with the added benefit that they generally already worked and did not need special configuration. High-speed serial ports, such as the “mini-DIN” ports on early Macintosh computers, supported communication speeds roughly 20 times faster than MIDI DIN, which also made it possible for companies to develop and market “multiport” MIDI interfaces that allowed connecting multiple MIDI DINs to one computer. In this manner it became possible for the computer to address many different MIDI-equipped devices at the same time. More recent multiport MIDI interfaces use even faster USB or FireWire ports to connect to the computer.

Part 4: MIDI Files

Standard MIDI Files (“SMF” or *.mid files)

 

Standard MIDI Files (“SMF” or *.mid files) are a popular source of music on the web, and for musicians performing in clubs who need a little extra accompaniment. The files contain all the MIDI instructions for notes, volumes, sounds, and even effects. The files are loaded into some form of ‘player’ (software or hardware), and the final sound is then produced by a sound-engine that is connected to or that forms part of the player.

One reason for the popularity of MIDI files is that, unlike digital audio files (.wav, .aiff, etc.) or even compact discs or cassettes, a MIDI file does not need to capture and store actual sounds. Instead, the MIDI file is just a list of events which describe the specific steps that a soundcard or other playback device must take to generate certain sounds. As a result, MIDI files are very much smaller than digital audio files, and the events are also editable, allowing the music to be rearranged, edited, even composed interactively, if desired.

All popular computer platforms can play MIDI files (*.mid) and there are thousands of web sites offering files for sale or even for free. Anyone can make a MIDI file using commercial (or free) software that is readily available, and many people do, with a wide variety of results.

Whether or not you like a specific MIDI file can depend on how well it was created, and how accurately your synthesizer plays the file… not all synthesizers are the same, and unless yours is similar to that of the file composer, what you hear may not be at all what he or she intended. General MIDI (GM) and GM2 both help address the issue of predictable playback from MIDI Files.

Formats 

The Standard MIDI File format is different from the native MIDI protocol because the events are time-stamped for playback in the proper sequence.

Standard MIDI Files come in two basic varieties: a Type 1 file, and a Type 0 file (a Type 2 was also specified originally but never really caught on, so we won’t spend any time discussing it here). In a Type 1 file individual parts are saved on different tracks within the sequence. In a Type 0 file everything is merged into a single track.  
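
As an illustration of what “time-stamped events” means in practice, here is a minimal sketch (Python, standard library only) that writes a Type 0 file containing a single track with one note. The file name and the 480-tick resolution are arbitrary choices for the example:

import struct

def vlq(value):
    """Encode a delta-time as a MIDI variable-length quantity."""
    out = [value & 0x7F]
    value >>= 7
    while value:
        out.append(0x80 | (value & 0x7F))
        value >>= 7
    return bytes(reversed(out))

TICKS_PER_QUARTER = 480

# One middle C, a quarter note long, on Channel 1 (status bytes 0x90 / 0x80).
events = (
    vlq(0)                 + bytes([0x90, 60, 100]) +   # Note On, key 60, velocity 100
    vlq(TICKS_PER_QUARTER) + bytes([0x80, 60, 0])   +   # Note Off one quarter note later
    vlq(0)                 + bytes([0xFF, 0x2F, 0x00])  # End-of-Track meta event
)

header = b"MThd" + struct.pack(">IHHH", 6, 0, 1, TICKS_PER_QUARTER)   # Type 0, 1 track
track  = b"MTrk" + struct.pack(">I", len(events)) + events

with open("one_note.mid", "wb") as f:
    f.write(header + track)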

Making SMFs

Musical performances are not usually created as SMFs; rather, a composition is recorded using a sequencer such as Digital Performer, Cubase, Sonar, etc., which saves MIDI data in its own format. However, most if not all sequencers offer a ‘Save As’ or ‘Export’ option for saving the composition as a Standard MIDI File.

Compositions in SMF format can be created and played back using most DAW software (Cubase, Logic, Sonar, Performer, FL Studio, Ableton Live, GarageBand (Type 1 SMF), and so on) and other MIDI software applications. Many hardware products (digital pianos, synths, and workstations) can also create and play back SMF files. Check the manual of the MIDI products you own to find out about their SMF capabilities.

Setup Data

An SMF not only contains regular MIDI performance data (Channelized notes, lengths, pitch bend data, etc.); it also should have data (commonly referred to as a ‘header’) that contains additional set-up information (tempo, instrument selections per Channel, controller settings, etc.) as well as song information (copyright notices, composer, etc.).

How good (or how true to its originally created state) an SMF will sound can depend a lot on the header information. The header can exert control over the mix, effects, and even sound editing parameters in order to minimize inherent differences between one soundset and another. There is no standard set of data that you have to put in a header (indeed, such data can also be placed in a spare ‘set-up’ bar in the body of the file itself), but generally speaking, the more information you provide for the receiving sound device, the more defined (and so, presumably, the more to your taste) the results will be.

Depending upon the application you are using to create the file in the first place, header information may automatically be saved from within parameters set in the application, or may need to be manually placed in a ‘set-up’ bar before the music data commences.

Information that should be considered (per MIDI Channel) includes:

  • Bank Select (0=GM) / Program Change #
  • Reset All Controllers (not all devices may recognize this command so you may prefer to zero out or reset individual controllers)
  • Initial Volume (CC7) (standard level = 100)
  • Expression (CC11) (initial level set to 127)
  • Hold pedal (0 = off)
  • Pan (Center = 64)
  • Modulation (0)
  • Pitch bend range
  • Reverb (0 = off)
  • Chorus level (0 = off)

All files should also begin with a GM/GS/XG Reset message (if appropriate) and any other System Exclusive data that might be necessary to setup the target synthesizer. If RPNs or more detailed controller messages are being employed in the file these should also be reset or normalized in the header.

If you are inputting header data yourself, it is advisable not to clump all such information together, but rather to space it out in intervals of 5-10 ticks; a sketch of such a setup bar appears below. Certainly if a file is designed to be looped, having too much data play simultaneously will cause most playback devices to ‘choke’ and throw off your timing.
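
Here is a sketch of what such a setup bar might look like as raw Channel messages for MIDI Channel 1, using the General MIDI controller-number conventions and spacing the events 8 ticks apart (both the values and the spacing are illustrative, not prescriptive):

# A sketch of the kind of setup bar described above, for MIDI Channel 1
# (status byte 0xB0 = Control Change, 0xC0 = Program Change on Channel 1).
# The controller numbers follow General MIDI conventions; the 8-tick spacing
# is just an illustrative choice to keep events from playing simultaneously.
SPACING = 8  # ticks between setup events

setup = [
    (0xB0, 0, 0),     # Bank Select MSB = 0 (GM)
    (0xB0, 32, 0),    # Bank Select LSB = 0
    (0xC0, 0),        # Program Change: program 1 (Acoustic Grand Piano in GM)
    (0xB0, 121, 0),   # Reset All Controllers
    (0xB0, 7, 100),   # Channel Volume = 100
    (0xB0, 11, 127),  # Expression = 127
    (0xB0, 64, 0),    # Hold (sustain) pedal off
    (0xB0, 10, 64),   # Pan centered
    (0xB0, 1, 0),     # Modulation = 0
    (0xB0, 91, 0),    # Reverb send = 0
    (0xB0, 93, 0),    # Chorus send = 0
]

timed_events = [(i * SPACING, bytes(msg)) for i, msg in enumerate(setup)]
for tick, msg in timed_events:
    print(tick, msg.hex(" "))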

Download The Complete MIDI 1.0 Detailed Specification, which includes the Standard MIDI File details.



Part 3: MIDI Messages

The MIDI Message specification (or “MIDI Protocol”) is probably the most important part of MIDI.

MIDI is a music description language in digital (binary) form. It was designed for use with keyboard-based musical instruments, so the message structure is oriented to performance events, such as picking a note and then striking it, or setting typical parameters available on electronic keyboards. For example, to sound a note in MIDI you send a “Note On” message, and then assign that note a “velocity”, which determines how loud it plays relative to other notes. You can also adjust the overall loudness of all the notes with a “Channel Volume” message. Other MIDI messages include selecting which instrument sounds to use, stereo panning, and more.

The first specification (1983) did not define every possible “word” that can be spoken in MIDI, nor did it define every musical instruction that might be desired in an electronic performance. So over the past 20 or more years, companies have enhanced the original MIDI specification by defining additional performance control messages and creating companion specifications, which include:

  • MIDI Machine Control
  • MIDI Show Control
  • MIDI Time Code
  • General MIDI
  • Downloadable Sounds
  • Scalable Polyphony MIDI

Alternate Applications 

MIDI Machine Control and MIDI Show Control are interesting extensions because instead of addressing musical instruments they address studio recording equipment (tape decks, etc.) and theatrical control (lights, smoke machines, etc.).

MIDI is also being used for control of devices where standard messages have not been defined by MMA, such as with audio mixing console automation.

Different Kinds of MIDI Messages

A MIDI message is made up of an eight-bit status byte which is generally followed by one or two data bytes. There are a number of different types of MIDI messages. At the highest level, MIDI messages are classified as being either Channel Messages or System Messages. Channel messages are those which apply to a specific Channel, and the Channel number is included in the status byte for these messages. System messages are not Channel specific, and no Channel number is indicated in their status bytes.

Channel Messages may be further classified as being either Channel Voice Messages, or Mode Messages. Channel Voice Messages carry musical performance data, and these messages comprise most of the traffic in a typical MIDI data stream. Channel Mode messages affect the way a receiving instrument will respond to the Channel Voice messages.
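
A small Python sketch shows how this classification falls out of the status byte itself (the message names follow the MIDI 1.0 specification; the helper function is just for illustration):

def describe_status(status: int) -> str:
    """Classify a MIDI status byte as a Channel or System message (a sketch)."""
    if status < 0x80:
        return "not a status byte (data bytes have the top bit clear)"
    if status >= 0xF0:
        return "System message (no Channel number in the status byte)"
    kind = {
        0x80: "Note Off", 0x90: "Note On", 0xA0: "Polyphonic Key Pressure",
        0xB0: "Control Change", 0xC0: "Program Change",
        0xD0: "Channel Pressure", 0xE0: "Pitch Bend Change",
    }[status & 0xF0]
    channel = (status & 0x0F) + 1   # channels are numbered 1-16 in MIDI parlance
    return f"{kind} on Channel {channel}"

print(describe_status(0x92))   # Note On on Channel 3
print(describe_status(0xF8))   # System message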

Channel Voice Messages

Channel Voice Messages are used to send musical performance information. The messages in this category are the Note On, Note Off, Polyphonic Key Pressure, Channel Pressure, Pitch Bend Change, Program Change, and the Control Change messages.

Note On / Note Off / Velocity

In MIDI systems, the activation of a particular note and the release of the same note are considered as two separate events. When a key is pressed on a MIDI keyboard instrument or MIDI keyboard controller, the keyboard sends a Note On message on the MIDI OUT port. The keyboard may be set to transmit on any one of the sixteen logical MIDI channels, and the status byte for the Note On message will indicate the selected Channel number. The Note On status byte is followed by two data bytes, which specify key number (indicating which key was pressed) and velocity (how hard the key was pressed).

The key number is used in the receiving synthesizer to select which note should be played, and the velocity is normally used to control the amplitude of the note. When the key is released, the keyboard instrument or controller will send a Note Off message. The Note Off message also includes data bytes for the key number and for the velocity with which the key was released. The Note Off velocity information is normally ignored.
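
For example, pressing and then releasing middle C on Channel 1 produces two three-byte messages (a sketch; the key number and velocity values are arbitrary):

# What a key press and release look like on the wire, for Channel 1.
NOTE_ON, NOTE_OFF = 0x90, 0x80   # Channel 1 status bytes
key, velocity = 60, 112          # middle C, struck fairly hard

press   = bytes([NOTE_ON,  key, velocity])
release = bytes([NOTE_OFF, key, 0])   # release velocity, commonly ignored

print(press.hex(" "), release.hex(" "))   # "90 3c 70" and "80 3c 00"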

Aftertouch

Some MIDI keyboard instruments have the ability to sense the amount of pressure which is being applied to the keys while they are depressed. This pressure information, commonly called “aftertouch”, may be used to control some aspects of the sound produced by the synthesizer (vibrato, for example). If the keyboard has a pressure sensor for each key, then the resulting “polyphonic aftertouch” information would be sent in the form of Polyphonic Key Pressure messages. These messages include separate data bytes for key number and pressure amount. It is currently more common for keyboard instruments to sense only a single pressure level for the entire keyboard. This “Channel aftertouch” information is sent using the Channel Pressure message, which needs only one data byte to specify the pressure value.
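
The two forms differ only in status byte and length, as this sketch shows (the key and pressure values are arbitrary):

# The two flavors of aftertouch described above, for Channel 1.
poly_pressure    = bytes([0xA0, 60, 80])   # Polyphonic Key Pressure: key 60, pressure 80
channel_pressure = bytes([0xD0, 80])       # Channel Pressure: one value for the whole keyboard

print(poly_pressure.hex(" "), channel_pressure.hex(" "))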

Pitch Bend

The Pitch Bend Change message is normally sent from a keyboard instrument in response to changes in position of the pitch bend wheel. The pitch bend information is used to modify the pitch of sounds being played on a given Channel. The Pitch Bend message includes two data bytes to specify the pitch bend value. Two bytes are required to allow fine enough resolution to make pitch changes resulting from movement of the pitch bend wheel seem to occur in a continuous manner rather than in steps.
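
A sketch of how the two data bytes combine into a single 14-bit value (0-16383, with 8192 meaning “no bend”); the helper function name is just for illustration:

def pitch_bend(channel: int, value: int) -> bytes:
    """Encode a 14-bit pitch bend value (0-16383, 8192 = wheel centered) -- a sketch."""
    lsb = value & 0x7F          # low 7 bits
    msb = (value >> 7) & 0x7F   # high 7 bits
    return bytes([0xE0 | (channel - 1), lsb, msb])

print(pitch_bend(1, 8192).hex(" "))    # e0 00 40 : wheel centered
print(pitch_bend(1, 16383).hex(" "))   # e0 7f 7f : bent fully up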

Program Change

The Program Change message is used to specify the type of instrument which should be used to play sounds on a given Channel. This message needs only one data byte which specifies the new program number.

Control Change

MIDI Control Change messages are used to control a wide variety of functions in a synthesizer. Control Change messages, like other MIDI Channel messages, should only affect the Channel number indicated in the status byte. The Control Change status byte is followed by one data byte indicating the “controller number”, and a second byte which specifies the “control value”. The controller number identifies which function of the synthesizer is to be controlled by the message. A complete list of assigned controllers is found in the MIDI 1.0 Detailed Specification.

– Bank Select

Controller number zero (with 32 as the LSB) is defined as Bank Select. The bank select function is used in some synthesizers in conjunction with the MIDI Program Change message to expand the number of different instrument sounds which may be specified (the Program Change message alone allows selection of one of 128 possible program numbers). The additional sounds are selected by preceding the Program Change message with a Control Change message which specifies a new value for Controller zero and Controller 32, allowing 16,384 banks of 128 sounds each.

Since the MIDI specification does not describe the manner in which a synthesizer’s banks are to be mapped to Bank Select messages, there is no standard way for a Bank Select message to select a specific synthesizer bank. Some manufacturers, such as Roland (with “GS”) and Yamaha (with “XG”), have adopted their own practices to assure some standardization within their own product lines.
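
As a sketch, selecting a sound on Channel 1 with Bank Select plus Program Change looks like this (the bank and program numbers are arbitrary, and how a given synth interprets the bank is manufacturer-specific, as noted above):

# Selecting a sound with Bank Select (CC 0 MSB, CC 32 LSB) plus Program Change,
# on Channel 1. The bank and program numbers are made-up examples.
bank, program = 5, 12
bank_msb, bank_lsb = (bank >> 7) & 0x7F, bank & 0x7F   # split the 14-bit bank number

select_sound = bytes([
    0xB0, 0,  bank_msb,   # Control Change: Bank Select MSB
    0xB0, 32, bank_lsb,   # Control Change: Bank Select LSB
    0xC0, program,        # Program Change
])
print(select_sound.hex(" "))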

– RPN / NRPN

Controller number 6 (Data Entry), in conjunction with Controller numbers 96 (Data Increment), 97 (Data Decrement), 98 (Non-Registered Parameter Number LSB), 99 (Non-Registered Parameter Number MSB), 100 (Registered Parameter Number LSB), and 101 (Registered Parameter Number MSB), extend the number of controllers available via MIDI. Parameter data is transferred by first selecting the parameter number to be edited using controllers 98 and 99 or 100 and 101, and then adjusting the data value for that parameter using controller number 6, 96, or 97.

RPN and NRPN are typically used to send parameter data to a synthesizer in order to edit sound patches or other data. Registered parameters are those which have been assigned some particular function by the MIDI Manufacturers Association (MMA) and the Japan MIDI Standards Committee (JMSC). For example, there are Registered Parameter numbers assigned to control pitch bend sensitivity and master tuning for a synthesizer. Non-Registered parameters have not been assigned specific functions, and may be used for different functions by different manufacturers. Here again, Roland and Yamaha, among others, have adopted their own practices to assure some standardization.
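
For example, the Registered Parameter for pitch bend sensitivity is parameter 0; here is a sketch of the message sequence for setting it to plus or minus 12 semitones on Channel 1 (CC 38 is the Data Entry LSB, and the final 127/127 pair is the customary “null” RPN that closes out the edit):

# Setting pitch bend sensitivity to +/-12 semitones using RPN 0 on Channel 1.
rpn_pitch_bend_range = bytes([
    0xB0, 101, 0,    # RPN MSB = 0
    0xB0, 100, 0,    # RPN LSB = 0  -> Registered Parameter 0: pitch bend sensitivity
    0xB0,   6, 12,   # Data Entry MSB = 12 semitones
    0xB0,  38, 0,    # Data Entry LSB = 0 cents
    0xB0, 101, 127,  # "Null" RPN to deselect the parameter
    0xB0, 100, 127,
])
print(rpn_pitch_bend_range.hex(" "))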

Channel Mode Messages

Channel Mode messages (MIDI controller numbers 121 through 127) affect the way a synthesizer responds to MIDI data. Controller number 121 is used to reset all controllers. Controller number 122 is used to enable or disable Local Control (in a MIDI synthesizer which has its own keyboard, the functions of the keyboard controller and the synthesizer can be isolated by turning Local Control off). Controller numbers 124 through 127 are used to select between Omni Mode On or Off, and to select between the Mono Mode or Poly Mode of operation.

When Omni mode is On, the synthesizer will respond to incoming MIDI data on all channels. When Omni mode is Off, the synthesizer will only respond to MIDI messages on one Channel. When Poly mode is selected, incoming Note On messages are played polyphonically. This means that when multiple Note On messages are received, each note is assigned its own voice (subject to the number of voices available in the synthesizer). The result is that multiple notes are played at the same time. When Mono mode is selected, a single voice is assigned per MIDI Channel. This means that only one note can be played on a given Channel at a given time.

Most modern MIDI synthesizers will default to Omni On/Poly mode of operation. In this mode, the synthesizer will play note messages received on any MIDI Channel, and notes received on each Channel are played polyphonically. In the Omni Off/Poly mode of operation, the synthesizer will receive on a single Channel and play the notes received on this Channel polyphonically. This mode could be useful when several synthesizers are daisy-chained using MIDI THRU. In this case each synthesizer in the chain can be set to play one part (the MIDI data on one Channel), and ignore the information related to the other parts.

Note that a MIDI instrument has one MIDI Channel which is designated as its “Basic Channel”. The Basic Channel assignment may be hard-wired, or it may be selectable. Mode messages can only be received by an instrument on the Basic Channel.
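
Since Channel Mode messages are simply Control Change messages with controller numbers 121-127, they are easy to sketch (here assuming the Basic Channel is Channel 1):

# Channel Mode messages are Control Change messages with controller numbers
# 121-127, sent on the instrument's Basic Channel (Channel 1 in this sketch).
BASIC_CHANNEL_STATUS = 0xB0   # Control Change on Channel 1

local_control_off = bytes([BASIC_CHANNEL_STATUS, 122, 0])   # detach keyboard from synth engine
omni_off          = bytes([BASIC_CHANNEL_STATUS, 124, 0])   # respond on one Channel only
poly_on           = bytes([BASIC_CHANNEL_STATUS, 127, 0])   # play notes polyphonically

for msg in (local_control_off, omni_off, poly_on):
    print(msg.hex(" "))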

System Messages

MIDI System Messages are classified as being System Common Messages, System Real Time Messages, or System Exclusive Messages. System Common messages are intended for all receivers in the system. System Real Time messages are used for synchronization between clock-based MIDI components. System Exclusive messages include a Manufacturer’s Identification (ID) code, and are used to transfer any number of data bytes in a format specified by the referenced manufacturer.

System Common Messages

The System Common Messages which are currently defined include MTC Quarter Frame, Song Select, Song Position Pointer, Tune Request, and End Of Exclusive (EOX). The MTC Quarter Frame message is part of the MIDI Time Code information used for synchronization of MIDI equipment and other equipment, such as audio or video tape machines.

The Song Select message is used with MIDI equipment, such as sequencers or drum machines, which can store and recall a number of different songs. The Song Position Pointer is used to set a sequencer to start playback of a song at some point other than at the beginning. The Song Position Pointer value is related to the number of MIDI clocks which would have elapsed between the beginning of the song and the desired point in the song. This message can only be used with equipment which recognizes MIDI System Real Time Messages (MIDI Sync).
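
Since a MIDI beat is six MIDI clocks (one 16th note at 24 clocks per quarter note), a Song Position Pointer can be sketched like this (the bar arithmetic assumes 4/4 time and is only an example):

def song_position_pointer(sixteenth_notes: int) -> bytes:
    """Song Position Pointer: a 14-bit count of MIDI beats (1 beat = 6 MIDI clocks,
    i.e. one 16th note) since the start of the song -- a sketch."""
    return bytes([0xF2, sixteenth_notes & 0x7F, (sixteenth_notes >> 7) & 0x7F])

# Cue playback to the top of bar 9 in 4/4 time: 8 bars * 16 sixteenths per bar.
print(song_position_pointer(8 * 16).hex(" "))   # f2 00 01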

The Tune Request message is generally used to request an analog synthesizer to retune its internal oscillators. This message is generally not needed with digital synthesizers.

The EOX message is used to flag the end of a System Exclusive message, which can include a variable number of data bytes.

System Real Time Messages

The MIDI System Real Time messages are used to synchronize all of the MIDI clock-based equipment within a system, such as sequencers and drum machines. Most of the System Real Time messages are normally ignored by keyboard instruments and synthesizers. To help ensure accurate timing, System Real Time messages are given priority over other messages, and these single-byte messages may occur anywhere in the data stream (a Real Time message may appear between the status byte and data byte of some other MIDI message).

The System Real Time messages are the Timing Clock, Start, Continue, Stop, Active Sensing, and the System Reset message. The Timing Clock message is the master clock which sets the tempo for playback of a sequence. The Timing Clock message is sent 24 times per quarter note. The Start, Continue, and Stop messages are used to control playback of the sequence.
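
Because the clock rate is fixed at 24 per quarter note, the time between Timing Clock bytes follows directly from the tempo; a quick back-of-envelope sketch:

# Timing Clock (0xF8) is sent 24 times per quarter note, so the interval
# between clock bytes is determined entirely by the tempo.
def seconds_per_clock(bpm: float) -> float:
    seconds_per_quarter = 60.0 / bpm
    return seconds_per_quarter / 24

print(f"{seconds_per_clock(120) * 1000:.2f} ms between clocks at 120 BPM")   # ~20.83 ms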

The Active Sensing signal is used to help eliminate “stuck notes” which may occur if a MIDI cable is disconnected during playback of a MIDI sequence. Without Active Sensing, if a cable is disconnected during playback, then some notes may be left playing indefinitely because they have been activated by a Note On message, but the corresponding Note Off message will never be received.

The System Reset message, as the name implies, is used to reset and initialize any equipment which receives the message. This message is generally not sent automatically by transmitting devices, and must be initiated manually by a user.

System Exclusive Messages

System Exclusive messages may be used to send data such as patch parameters or sample data between MIDI devices. Manufacturers of MIDI equipment may define their own formats for System Exclusive data. Manufacturers are granted unique identification (ID) numbers by the MMA or the JMSC, and the manufacturer ID number is included as part of the System Exclusive message. The manufacturer’s ID is followed by any number of data bytes, and the data transmission is terminated with the EOX message. Manufacturers are required to publish the details of their System Exclusive data formats, and other manufacturers may freely utilize these formats, provided that they do not alter or utilize the format in a way which conflicts with the original manufacturer’s specifications.

Certain System Exclusive ID numbers are reserved for special protocols. Among these are the MIDI Sample Dump Standard, which is a System Exclusive data format defined in the MIDI specification for the transmission of sample data between MIDI devices, as well as MIDI Show Control and MIDI Machine Control.
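
The general shape of a System Exclusive message can be sketched as follows (0x7D is the ID reserved for non-commercial/educational use, and the payload bytes are made up for the example):

# A System Exclusive message: 0xF0, a manufacturer ID, any number of
# 7-bit data bytes, then EOX (0xF7). 0x7D is the non-commercial/educational
# ID, used here as a safe placeholder; the payload is invented for the sketch.
manufacturer_id = 0x7D
payload = [0x01, 0x02, 0x03]          # device-specific data, each byte 0-127

sysex = bytes([0xF0, manufacturer_id, *payload, 0xF7])
print(sysex.hex(" "))                  # f0 7d 01 02 03 f7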

Why MIDI Matters

MIDI is like air: it’s all around you, and most of the time you can take it for granted, but if you are a digital musician you probably couldn’t live without it. MIDI is inside musical instruments, computers, tablets, smart phones, stage lighting, audio mixers, and many other products from well-known international companies including Apple, Gibson, Google, Korg, Microsoft, Roland, Yamaha, and hundreds more. By the fall of 2015, it’s projected that there will be around 2.6 billion MIDI-enabled devices on the planet!

MIDI is almost always an integral part of the modern recording process for popular music and the music that is made for films, TV, and even video games. MIDI-equipped electronic keyboards are connected to computers and mobile devices running Digital Audio Workstation (DAW) software that records both MIDI and audio to produce music. With advances in computing power making software-based digital instruments sound ever more realistic, the music you hear on TV, in radio ads, and in movies is often a recording of MIDI instruments (both hardware instruments and virtual “soft synth” software) rather than of dozens of acoustic instruments. Even the huge orchestral score playing behind that big-screen blockbuster almost always starts off as a MIDI mockup arrangement, and the composer then uses MIDI to transcribe the parts into music notation.

More and more DJ gear has MIDI integrated into it, and the last few years have seen a dramatic rise in the number of unique DJ MIDI controllers and “controllerists” (artists like Moldover) who use MIDI controllers to manipulate sound and loops.

Types of MIDI

MIDI technology has been around for more than 30 years, and first appeared in the form of a special 5-pin DIN connector on the back of a keyboard, used to connect to another keyboard or to a computer. Though 5-pin DIN connections are still used for making connections between standalone hardware digital instruments, as computer technology has developed and advanced over the years, so has MIDI.

Your computer, tablet, or smart phone probably has MIDI capabilities, and starting in 2015, the web browser you are using to view this page may be capable of making music with MIDI. Google’s Chrome browser (and others to follow) can access local MIDI devices (hardware and software synthesizers, external keyboards, etc.) and use them for producing music and/or for controlling objects in the browser (such as a browser-based synthesizer). Web app developers are rapidly adding MIDI support to existing apps and creating new apps that are Web-MIDI enabled.

If you own an iPad or iPhone, you probably know that there are hundreds of programs you can get for making music (even on-the-go) that use Apple’s CoreMIDI technology to connect to MIDI devices. CoreMIDI also allows an iPad/iPhone to send/receive MIDI messages via Wi-Fi, allowing the iPad/iPhone to control and/or play sounds and sequences on the Mac, and vice-versa. CoreMIDI also supports MIDI connections via USB, Ethernet, and FireWire (where equipped). Windows users can also connect to Macs and iOS devices via a third-party RTP-MIDI driver.

With Android M, Google has added robust MIDI support for the billions of Android devices that are available, making MIDI one of the most ubiquitous technologies on the planet.

Of course, MIDI is not just for keyboards… other MIDI-equipped musical instruments include digital drums, guitars, wind instruments, and more. For electronic dance music (EDM) and DJs, there are specialized controllers that use MIDI to trigger beats and loops, and to control lighting. And new kinds of digital musical instruments and controllers are being invented all the time, all of which integrate perfectly with existing instruments and devices because of MIDI.

Besides music creation, MIDI has some other interesting and popular uses. MIDI Show Control is a set of MIDI messages used for controlling lights and rides at theme parks, as well as for operating themed events such as those found outside many Las Vegas casinos. And MIDI Machine Control provides remote transport control for many kinds of audio/video recording devices. Moreover, because MIDI is widely available and free to use, many people are able to develop unique DIY products that use MIDI to control and/or generate sound… in all kinds of shapes and forms.

If you’d like to learn more about the amazingly diverse world of MIDI, this is the place: the site has videos, tutorials, forums, and stories about MIDI artists. So join the community of people who make music and art with MIDI, and learn how to get the most out of this incredibly flexible technology.

Craig Anderton’s Brief History Of MIDI

The MIDI specification first saw the light of day at the 1981 AES, when Dave Smith of Sequential Circuits presented a paper on the “Universal Synthesizer Interface.” It was co-developed with other companies (an effort driven principally by Roland’s Ikutaro Kakehashi, a true visionary of this industry), and made its prime time debut at the 1983 Los Angeles NAMM show, where a Sequential Circuits Prophet-600 talked to a Roland keyboard over a small, 5-pin cable. I saw Dave Smith walking around the show and asked him about it. “It worked!” he said, clearly elated—but I think I detected some surprise in there as well.

The Prophet 600 and the Jupiter 6 at the 1983 Winter NAMM show

“It” was the Musical Instrument Digital Interface, known as MIDI. Back in those days, polyphonic synthesizers cost thousands of dollars (and “polyphonic” meant 8 voices, if you were lucky and of course, wealthy). The hot computer was a Commodore-64, with a whopping 64 kilobytes of memory—unheard of in a consumer machine (although a few years before, an upstart recording engineer named Roger Nichols was stuffing 1MB memory boards in a CompuPro S-100 computer to sample drum sounds). The cute little Macintosh hadn’t made its debut, and as impossible as it may seem today, the PC was a second-class citizen, licking its wounds after the disastrous introduction of IBM’s PCjr.

Tom Oberheim had introduced his brilliant System, which allowed a drum machine, sequencer, and synthesizer to talk together over a fast parallel bus. Tom feared that MIDI would be too slow. And I remember talking about MIDI at a Chinese restaurant with Dave Rossum of E-mu systems, who said “Why not just use Ethernet? It’s fast, it exists, and it’s only about $10 to implement.”

But Dave Smith had something else in mind: An interface so simple, inexpensive, and foolproof to implement that no manufacturer could refuse. Its virtues would be low cost, adequate performance, and ubiquity in not just the pro market, but the consumer one as well.

Bingo.

But it didn’t look like success was assured at the time; MIDI was derided by many pros who felt it was too slow, too limited, and just a passing fancy. 30 years later, though, MIDI has gone far beyond what anyone had envisioned, particularly with respect to the studio. No one foresaw MIDI being part of just about every computer (e.g., the General MIDI instrument sets). This trend actually originated on the Atari ST—the first computer with built-in MIDI ports as a standard item (see “Background: When Amy Met MIDI” toward the end of this article).

Evolution of a spec

Oddly, the MIDI spec officially remains at version 1.0, despite significant enhancements over the years: the Standard MIDI File format, MIDI Show Control (which runs the lights and other effects at Broadway shows like Miss Saigon and Tommy), MIDI Time Code to allow MIDI data to be time-stamped with SMPTE timing information, MIDI Machine Control for integration with studio gear, microtonal tuning standards, and a lot more. And the activity continues, as issues arise such as how best to transfer MIDI over USB, with smart phones, and over wireless.

The MIDI Manufacturers Association

The guardian of the spec, the MIDI Manufacturers Association (MMA), has stayed a steady course over the past several decades, holding together a coalition of mostly competing manufacturers with a degree of success that most organizations would find impossible to pull off. The early days of MIDI were a miracle: in an industry where trade secrets are jealously guarded, manufacturers who were intense rivals came together because they realized that if MIDI was successful, it would drive the industry to greater success. And they were right. The MMA has also helped educate users about MIDI, through books and online materials such as “An Introduction to MIDI.”

I had an assignment at the time from a computer magazine to write a story about MIDI. After turning it in, I received a call from the editor. He said the article was okay, but it seemed awfully partial to MIDI, and was unfair because it didn’t give equal time to competing protocols. I tried to explain that there were no competing protocols; even companies that had other systems, like Oberheim and Roland, dropped them in favor of MIDI. The poor editor had a really hard time wrapping his head around the concept of an entire industry willingly adopting a single specification. “But surely there must be alternatives.” All I could do was keep replying, “No, MIDI is it.” Even when we got off the phone, I’m convinced he was sure I was holding back information on MIDI’s competition.

MIDI HERE, MIDI THERE, MIDI EVERYWHERE

Now MIDI is everywhere. It’s on the least expensive home keyboards, and the most sophisticated studio gear. It’s a part of signal processors, guitars, keyboards, lighting rigs, smoke machines, audio interfaces…you name it. It has gone way beyond its original idea of allowing a separation of controller and sound generator, so people didn’t have to buy a keyboard every time they wanted a different sound.

SO WHERE’S IT GOING?

“Always in motion, the future…” Well, Yoda does have a point. But the key point about MIDI is that it’s a hardware/software protocol, not just one or the other. Already, the two occasionally take separate vacations. The MIDI data in your DAW that drives a soft synth doesn’t go through opto-isolators or cables, but flies around inside your computer.

One reason why MIDI has lasted so long is because it’s a language that expresses musical parameters, and these haven’t changed much in several centuries. Notes are still notes, tempo is still tempo, and music continues to have dynamics. Songs start and end, and instruments use vibrato. As long as music is made the way it’s being made, the MIDI “language” will remain relevant, regardless of the “container” used to carry that data. However, MIDI is not resting on its laurels, and neither is the MMA—you can find out what they’re working on for the future here.

Background: When Amy Met MIDI

After MIDI took off, many people credited Atari with amazing foresight for making MIDI ports standard on their ST series of computers. But the inclusion of MIDI was actually a matter of practicality. Commodore was riding high with the C-64, in large part because of the SID (Sound Interface Device) custom IC, a very advanced audio chip for its time. (Incidentally, Bob Yannes, one of Ensoniq’s founders and also the driving force behind the Mirage sampler, played the dominant role in SID’s development.)

Atari knew that if it wanted to encroach on Commodore’s turf, it needed something better than SID. They designed an extremely ambitious sound chip, code-named Amy, that was supposed to be a “Commodore killer.” But Amy was a temperamental girl, and Atari was never able to get good enough yields to manufacture the chips economically.

An engineer suggested putting a MIDI port on the machine, so it could drive an external sound generator; then they wouldn’t have to worry about an onboard sound chip. Although this solved the immediate Amy problem, it also turned out to be a fortuitous decision: Atari dominated the European music-making market for years, and a significant chunk of the US market as well. To this day, a hardy band of musicians still use their aging ST and TT series Atari computers because of the exceptionally tight MIDI timing – a result of integrating MIDI into the core of the operating system.