
LASER Mammoth puts SysEx storage in your browser

They thought it was abandoned long ago and never would return. They were wrong. The ancient knowledge of the MIDI elders silently faded away and left them clueless and unprepared. How on earth should they have known how to deal with MIDI SysEx programs? In a world where all SysEx apps fail, one app must resist, one app must prevail: F0F7 – Codename LASER Mammoth – MIDI SysEx Librarian. F0F7 – Codename LASER Mammoth lets you reorder, rename, delete and share your hardware synthesizer programs.

It’s Sysex in the browser.
It’s the future!
Make your MIDI synth awesome again.

by LASER Mammoth





...

F0F7 – LASER Mammoth

Sysex Librarian for your Synthesizer. Reorder, rename, delete and share your sound programs.
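F0F7's own implementation isn't shown here, but the browser mechanism it relies on is the Web MIDI API with SysEx access enabled. Below is a minimal, hedged sketch of capturing incoming SysEx dumps from a connected synth; the handler and the in-memory "library" array are illustrative, not F0F7's actual code.

// Minimal sketch: capture incoming SysEx dumps in the browser.
// Assumes a browser with Web MIDI support and user permission for SysEx.
const storedDumps = []; // illustrative in-memory "library", not F0F7's storage

navigator.requestMIDIAccess({ sysex: true }).then((access) => {
  for (const input of access.inputs.values()) {
    input.onmidimessage = (msg) => {
      // SysEx messages start with 0xF0 and end with 0xF7 (hence the name "F0F7").
      if (msg.data[0] === 0xF0) {
        storedDumps.push(new Uint8Array(msg.data));
        console.log(`Stored SysEx dump, ${msg.data.length} bytes`);
      }
    };
  }
}, () => console.warn('Web MIDI access (with SysEx) was denied'));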

The Cloud Piano by David Bowen

What does the weather sound like?  

David Bowen is an artist who uses data. He has used data to point an installation of 50 twigs in unison towards the largest piece of space junk currently overhead. He used data to allow the flight of flies to control, aim and shoot a revolver. He is not a musician, yet he used MIDI Manufacturers Association member Ableton’s Max MSP software to create a Cloud Piano, a piano played by clouds.

In this installation, Bowen used Max MSP to map video to MIDI so that whenever a cloud passes over a note on the keyboard, MIDI-controlled robotic arms press the keys of an acoustic piano for as long as the clouds are over them. If more white than blue is detected, the key is pressed.

He went further and mapped several other video attributes to MIDI parameters. Tempo and rhythm follow how quickly the clouds are moving. Dynamics are controlled by brightness: the more white in the video, the higher the generated MIDI velocities and the louder the sounds are played. Even the overall feeling of the music is based on the video input. Clouds moving in one direction are processed to have a major feel, while clouds moving in the opposite direction are processed with more minor tonalities.
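Bowen's actual Max MSP patch isn't reproduced here, but the core mapping is easy to illustrate. The hedged JavaScript sketch below turns the brightness of the video region above one key into a note-on whose velocity scales with how white the region is; the threshold and scaling are assumptions, not values from the installation.

// Illustrative sketch of the "cloud plays a key" mapping, not Bowen's patch.
// brightness: 0 (all blue sky) to 1 (all white cloud) for the region above one key.
// output is a Web MIDI MIDIOutput; returns the new key-down state.
function cloudToMidi(output, midiNote, brightness, isKeyDown) {
  const threshold = 0.5;                           // assumed: "more white than blue"
  if (brightness > threshold && !isKeyDown) {
    const velocity = Math.round(brightness * 127); // brighter clouds play louder
    output.send([0x90, midiNote, velocity]);       // note on, channel 1
    return true;                                   // key is now down
  }
  if (brightness <= threshold && isKeyDown) {
    output.send([0x80, midiNote, 0]);              // note off once the cloud passes
    return false;
  }
  return isKeyDown;
}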

Of course, this installation wouldn’t work well in Southern California.  There would be too many days with no sound at all because there were no clouds in the sky.  It might not work well in Seattle in the winter, when skies are overcast for months on end. But we never cease to be amazed at the creative ways artists come up with to use MIDI. 

Here is a YouTube video from Great Big Story’s “That’s Amazing” about how the installation was put together. 

Yamaha Soundmondo-Social Sound Sharing using Web MIDI

What is Social Sound Sharing?

Yamaha originally launched the Soundmondo website and mobile app in 2015 for the reface line of keyboards.  It was one of the first major websites to utilize Web MIDI. 

Connect your reface keyboard to your computer, iPad or phone, launch Chrome as your browser, and you can browse sounds shared by other reface owners. You can also create and share your own sounds with people around the world. 

There are over 20,000 free reface sounds available online. 

“Soundmondo is to sound what photo-sharing networks are to images. It’s a great way to share your sound experiences and get inspiration from others.”

by Nate Tschetter, marketing manager, Synthesizers, Yamaha Corporation of America.

Yamaha has since expanded Soundmondo to include other Yamaha keyboards, including the Montage and MODX synthesizers and the CP88/73 stage pianos. 

So exactly how does social sound sharing work?  Well, it’s actually pretty simple. You select your instrument and then browse by tags: for example, all the sounds tagged 2000s, EDM and Piano. 

Select a sound and it is sent from the Soundmondo server to your browser, and from your browser to your keyboard, where you can play it. If the synth or stage piano can store sounds, you can store the sound locally on your keyboard.  Using the Soundmondo iOS app, you can create set lists and organize your sounds for live performance. 
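Yamaha hasn't published Soundmondo's internals here, but the browser-to-keyboard leg of that flow is plain Web MIDI: fetch the voice data (stored as SysEx) and send it to the instrument's output port. A rough sketch follows; the URL, file and port name are purely illustrative assumptions, and the fetched file is assumed to be a complete F0…F7 bulk dump.

// Hedged sketch of "server -> browser -> keyboard"; URL and port name are made up.
async function sendVoiceToKeyboard() {
  const access = await navigator.requestMIDIAccess({ sysex: true });
  const output = [...access.outputs.values()]
    .find((o) => o.name.includes('reface'));            // assumed port name
  if (!output) throw new Error('No reface output port found');

  const response = await fetch('/voices/example-voice.syx'); // hypothetical URL
  const bytes = new Uint8Array(await response.arrayBuffer());
  output.send(bytes);                                    // bulk dump goes to the synth
}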

When Yamaha launched Soundmondo compatibility for the Montage, they produced 400 MONTAGE Performances, including content from the original DX ROM Cartridges, special content from Yamaha Music Europe and 16 original Performances from legendary synthesizer sound designer Richard Devine. 

You can see Richard’s performance using the Montage and Richard’s modular setup at Super Booth 2018.

Watch the video on Facebook: https://www.facebook.com/yamahasynths/videos/1467462910048911/

 Richard will be at Super Booth again this year creating amazing sounds with  the new Yamaha MODX. 


Telemidi – Creating music over The Internet in real-time

What is Telemidi?

A system of connecting two DAW environments over the Internet, to achieve real-time musical ‘jamming’.
The product of Masters research by Matt Bray.


“…a musician’s behaviour at one location will be occurring at the other location in a near synchronous manner, and vice versa, thus allowing for a ‘jam’-like atmosphere to be mutually shared.”

Matt Bray (Telemidi creator)

Telemidi is an approach to Networked Music Performance (NMP) that enables musicians to co-create music in real-time by simultaneously exchanging MIDI data over The Internet.  Computer networking brings with it the factor of latency (a delay in data transfer), the prevalent obstacle within NMPs, especially when attempting to match the interaction of traditional performance ensembles.  Telemidi accommodates latency through numerous Latency Accepting Solutions (LAS – identified below) embedded within two linked DAW environments, equipping performers with the ability to interact in a dynamic, interactive and ongoing musical process (jamming).

This is achieved in part by employing RTP (Real-time Transport Protocol) MIDI data transfer systems to deliver performance and control information over The Internet from one IP address to another in a direct P2P (peer-to-peer) fashion.  Once it arrives at a given IP address, MIDI data is routed into the complex DAW environment to control any number of devices, surfaces, commands and performance mechanisms.  Essentially, a musician’s behaviour at one location will be occurring at the other location in a near synchronous manner, and vice versa, thus allowing for a ‘jam’-like atmosphere to be mutually shared.  As seen in the video listed below, this infrastructure can be applied to generate all manner of musical actions and genres, whereby participants readily build and exchange musical ideas to support improvising and composing (‘Comprovising’).  Telemidi is a true Telematic performance system. 


What is Telematic Performance?

Telematic music performance is a branch of Network Music Performance (NMP) and is a rapidly evolving, exciting field that brings multiple musicians and technologies into the same virtual space. Telematic performance is the transfer of data and performance information over significant distances, achieved by the explicit use of technology. The more effective the transfer, the greater the sense of Telepresence: the ability of a performer to “be” in the space of another performer.  Telematic performances first appeared when Wide Area Networking (WAN) options presented themselves for networked music ensembles via technologies such as ISDN telephony, and options increased alongside the explosion of computer processing and networking developments that gave rise to The Internet.  Unfortunately, in this global WAN environment, latency has stubbornly remained a constant and seemingly unavoidable obstruction to real-time ensemble performance.

Telematic performance has been thoroughly explored by countless academic, commercial and hobby entities over the last four decades, with limited success. The musical performances have taken many forms throughout the exponential development of computing technologies, yet have been more-or-less restricted by latency at every turn.  For example, there is the inherent latency of the CPU within any given DAW, the additional processing loads of soft/hardware devices, the size and number of data packages generated in a performance, and the delivery of this data over The Internet, which in turn presents issues regarding available bandwidth, data queuing, WiFi strength and so on. This is but one side of the engagement, as we also have the DAW requirements of the reciprocating location, and of course the need for synchronous interplay between the two. Real-time NMPs suffer at the whim of network jitter, data delays and DAW operations.


How Telemidi Works

Telemidi works by exchanging MIDI data in a duplex fashion between the IP addresses of two performers, each of whom is running a near-identical soft/hardware DAW environment.  A dovetailed MIDI channel allocation caters for their respective actions while avoiding feedback loops, in a system with the potential to deliver performance information to and from each location in near real-time (10-30 ms).
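The thesis documents the full mapping, but the feedback-avoidance idea can be sketched simply: each node only acts on the MIDI channels assigned to the remote performer and never re-broadcasts them, so data cannot loop back across the link. The channel numbers and the send interfaces below are illustrative assumptions, not the actual Telemidi allocation.

// Illustrative dovetailed channel split (not the actual Telemidi allocation):
// channels 0-5 belong to the local performer, 6-12 to the remote one.
const LOCAL_CHANNELS  = new Set([0, 1, 2, 3, 4, 5]);
const REMOTE_CHANNELS = new Set([6, 7, 8, 9, 10, 11, 12]);

// localSynth and wanLink are assumed objects exposing a send(bytes) method.
function handleIncomingWanMessage(bytes, localSynth) {
  const channel = bytes[0] & 0x0f;           // low nibble of the status byte
  if (!REMOTE_CHANNELS.has(channel)) return; // ignore anything not assigned to the peer
  localSynth.send(bytes);                    // play it locally, but never echo it back
}

function handleLocalMessage(bytes, wanLink) {
  const channel = bytes[0] & 0x0f;
  if (!LOCAL_CHANNELS.has(channel)) return;  // only our own channels go out to the WAN
  wanLink.send(bytes);
}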

To achieve this musical performance over The Internet, the Telemidi process employed:

1 – Hardware – a combination of control devices

2 – Software – two near-identical Ableton Live sets

3 – Latency Accepting Solutions (LAS) – ten examples

4 – RTP MIDI – facilitating the delivery of MIDI data over a WAN.  

Below is a summary of the items used at each node location during the Telemidi research (for more information and to download the Masters thesis, go to www.telemidi.org): 

Hardware

Below is a list of hardware used at each location in the Telemidi research:

Laptop Computers:  + Mac and Windows computers used, demonstrating Telemidi accessibility.


Novation SL Mk II MIDI controller keyboard


+ High capacity for customised MIDI routing (both control and performance data)

+ Traditional musical interface (keyboard)


Novation LaunchPad Pro


+ Native integration with Ableton Live

+ Contemporary `Grid-based’ composition process 

Software

Ableton Live

Near-identical Live sets (duplex architecture)
7 pre-composed songs (each split into four sections, A, B, C & D)
54 additional percussion loop patterns
12 x Synth Instruments (Native and 3rd-party)
Synths: 4 each of Bass/Harmony/Lead
16 DSP effects processors (with 2 or more mapped parameters)
286 interleaved MIDI mappings within each Live set
13 of 16 MIDI Channels used for shared performance and control data
Tempo variation control
Volume & start/stop control for each voice (Bass, Harmony & Melody)
Record and Loop capacity for each voice (Bass, Harmony & Melody)

LATENCY ACCEPTING SOLUTIONS (LAS):

The following processes adapt to and cumulatively overcome the obstacle of latency.  They are ranked in order of efficiency from 1 (most efficient) to 10 (least efficient), each listed with its justification.

1 – One Bar Quantisation: All pre-composed, percussive and recorded loops are set to trigger upon a one-bar quantisation routine, allowing time (2000 ms @ 120 bpm; see the timing sketch after this list) to accommodate network latency between song structure changes (most commonly occurring on a 4 to 8 bar basis).

2 – P2P (Peer-to-Peer) Network Connection: Direct delivery of MIDI data from one IP address to the other. A simple, direct delivery with no third-party ‘browser-based’ servers used to calibrate message timing.

3 – Master/Slave Relationship: One node (Alpha) was allocated the role of master and the other (Beta) the role of slave, allowing for a consistent, shared tempo and self-correcting tempo alignment following any network interference.

4 – Pulse-based Music (EDM) as the Chosen Genre: A genre without reliance on a strict scored format, relying instead on a simple and repetitive pulse.

5 – Floating Progression (manner of Comprovising ideas): Each performer initiates an idea or motif, the other responds accordingly and vice versa (jamming); any artefacts of latency only play into this process.

6 – 16th Note Record Quantize: Inbuilt Ableton function ensuring any recorded notes are quantized to the grid.

7 – MIDI Quantize: 3rd-party Max4Live device (16th note) that puts incoming WAN MIDI onto the grid of the receiving DAW.

8 – Manual Incremental Tempo Decrease: In the event of critical latency interference, the tempo can be reduced incrementally, thus extending the time between each new bar and granting time for the clearance of latency issues.

9 – Kick Drum (bar-length loops): During a period of critical latency interference, a single-bar loop of ¼-note kick drum events is triggered to maintain the “genre”.

10 – Stop Buttons: During any period of critical latency interference, each voice (beats, percussion, bass, harmony or melody) can be stopped individually to reduce the musical texture, or to stop harmonic dissonance and stuck notes.
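As a quick check on the figure quoted in item 1, one 4/4 bar at 120 bpm lasts 4 × (60,000 ms ÷ 120) = 2,000 ms. The tiny sketch below just encodes that arithmetic; it is an illustration of the timing window, not code from the Telemidi system.

// One-bar quantisation window: how long a triggered change waits, worst case.
function barDurationMs(bpm, beatsPerBar = 4) {
  return beatsPerBar * (60000 / bpm); // ms per beat times beats per bar
}

console.log(barDurationMs(120)); // 2000 ms at 120 bpm, the window quoted above
console.log(barDurationMs(100)); // 2400 ms: slowing the tempo buys extra headroom (see LAS 8)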

RTP MIDI

+ MacOS – AppleMIDI accessed through ‘Audio MIDI Setup’

+ Windows – rtpMIDI software used (created by Tobias Erichsen)

Success of Performance

Two performances were undertaken in the Telemidi research, the first with the performers 7.5 km (4.6 mi) apart and the second 2,730 km (1,696 mi) apart.  Both were recorded and then analysed in detail (see video below), whereby aspects of performance parameters and methods were identified alongside several fundamental principles of Telematic performance.  A stream of audio was generated from each node, and each has been analysed in the video to identify the interplay between the two musicians, highlighting any variations in the music created and recognizing artefacts of network performance.  It was noted that the music generated at each node was strikingly similar, although subtle variations in the rhythmic phrasing of bass, harmony and melody were common.

The Telemidi system ably accommodates all but the most obtrusive latency yet provides each musician with the capacity to co-create and Comprovise music in real-time across significant geographic distances.  These performances showed constant interplay and the exchange of musical ideas, as can be seen in the 16 minute analysis video below, leaving the door open for many exciting possibilities in the future.


16min Video Analysis


Future Plans

The principles of Telemidi were the focus of Matt Bray’s 2017 Masters research.  Now that the Telemidi process has been proven to function, the landscape is open for musicians to create and interact with each other in real-time scenarios regardless of their geographic locations.

The next steps are to:

+ Recruit keen MIDI-philes from around the globe to share and exchange knowledge in regards to the potentials of the Telemidi process (if this is you, please visit www.telemidi.org and leave a message)

+ Identify the most stable, low latency connections to The Internet available, to begin test performances across greater geographic regions

+ Refine and curate the infrastructure to suit various genres (from EDM to contemporary, also including live vocalists/musicians at each location)

+ Produce and promote simultaneous live performance events in capital cities, first nationally (Australia) and then internationally.

If you are at all interested in contributing to or participating in the Telemidi process, please contact me, Matt Bray, at www.telemidi.org. I’d love to hear from you and see what possibilities are achievable. 

Thanks for checking out Telemidi!!

Matt Bray


Google Releases Song Maker with Web MIDI

Google Creative Lab, Use All Five, and Yotam Mann launched a new browser-based music sequencer called Song Maker. 

It’s a classic grid style sequencer and allows anyone to easily create simple grooves on the web.  You can even connect your MIDI keyboard or other controllers to input notes via Web MIDI. 

When you’re finished you can save your Song Maker groove, share it on Facebook and Twitter or even get the embed code to embed Song Maker on your website.  Check it out below!




Creating a MIDI Collaboration App using Express.js & Socket.io

Want to make music with your friends on the Internet? This tutorial from Andrew Bales will show you how to build an app that allows you to collaborate with friends to make music in real time! 

Try out a live demo: https://midi-collaboration.herokuapp.com/

Check out the code: https://github.com/agbales/web-midi-collaboration

by Andrew Bales

As you can see, playing a note lights up a key in pink and displays the input data in the list. If another user joins the session, their input will light up blue and their data will appear in the list as blue entries.

We’ll break the process into 5 parts:

      1. Creating an App with Express.js
      2. Connecting a Midi Controller
      3. Adding Socket.io for Collaboration
      4. Styling
      5. Deploying to Heroku

Step 1: Creating an Express.js App

We’ll get started by making a directory and initializing the project with these three terminal commands:

mkdir midi-collaboration
cd midi-collaboration
npm init

The NPM utility will ask for a bit of information to set up the package.json file. Feel free to provide that info or just hit enter to leave these fields blank for now.

Next, add Express:

npm install express --save

In package.json, you will now see Express included as a dependency.

In the root folder, make server.js with:

touch server.js

In server.js, add the following:

var express = require('express');
var app = express();
const port = process.env.PORT || 8080;
var server = app.listen(port);
app.use(express.static('public'));

This creates an instance of Express and sets the port, defaulting to 8080 when process.env.PORT isn’t set. app.use instructs Express to serve static files from the ‘public’ folder. If you ran the server now, you’d get an error. That’s because we don’t yet have a public folder!

Public folder

Let’s make that public folder:

mkdir public
cd public

Inside, we’ll make index.html, index.js, and a CSS folder containing style.css:

touch index.html
touch index.js
mkdir css
cd css
touch style.css

The ‘public’ folder should look like this:

public
--> index.html
--> index.js
--> css
    --> style.css

Index.html should link to style.css and index.js. So let’s add the following:

<!-- /public/index.html -->
<html>
<head>
<title>Midi Collaboration</title>
<link rel="stylesheet" href="/css/style.css">
</head>
<body>
<div>
<h1>MIDI Collaboration</h1>
<ul id="midi-data">
</ul>
</div>
</body>
<script src="/index.js" type="text/javascript"></script>
</html>

This makes a simple header followed by an empty unordered list — this is where we’ll log our midi data. After the body, it references index.js. For now, let’s add a simple console.log to index.js so that we’re sure it’s working properly:

// public/index.js
console.log('index.js is connected!');


Update package.json

Finally, we want to update package.json so that we can run our server with the terminal command ‘npm start’. Add the following line to “scripts”:

"start": "node server.js"

Your package.json should look like this:

{
"name": "remaking-midi-maker",
"version": "1.0.0",
"description": "",
"main": "index.js",
"scripts": {
"test": "echo \"Error: no test specified\" && exit 1",
"start": "node server.js"
},
"author": "",
"license": "ISC",
"dependencies": {
"express": "^4.15.3"
}
}

Make sure you’re in the midi-collaboration folder and start up your app:

npm start

Great! It’s now hosted at http://localhost:8080/

Step 2: Connecting your Midi Controller

We’ll use the Web Midi API to connect a USB midi controller to the app. If you’re seeing ‘index.js is connected!’ in your console, let’s replace the console.log with:

// public/index.js
var context = new AudioContext();
var oscillators = {};
var midi, data;
if (navigator.requestMIDIAccess) {
navigator.requestMIDIAccess({
sysex: false
}).then(onMIDISuccess, onMIDIFailure);
} else {
console.warn("No MIDI support in your browser");
}
function onMIDISuccess(midiData) {
console.log(midiData);
midi = midiData;
var allInputs = midi.inputs.values();
for (var input = allInputs.next(); input && !input.done; input = allInputs.next()) {
input.value.onmidimessage = onMIDImessage;
}
}
function onMIDIFailure() {
console.warn("Not finding a MIDI controller");
}

If your midi controller is hooked up, you should see the console.log for the MIDIAccess object:

In onMIDISuccess(), a for loop listens for midi messages. Right now it should be causing an error in the console. Why? Because we haven’t defined what to do when it receives a midi message.

Let’s create the onMIDImessage function referenced in the loop:

function onMIDImessage(messageData) {
var newItem = document.createElement('li');
newItem.appendChild(document.createTextNode(messageData.data));
newItem.className = 'user-midi';
document.getElementById('midi-data').prepend(newItem);
}

This function creates a new <li> element. It appends a text node with our midi data. It adds a css class of user-midi (this will be important later). Finally, it adds that new list item to the unordered list with the id “midi-data”. 

Pretty cool, eh?

But what do these numbers mean? Also: where’s the sound?

MIDI Protocol

MIDI Protocol is a rabbit hole all its own, but for our purposes you can understand the numbers with a simple chart:

On / Off → 144 / 128

Pitch → 0–127

Velocity → 0–127

When working with this data, we’ll treat the first number like an on/off switch. 144 = on, 128 = off. The second number is the range of pitches. The final velocity input could also be understood as volume in our usage here.

If you’d like a more in-depth look at midi, here’s a good place to start.

Sound

The MIDI data is not the sound, but a set of directions: ON/OFF, PITCH, VELOCITY. We’ll need to make our own synth that can turn this information into musical tones.

First, let’s convert those value sets from an array into a ‘note’ object that we can pass to our sound player. In onMIDImessage function, add:

var d = messageData.data; // Example: [144, 60, 100]
var note = {
on: d[0],
pitch: d[1],
velocity: d[2]
}
play(note);

Above, the variable ‘d’ is assigned the incoming data: an array of three numbers. Those three values are accessed by their index and assigned as values for the object properties. The note object is then passed to the play function. Let’s write that function:

function play(note){
switch(note.on) {
case 144:
noteOn(frequency(note.pitch), note.velocity);
break;
case 128:
noteOff(frequency(note.pitch), note.velocity);
break;
}
    function frequency(note) {
return Math.pow(2, ((note - 69) / 12)) * 440;
}
   function noteOn(frequency, velocity) {
var osc = oscillators[frequency] = context.createOscillator();
osc.type = 'sawtooth';
osc.frequency.value = frequency;
osc.connect(context.destination);
osc.start(context.currentTime);
}
   function noteOff(frequency, velocity) {
oscillators[frequency].stop(context.currentTime);
oscillators[frequency].disconnect();
}
}

This function checks whether the note is on (144) or off (128) and triggers the appropriate command. Both noteOn and noteOff reference the two global variables (context and oscillators) that we established at the top of index.js to handle the starting and stopping of sound.

The frequency function converts the MIDI note number into a frequency in hertz; for example, note 69 (A4) maps to 440 Hz and note 60 (middle C) to roughly 261.6 Hz. If you’re curious, you can read more about it here.

Your app should now play like a synth. Huzzah!

Step 3: Adding Socket.io for Collaboration

Now it’s time for the fun stuff: syncing multiple users in a session! Socket.io will help us do that by enabling ‘real-time bidirectional event-based communication.’ That means we can send and receive midi messages from any connected browser as they occur. Add Socket.io to the project:

npm install socket.io --save

You’ll also need to add this inside the head of index.html:

<!-- public/index.html -->
<script src="https://cdnjs.cloudflare.com/ajax/libs/socket.io/2.0.3/socket.io.js"></script>

Note: I’m using version 2.0.3 — be sure your versions of Socket.io match your package.json and html script. You might want this link to the CDN.

For Socket.io to connect users we need to assure that:

1) Users can both send and receive notes

2) The server listens for user input and broadcasts it to other users

User send/receive

First, let’s make sure the user emits a signal every time they play a note. This can be accomplished by opening index.js and adding a single line to the function onMIDImessage. Be sure to add this after the note object has been defined.

// public/index.js (inside onMIDImessage)
socket.emit('midi', note);

We also want to play any external notes that come in from other users. Inside index.js, add the following:

// public/index.js
var socket = io();
socket.on('externalMidi', gotExternalMidiMessage);
function gotExternalMidiMessage(data) {
var newItem = document.createElement('li');
newItem.appendChild(document.createTextNode('Note: ' + data.pitch + ' Velocity: ' + data.velocity));
newItem.className = "external-midi";
document.getElementById('midi-data').prepend(newItem);
 play(data);
}

When we receive an ‘externalMidi’ message from the server, it triggers gotExternalMidiMessage. This function should look familiar — in fact, we could refactor this later, but for now we’ll repeat code for clarity. It displays the external note in the view in a manner that’s almost identical to how we treat midi input from our own keyboard. However, we’ve given the <li> a class name ‘external-midi’. This will be important in a moment when we add styles to differentiate between our midi input and that of outside users.

Finally, the note is passed to the player to trigger a sound.

Server

Now let’s make a bridge between users. We want to handle any incoming signals and pass them to any other sessions.

In server.js, require Socket.io and add the function newConnection. This will be triggered when the server gains a new connection.

// server.js
var socket = require('socket.io');
var io = socket(server);
io.sockets.on('connection', newConnection);
function newConnection(socket) {
console.log('new connection from:' + socket.id);
 socket.on('midi', midiMsg);
function midiMsg(data) {
socket.broadcast.emit('externalMidi', data);
}
}

The newConnection console.log will appear in the terminal every time a new connection is made. We also have socket.on listening for messages from any user and then triggering midiMsg, which broadcasts that data to every user except the original user.

With that, we’re all set!

Step 4: Styling

If you’re just interested in seeing your notes differentiated from other players, you can take a shortcut here and simply add these classes to your style.css:

/* public/css/style.css */
.user-midi {
color: green;
}
.external-midi {
color: blue;
}

That’s it! Now you will see your own input in green and any external signal in blue. Feel free to skip to the next step and deploy your app to Heroku!

If you’d like to create a keyboard interface, let’s keep rolling:


Build a keyboard

The keyboard we’ll make builds off of @baileyparker’s CodePen project. We’ll make some design and functionality changes to fit our usage. To start, open up this pen in a separate window:

HTML

The div with the class ‘piano’ houses all of our keys, which are labeled with a variety of selectors. Follow the html structure in the pen and paste the piano div and all its keys into your document (/public/index.html).


CSS

This is a big update for our styles. We’re importing a google font, assigning a background image to the body, and finally giving colors to differentiate between .user-midi (pink) and .external-midi (blue) signals. The ul and li elements have been styled so that they’ll slant back at a pleasing angle.

The keyboard styling takes up the remainder of the CSS. Worth noting here are the ‘active’ classes like ‘.piano-key-natural:active’ or ‘.piano-key-natural-external:active’. These are triggered by the Javascript when a note is played. If it matches the data number, the CSS will activate that key to be pink for your notes and blue for any external input.

When you copy the CSS from the pen into your project’s style sheet (/public/style.css), be sure to follow the notes included inside. Most importantly, you’ll need to update the path to the background.

JS

You’ll do the same thing for the Javascript: cut and paste it into /public/index.js below the code we’ve written. Again, it is important to read and follow the few comments within the code.

This code will manage midi controller connections. But you’ll want to focus in on two functions: onMidiMessage & updateKeyboard. The former handles local input and applies the appropriate class to light up the keys. The latter does the same thing (but with a different class) for external messages.
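The real implementations live in the CodePen, so the following is only a rough, hypothetical sketch of what updateKeyboard does conceptually: look up the key element for the incoming pitch and toggle an “external” highlight class on note-on and note-off. The selector and class names are assumptions for illustration, not the pen’s actual code.

// Hypothetical sketch only -- the real implementation is in the referenced CodePen.
function updateKeyboard(msg) {
  const [status, pitch] = msg.data;
  // Assumed markup: each key carries a data-note attribute with its MIDI number.
  const key = document.querySelector(`.piano [data-note="${pitch}"]`);
  if (!key) return;
  if (status === 144) {
    key.classList.add('piano-key-external-active');    // assumed highlight class
  } else if (status === 128) {
    key.classList.remove('piano-key-external-active');
  }
}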

To get external notes to light up the keyboard, we need to return to the function gotExternalMidiMessage. After we call play(), add the following code:

var msg = { }
msg.data = [];
msg.data.push(data.on);
msg.data.push(data.pitch);
msg.data.push(data.velocity);
updateKeyboard(msg);

The keyboard needs a specific kind of object data structure to update, so we create msg and fill it with data from our note. That msg object is passed to updateKeyboard which lights up the keys in blue for that external signal.

You should now see your keys light up pink for your own input! When we connect to Heroku and add more users, you’ll see those external midi messages light up in blue!

Step 5: Deploying to Heroku

Instead of walking through each step here, I’ll direct you towards the comprehensive documentation at Heroku. They’ve done a great job reviewing options for deployment:

Deploying with Git | Heroku Dev Center
Git is a powerful decentralized revision control system, and is the means for deploying apps to Heroku. (devcenter.heroku.com)


Summary

If all went well, you should now have an app that passes signals from one user to the next and plays notes from every user. The keyboard should light up with different colors for your input and that of external sources. 

 I’m excited about this collaborative app. It has been a real thrill hooking it up and inviting friends from other states to log on and play music! Looking back, I also see ways that the data could be passed more efficiently and areas where the design could be enhanced — it’s not friendly to smaller screens, for instance.

I’d love to hear how you’re using the app, and I hope it has given you a better understanding of how to combine Express.js, Socket.io, and the Web MIDI API!

A Web Developer’s Tale of the Octapad Revival

Two months ago, I was tasked with presenting a talk on the Web MIDI API to an amazing crowd of music-loving web developers. Obviously, I wanted the presentation to be exciting, so my first idea was to find a cool MIDI controller that I could use to demo the API with. What kind of controller could fit the bill? What about an old… keytar? Yes, that would be awesome! Nobody’s using those anymore so they will be dirt cheap, right? So I hit eBay and here’s what I found…

An original Yamaha KX-5 carefully laid out on purple velvet. It couldn’t get any more perfect than that, could it? But wait, are they really asking 350$ for it? Plus 130$ for shipping! That’s nearly 500 US dollars… which means about 630 Canadian dollars! Whoaaa, there is no way I’m spending that kind of money on a 45-minute presentation.

What then? Hmmmm.

This is when I remembered the era before I became a web developer when drumming was my whole life. More importantly, I remembered that I already had a cool MIDI controller in my possession: my good old Roland Octapad II. The Octapad is an eight pad percussion controller made by Roland starting in the 80s. How about using that for the demo? That would be cool. But where on Earth did I put this thing? After a few hours of searching, I finally found it hidden in the garage. I plugged it in and it powered up. Oh yeah! I started playing it and then I realized that time had taken its toll. No matter how hard I stroked the pads, the hits barely registered. Damn, this thing is busted. Should I be surprised? After all, this device is at least 25 years old and has been sitting in a damp garage for over 10 years.

But still, it would be so cool to use it for the conference demo. So I started digging around on the Internet to see if this thing could be revived somehow. After an hour or so of reading outdated forum posts and barely-related blog articles, I stumbled upon a post from this guy who said the problem is easy to fix. The piezos are dead he said. Just get new ones and you will be good to go. What have I got to lose, right?

A bit of Googling tells me that piezos are simple vibration-sensitive sensors, precisely the kind you would expect in a percussion controller. Because I had no idea what kind or size of piezos I should buy, I decided to open the unit up. Surprisingly, this was very simple. All it took was a Phillips screwdriver and I was in.

I soon realized that it was a good idea to first check inside. Being an 8-pad controller, I was expecting to buy 8 piezos. However, as you can see on the picture above, 10 piezos are needed. I’m guessing the extra 2 are used to counter any crossover that could happen between the pads through the casing. Furthermore, opening it allowed me to measure the size of the piezos. In the end, I ordered twelve 35mm piezos from Digikey. I bought an extra 2 to be on the safe side.

While getting inside the unit wasn’t hard, getting to the piezos was a little bit harder. As you can see, two of the piezos are hidden under a board which needs to be removed in order to gain access to them. As a matter of fact, the whole unit pretty much needs to be dismantled to be able to extract the old piezos and put in the new ones. If you are attempting this operation, I urge you to do as I did and take pictures all the way through the operation. This way you will know which screws (there are various types and lengths), connectors and daughterboards go where. I’m soooo glad I did!

The dismantling operation wasn’t hard but I was extra careful in the way I handled all the various pieces. I didn’t want to lose anything or forget where something was going. Then, at one point, I realized that the only way to go further was to actually desolder the piezos from the central board strip (a.k.a. Pad-8 Sensor Board). This was the point of no return. As you can see in the picture below, in order to remove the sensor board and get to the pads, you must desolder the piezos; there’s no way around it. So I did. 

Note that all the white wires are connected to the board’s center strip while the black wires have their own strips leading to separate cables on the left side. This makes sense: one common ground for all and a separate signal wire for each pad.

Once the sensor board is removed, you can unglue the piezos from the pads. This can be quite scary. Especially when all you are left with is a pile of dead piezos and the conviction that you are never going to be able to put all this back together…

As you can see above, the piezos are affixed to the pads using some sort of double-sided tape. I had no idea which kind of tape was appropriate so I bought Scotch-Brand 3M 1″ Permanent Mounting Squares (cat. 111C). If you use them, you will want to trim the corners of the squares so they fit snugly inside the center circle of the piezo. The idea is for the outer ring to vibrate freely.

Obviously, you also need to solder the new piezos to the sensor board. The piezos I bought came with short and flimsy wires which I did not trust. So I opted to use sturdier wiring. I had some speaker wire on hand so I used that. Just be sure to make the right connections. The inside ring of all piezos should be connected to the shared central strip on the sensor board, while the outer ring of each piezo should be connected to its own individual strip. In the end, my soldering job was a bit messy but I made sure the connections were solid and not touching other conductive strips (this is very important!).

I then put the Octapad back together, plugged it in and crossed my fingers… Guess what? It worked. In fact, it might now be working better than ever before. Nice.

Obviously, I used it during my talk and I had a blast. Attendees also had a great time witnessing how a 25 y/o piece of hardware (brought into this world before the World Wide Web even existed) could trigger sounds and visuals inside a web page running in Google Chrome.

The moral of the story, I guess, is that well-designed and proven technologies can, and often do, withstand the test of time. They might need a little love along the way but don’t we all?

If you are a musician paying the bills doing web development work, I urge you to dust off your old MIDI devices and hook them up to your browser. You will be amazed at what can be done with the Web MIDI API. If you are curious to know how this is possible, check out the library I created that makes it very easy to use the Web MIDI API. You can also take a look at the slides from my presentation.

Keith McMillen combines Leap Motion and Web MIDI

Keith McMillen Instruments shared this short demo of gestural mixing, using their K-Mix programmable mixer, a Leap Motion controller and Web MIDI.

MIDI gestural control really seems to be taking off recently, enabled by wireless BTLE MIDI, improvements in gesture recognition technology and advances in sensors. In this case, KMI is using a Leap Motion controller and Web MIDI to do gestural mixing. 

Leap Motion allows you to get a staggering amount of detail and data from the movement of your hands, fingers, and joints, but in this project, we’re only interested in three things:
1. which hand we’re using
2. what finger we’re using
3. whether our hand is closed into a fist

by ANDREJ HRONCO
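KMI's demo code isn't reproduced here, but those three pieces of information are straightforward to read from a Leap Motion frame with the leapjs library, and Web MIDI can forward them as control changes. The sketch below is a hedged illustration: it assumes the leapjs script is loaded (global Leap), and the CC number, thresholds and channel mapping are arbitrary choices, not KMI's K-Mix mapping.

// Sketch using leapjs + Web MIDI; CC number, thresholds and channels are assumptions.
navigator.requestMIDIAccess().then((access) => {
  const output = access.outputs.values().next().value; // first available MIDI output

  Leap.loop((frame) => {
    for (const hand of frame.hands) {
      const whichHand = hand.type;                      // 'left' or 'right'
      const isFist = hand.grabStrength > 0.8;           // 1.0 means fully closed
      const extended = hand.fingers.filter((f) => f.extended);

      if (output && !isFist && extended.length > 0) {
        // e.g. map the first extended finger's height (mm) to a fader value (CC 7)
        const height = extended[0].tipPosition[1];
        const value = Math.max(0, Math.min(127, Math.round(height / 4)));
        output.send([whichHand === 'left' ? 0xB0 : 0xB1, 7, value]); // CC on ch 1 or 2
      }
    }
  });
});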

KMI has a great series of blog articles about Web MIDI.  Here is a summary and links to that series. 


...

K-Mix API, Part 2 – Controlling K-Mix with Leap Motion and the Web MIDI API | Keith McMillen Instruments

If you’ve ever fantasized about mixing with your hands, similar to how you control a Theremin or in the movie Minority Report, your dreams are now closer to reality! With the latest K-Mix update, which enables sending MIDI messages to fully control K-Mix, combined with the K-Mix API and the incredible Leap Motion controller, I’ll demonstrate how you can control K-Mix with only your hands and fingers, without touching K-Mix at all! K-Mix used with the K-Mix API brings a new meaning to the term ‘Programmable Mixer’.


...

K-Mix API, Part 1 – Web MIDI Control Surface | Keith McMillen Instruments

One of K-Mix’s most powerful features is its ability to be programmed and controlled from any DAW. With the power of Web MIDI in your browser and the K-Mix API, K-Mix becomes the first audio mixer that’s fully controllable with JavaScript. In Part 1, I’ll go over the basics of using the K-Mix API and using K-Mix as a control surface for a web app.


...

Making Music in the Browser: Web Audio/MIDI – Amplitude Modulation | Keith McMillen Instruments

Amplitude Modulation is a simple concept that can yield harmonically rich and bizarre timbres not easily achieved via other methods. In this article we’ll expand on the topics introduced in Simple Synthesis – Amplitude Modulation and play around with the concepts behind AM Synthesis. Modulation sources do not need to be low-frequency oscillators or envelope generators.


...

Making Music in the Browser: Web Audio/MIDI – Envelope Generator | Keith McMillen Instruments

In our first Simple Synthesis Addendum we learned how to connect a VCO to a VCA and control their ‘frequency’ and ‘gain’ AudioParams using a MIDI controller via the Web MIDI API. Good stuff! We now have a simple synth we can play. In this post we’ll learn how to shape our notes by building an Envelope Generator with configurable attack, decay, sustain and release using the Web Audio API’s scheduling methods. We’ll also give our Envelope Generator a ‘Mode’ setting, which will give us the ability to create some really long envelopes to play with.


...

Making Music in the Browser: Web Audio/MIDI – VCO/VCA | Keith McMillen Instruments

Emmett Corman has a great introductory series on the basics of synthesis (using modular synths), called Simple Synthesis. I thought it would be of value to those without access to the hardware to be able to explore and interact with the concepts that Emmett covers directly in the browser, using the Web MIDI and Web Audio APIs.


...

Making Music in the Browser – Web MIDI API | Keith McMillen Instruments

This opens up a huge variety of possibilities for not only art and music in the browser, but also allows any hardware that uses MIDI as its communication platform to control and be controlled by your browser. MIDI.org itself says “the Web-MIDI API is the most significant advancement of MIDI since… MIDI itself!”


...

Manipulating MIDI with Pure Data | Keith McMillen Instruments

Working in technical support for KMI, I am often confronted with requests for functionality from some of our devices that we didn’t include for one reason or another. There’s a variety of tools that you can use for this task, such as Max, Bome’s MIDI Translator or Pure Data. All of these software environments can do wonderful creative things with MIDI information, but one of them has a defining characteristic: Pure Data is free, open-source, cross-platform software. This means that you can make a solution by yourself, today, for free, that can run on Mac, Windows, Linux and even on a Raspberry Pi. In this article (and maybe more) we’ll look at the basics of manipulating MIDI in Pure Data to give the SoftStep2 four triggers on each pad.


...

Sequencing the SEM: MIDI to CV Conversion | Keith McMillen Instruments

The best way to unlock the hidden potential of any semi-modular synthesizer is by sequencing. This post will describe the necessary steps to use the QuNexus as a MIDI-to-CV converter along with Ableton Live to control the legendary Oberheim Synthesizer Expander Module.
Before you begin, make sure you have an SEM.


...

Interfacing with MIDI Hardware Using M4L: The Meeblip Anode | Keith McMillen Instruments

With the advent of small, portable, MIDI enabled analog synthesizers, computer musicians now have the option of controlling a lot more than VSTs. If you have a USB MIDI interface (or other means of getting MIDI from your computer to 5 Pin MIDI connections), you can leverage the sequencing power of Live to control all of the MIDI enabled parameters on your synth. In this article, we’ll look at how to create a MIDI device using M4L that allows us to access all of the parameters on the Meeblip Anode, as well as how to add modulation sources to allow for more sonic options.

About Web MIDI

The Web MIDI API connects your MIDI gear directly to your browser. 
Your browser connects you to the rest of the world.

MIDI hardware support has been available for a long time in Windows, Mac OS, iOS and most computer/tablet/smart phone platforms through USB, WiFi and even Bluetooth interfaces. But until now, there has been no standard mechanism to use MIDI devices with a Web browser or browser-based Operating System.

The Web Audio Working Group of the W3C has designed the Web MIDI API to provide support for MIDI devices as a standard feature in Web browsers and operating systems across multiple hardware platforms.

Google has led the way in supporting the inclusion of MIDI in the Web platform, both by contributing to the specification and by shipping the first implementation of the Web MIDI API (in Chrome v.43 for Windows, OSX, and Linux), continuing to demonstrate the company’s interest in helping musicians interact with music more easily using the Web.

Being able to connect to local MIDI hardware will increase the creation and distribution of music-making applications for PCs, tablets and smart phones. It also means that popular MIDI hardware can be used to control any kind of software in the browser (using physical buttons and knobs instead of on-screen sliders, for example).
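As a concrete illustration of that last point, a few lines of Web MIDI are enough to drive an on-screen control from a hardware knob. In this sketch the CC number (1, the mod wheel) and the element id are arbitrary assumptions; the range input is assumed to have min="0" and max="127".

// Drive an HTML range input from a hardware MIDI knob (CC 1 assumed).
const slider = document.getElementById('volume-slider'); // hypothetical element, min 0 / max 127

navigator.requestMIDIAccess().then((access) => {
  for (const input of access.inputs.values()) {
    input.onmidimessage = ({ data: [status, cc, value] }) => {
      const isControlChange = (status & 0xf0) === 0xB0; // control change on any channel
      if (isControlChange && cc === 1) {
        slider.value = value; // 0-127 straight from the knob
      }
    };
  }
});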

For hardware device makers, instrument control panels and editor/librarians which previously needed to be produced in multiple versions can now be implemented once in HTML5, and consumers can run them on any Web device (tablet, computer, or smart phone) and even “live” over the Web.

And finally, since the browser is connected to the Internet, musicians can more easily share data and even connect music devices over a network. 

Where will Web MIDI take us?

Web MIDI has the potential to be one of the most disruptive music technologies in a long time, maybe as disruptive as MIDI was originally back in 1983. Did Dave Smith and Ikutaro Kakehashi envision a world with 2.6 billion MIDI-enabled smartphones? Definitely not! 

Here is an interesting video of someone using a commercially available MIDI controller to play a browser-based game. It just makes you think about all the possibilities in the future.

Here are some links to more Web MIDI resources.


...

BandLab: Music Starts Here

Develop your songs from inspiration to finished projects with our best-in-class Mix Editor. Pull from thousands of available beats and loops in our extensive library or connect an interface and record live audio.


...

Soundation — Make music online

Make music in your browser and collaborate with anyone on Soundation, a one-stop shop for audio samples, instruments, and effects.


...

Soundtrap Press – Images & Videos

Soundtrap is the first cloud-based audio recording platform to work across all operating systems, enabling users to co-create music anywhere in the world. The company is headquartered in Stockholm, Sweden. Soundtrap provides an easy-to-use music and audio creation platform for all levels of musical interest and abilities and is being used by the K-12 through higher-education markets. In December 2017, Soundtrap was acquired by Spotify. For more information, visit http://www.soundtrap.com

Soundmondo

Social Sound Sharing with Web MIDI for Yamaha Synths


...

Noteflight – Online Music Notation Software

Noteflight® is an online music writing application that lets you create, view, print and hear music notation with professional quality, right in your web browser.


...

Online piano lessons – Learn piano your way

Learn to play piano with online, interactive lessons and tutorials. Our in-depth courses will adapt and give you feedback. Play your first melody in minutes.


...

Play Drums Online – online rhythm game

Play drums online is an online rhythm game where you can learn to play along with the best songs. Set a new high score or practice your drum skills with your favorite artists and songs.


...

WebSynths : the browser-based microtonal midi instrument

websynths.com is a FREE, browser-based musical instrument, optimized for microtonal experimentation on multi-touch devices.

Here are some Web MIDI links for developers


...

Web MIDI API


Ryoya Kawai’s web music developers appspot site, with information in Japanese, English and Chinese.


...

Web MIDI: Music and Show Control in the Browser – TangibleJS

Chrome 43 officially introduces an amazing new feature: MIDI in the browser! For fans of physical computing, this is big news. Take advantage of it!


...

Keith McMillen combines Leap Motion and Web MIDI –

Keith McMillen Instruments shared this short demo of gestural mixing, using their K-Mix programmable mixer, a Leap Motion controller and Web MIDI. This article has links to all the great Web MIDI articles on the KMI site. 

Google Web Audio/MIDI Hackathon at Music China 2015

A special Web Audio/MIDI Hackathon was organized for web developers by Google at the Google Shanghai office. More than 70 developers from across China, from as far as Beijing and Zhangjiakou in northern China, and even one developer from the Netherlands, came to attend the event, which was led by Ryoya Kawai and Encai Liu of Yamaha to train the developers in the Web Audio and Web MIDI APIs. Bill Luan from Google’s Developer Relations team provided developers with MIDI devices to use (courtesy of Google, Yamaha, Korg, and CME). At the Hackathon, Mr. Kawai and Mr. Liu answered many questions from developers. Some of the developers attending the event demonstrated existing applications, from music-making to art installations, all using MIDI and audio technologies, to inspire others. Attendees were eager to get their applications completed for the contest, which will be judged by music and web industry experts from Yamaha, the MMA, and Google.