
MMA @ The 2014 NAMM Show

Beyond the MIDI Mockup: Achieving maximum depth, subtlety, nuance and expression using MIDI hardware and software.

This two-hour HOT Zone workshop session by composer Jerry Gerber provided tips and techniques for musicians, composers and sound designers to make realistic MIDI recordings using popular DAW software such as Sonar by Cakewalk. A more detailed session description is in a separate area of the site linked below. 

MIDI Innovations Exhibit

We followed our 2013 celebration of the incredible history of MIDI by showcasing some new and innovative MIDI products for 2014 NAMM:

Starr Labs showed their AirPower Series wireless transceivers for MIDI and USB-MIDI devices, along with Clipper, a wireless light-bar controller optimized to control Ableton sessions. Also on display was their iTar, a version of the acclaimed Starr Labs professional fret-board button system made for Android and iOS mobile devices. More info on all of these products is available at Starrlabs.com.



Kiss Box showed the CM-MIDI low-cost MIDI transceiver for RTP-MIDI networks, which uses the open standard RTP-MIDI protocol to connect multiple MIDI devices using Ethernet cables (and WiFi, via an access point). They also showed CV-Toolbox, an RTP-MIDI interface for analog synths with CV (control voltage) interfaces. More info at kiss-box.com.

MMA Annual General Meeting 2014

MMA President Tom White reviewed some of MMA’s activities for the 30th Anniversary of MIDI in 2013, and explained that MMA’s advocacy efforts, along with interoperability efforts, encourage the use of MIDI technology across many industries and applications. MMA also is responsible for making sure that the name “MIDI” is only identified with industry-approved and interoperable solutions. He said MMA exhibits at CES and NAMM have helped make MMA accessible to people who are interested in engaging with the organization, and have built familiarity with people that MMA may want to engage in the future.

Tom reported on a number of market development projects, including MIDI in China. Zhao Yitian (CME Pro) presented a letter from the Chinese Musical Instrument Association (CMIA) announcing formation of a MIDI Industry Committee to promote MIDI in China and support international cooperation. CMIA also invited MMA to participate in 2014 Music China by producing seminars, lectures, demos, etc. Tom said he and Mr. Zhao had spoken to many Chinese manufacturers of MIDI products at 2013 Music China, inviting them to become MMA Members.

Tom then provided a status update on MMA technical projects, including the MIDI Specification for IEC and the Web MIDI API. He also reviewed recent discussions between MMA and OS partners (Apple, Microsoft, Google), and explained that many developers are interested in doing MIDI over Bluetooth and there would be a separate session on that topic in the afternoon. Following the General Session the MMA members met in a private session and voted to adopt an Updated MIDI Electrical Specification (which is pending confirmation by AMEI). Members also reviewed current proposals and considered suggestions for new work items.

The AGM afternoon sessions were promoted as a MIDI Developer Conference in order to include companies that are not MMA Members. Topics discussed, and the companies that presented, were: MIDI over Bluetooth (Apple, Yamaha, Miselu); Web MIDI API (Google, AMEI, Livid); AVB-MIDI and USB-MIDI (Roland).

The Computer Chronicles: MIDI Music (1986)


This 30-minute TV show from 1986 is a window into the state of the art in MIDI music technology at the time.

Guests: Chris French, Music Software; Bob Moore, Hybrid Arts; David Schwartz, Compusonics; Chris Potter, Mimetics; Curtis Sasaki, Apple; Gary Kildall, Digital Research; Gary Leuenberger, Midi Revolution

Products/Demos: Casio SK-1 Synthesizer, Atari ST, Activision’s Music Studio, ADAP Sampler, DSP-1000, Apple II GS, Ensoniq Sound Chip, Soundscape, Commodore Amiga, EZ Track, Kidnotes


MMA at 2013 International CES – January 8-11, 2013

This year at CES we previewed our MIDI Makes Music promotional campaign, sponsored by a coalition of leading industry companies and organizations including Roland, Yamaha, Gibson, Fishman, Dave Smith Instruments, Korg, Dream, Mediamation, NAMM, AES, Keyboard Magazine, Electronic Musician Magazine and Robertson Communications.

MMA CES Booth 2013

Our booth featured graphics and handouts explaining that MIDI is a Grammy®-winning technology that enables people to use electronic musical instruments along with computers and mobile devices to compose, record, notate, arrange, perform and learn about music.

We also displayed our new “MIDI Makes Music” introductory video about the past, present, and future of MIDI technology. The video describes how MIDI was developed in 1983 and how it has become ubiquitous in music creation and production over the past 30 years.

MMA at 2014 International CES – January 7-10, 2014

MIDI Makes Music

At CES 2014 we continued our MIDI Makes Music promotional campaign with booth graphics and handouts explaining that MIDI is a Grammy®-winning technology, with photos and descriptions of typical applications and products that use MIDI technology.

We also displayed our “MIDI Makes Music” introductory video about the past, present, and future of MIDI technology. The video describes how MIDI was developed in 1983 and how it has become ubiquitous in music creation and production over the past 30 years.

A3E – Advanced Audio + Applications Exchange Boston, 23-24 September, 2014

OVERVIEW

A3E – the Advanced Audio + Applications Exchange is the first major industry conference focused on Audio Development and true Next–Gen Technology Innovations.

A3E features three conferences, which bring Artists, Developers and Manufacturers together to explore the impact of Algorithmic Intelligence, Mobile Technology, Controllerism & DJ Tech, High Performance DSP, & Cloud Technologies.

SUMMARY

A number of larger MMA member companies including Google, Microsoft, Analog Devices, and Roland exhibited and/or had representatives speaking on panels. Many smaller, forward-thinking software companies also exhibited or participated in the panels. Athan Billias attended on behalf of both Yamaha and MMA, and participated in some panels on behalf of MMA (to promote MIDI to developers).

The MMA was well represented in the program and all the signage. Dave Mash, VP of Technology at Berklee College of Music (and a member of the A3E Advisory Board), mentioned MIDI several times in his keynote presentation. Attendees appeared to be well aware of MIDI, but not necessarily of how MIDI is evolving.

The session titled “The Next 12 Months of Advanced Audio Development: Audio In The Web Platform” was attended by about 50 people, most of whom appeared to be developers. The panel featured Chris Wilson, the developer advocate at Google who has been pushing the standardization of the audio API and MIDI API in browsers. Chris did a great job explaining why these APIs are going to be important. Joe Berkovitz from Noteflight (notation software that runs in a browser) did a demo where he pulled out a MIDI keyboard and edited the notes via the network, which drew some oohs and aahs from the audience. Athan Billias, speaking for MMA, explained that browser-based apps are cheaper to produce and support since developers don’t need to maintain separate code for Macs and PCs. Browser apps also can be accessed anywhere there is an Internet connection, and have all of the added social and tracking aspects of any web page.
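To make the discussion concrete, here is a minimal sketch of how a browser app can receive notes from a MIDI keyboard using the Web MIDI API Chris Wilson has been championing. It assumes a browser that implements the API and is not the panel’s actual demo code:

```typescript
// Minimal sketch (assumes a browser that implements the Web MIDI API and
// TypeScript's DOM typings); this is not the panel's actual demo code.

async function listenForNotes(): Promise<void> {
  // Ask the browser for MIDI access; the user may be prompted for permission.
  const access = await navigator.requestMIDIAccess();

  // Attach a handler to every currently connected MIDI input port.
  access.inputs.forEach((input) => {
    input.onmidimessage = (event) => {
      const data = event.data;
      if (!data || data.length < 3) return;      // only handle 3-byte channel messages here

      const command = data[0] & 0xf0;            // upper nibble = message type
      const channel = data[0] & 0x0f;            // lower nibble = channel 0-15
      const note = data[1];
      const velocity = data[2];

      if (command === 0x90 && velocity > 0) {
        console.log(`Note On: note ${note}, velocity ${velocity}, channel ${channel}`);
      } else if (command === 0x80 || (command === 0x90 && velocity === 0)) {
        console.log(`Note Off: note ${note}, channel ${channel}`);
      }
    };
  });
}

listenForNotes().catch((err) => console.error("MIDI access unavailable:", err));
```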

During Q & A at the Web MIDI panel, Athan was asked if there was “something after MIDI”. Rick Cohen (MMA HD Protocol Working Group Chairman) happened to be in the audience, so together Athan and Rick explained that MMA was working on developing a new protocol that wouldn’t replace MIDI but would extend it and work with it, because people would still need and want to use the millions of MIDI 1.0 devices that already exist. The audience seemed very interested in the idea and quite positive about the news.

As a whole, people at the event seemed generally happy with MIDI for what they are doing, but some would like to see more capabilities. Some people did mention OSC as a possible alternative, but acknowledged that there is no interoperability with OSC. Throughout the event there were also times when people brought up the issue of improving audio performance on Android.

[Report prepared by Tom White from notes prepared by Athan Billias.]

MMA @ The 2015 NAMM Show, January 22-25, 2015, Anaheim, CA, USA

MIDI Innovations Exhibit Hall E #1086

Exhibitors were:

  • Sensorpoint: “Jambe” iOS Percussion Instrument
  • Bome: MIDI/USB/Ethernet/WiFi Translator Box
  • Ubisoft: “Rocksmith” guitar training & game software
  • Quicco: Wireless (BLE) MIDI Adapter “MI.1”
  • Psicraft: Synth Editing Software and Juno-106 MIDI Kit

There was also a MIDI JAM Area where attendees could play some of the latest MIDI-equipped instruments from Roland, Yamaha, CME, Muse Research, YouRockGuitar, and others (connected through a JamHub mixer).

HOT (Hands-On-Training) Zone Sessions

“Game Audio 101”

In this fun and informative presentation, game audio gurus Steve Horowitz and Scott Looney explained the evolving playing field of sound for games, highlighting both the business and technical challenges. Topics included Social Media, Mobile, and AAA games, with demonstrations of development tools such as Unity 3d, FMOD Studio, Fabric, and Master Audio.



“Dynamic Music for Games”

This talk explained the techniques and strategies for producing dynamic music scores. Topics presented included: Differences between interactive and dynamic music; Classic dynamic/interactive music forms; 10 Methods to reduce repetitions in music; Strategies to overcome technical constraints and limitations of current platforms. Scenarios were shown using Wwise, a sound engine used by hundreds of software titles including Ubisoft’s Rocksmith 2014, which was also being demoed in the MIDI Innovations booth (Hall E #1086).


MMA Annual General Meeting & Breakout Sessions

The Members of MMA meet annually during each Winter NAMM Show to share information and plan activities for the upcoming year. The Annual General Meeting of MMA Members began with the General Session at 9:00 am for members and invited guests. MMA President Tom White provided an overview and current status of each Technical Proposal being discussed in MMA:

  • #191 HD Protocol / #211 HD Tunnel SysEx Message (MIDI 1.0)
  • #200 IEEE AVB MMA Payloads
  • #202 MIDI Electrical Spec. Update
  • #203 IEC MIDI Specification
  • #206 Polyphonic Legato Messaging
  • #207 Arpeggiator/LFO Sync Recommendations
  • #208 USB MIDI Update
  • #210 Bluetooth MIDI

Tom also mentioned the importance of OS support for MIDI and described some of the issues and opportunities being addressed by Apple, Microsoft, and Google in cooperation with MMA.

Tom then explained that with the HD Protocol Specification nearing completion, it was necessary for interested product developers to consider how to make their products interoperable, how consumers will be able to figure out what to buy, and how systems will work together. He said HD capabilities go way beyond those of MIDI, so interoperability will be a bigger challenge, which may need to be addressed with additional device specifications and compliance testing. He pointed out that users and retailers will also need to be educated so they can explain and use HD products correctly. He added that the HD Protocol was not intended to replace the MIDI protocol, but people on forums have already been talking about “MIDI HD” and “New MIDI”, so some effort must also be made to control the message. Tom said MMA operates by consensus and so won’t simply mandate the rules for marketing HD products, but would ask all potential developers to decide together. He said the first such discussion opportunity would be an afternoon session of the AGM (see below).

Tom went on to explain that interoperability testing and market promotion/education are costly, and that MMA’s current membership revenues are not sufficient to fund those activities (nor any other new projects). Tom said part of the problem is that many companies in the MI industry perceive MMA as a “standards body” and are not interested in working on standards. It is also true that many companies who are not “Manufacturers” believe they are not the intended constituency for the MMA. Tom said changing these perceptions could encourage more companies (and even individuals) to support MMA’s market development and education efforts, if there were a way for them to do so separate from technical standards work. Tom said the Executive Board believes it is possible (and necessary) to address these challenges by rebranding the MMA (e.g. as the MIDI Association) and adding new classes of membership such as teachers, retailers, etc. He said the Board is looking for key people to join Advisory Committees representing these new membership classes, as well as regional committees (EU, China) and market area committees (DJ, lighting) to make sure that all stakeholders are represented.

Executive Board Members Bryan Lanser and Denis Labrecque spoke about the current state of MIDI education. Bryan said that some MI industry trade schools are not providing graduates with sufficient MIDI knowledge to be employable at MI companies, because there are no standards for what should be taught. Denis announced he would lead an MMA project to work with educators to establish standards for MIDI education that are endorsed and marketed by MMA (including certifying teachers and/or schools and/or students). Tom White added that misinformation about MIDI is also rampant on musician forums, and even among industry people, in part because most people learned about MIDI 25 years ago and so don’t know about new messages, transports, or applications. Tom observed that manufacturers stopped including informational materials in product boxes back in the 90’s, so an entire generation of consumers has come along since then that has not really been told much about MIDI. He said promoting accurate and current information about MIDI should be a priority for all stakeholders in MIDI, and could help MMA attract new members.

Tom then reported on MMA’s participation in Music China (Shanghai) last October, and said he was looking forward to the 2nd Annual “MIDI World” event this year. He also briefly described the MMA exhibits at CES and NAMM this year, and how helpful they have been at exposing more people to MIDI and to the MMA.

After electing directors (Executive Board and Technical Standards Board) the members met in a Technical Session to review open proposals for additions and clarifications to the MIDI 1.0 Specification, and discuss other issues.

In the afternoon, an HD Protocol Business & Marketing Issues Discussion was held, as explained above. Following that discussion interested parties met to learn more about eScore Standards Activity in W3C and IEC, and to discuss methods for performing Polyphonic Legato playback with MIDI messages.

Older MMA Press Releases: 1996-1998

Industry Players Agree on Common Format for Wavetable Synthesis (10/98)

At the 45th meeting of the Moving Picture Experts Group (MPEG), audio experts reached agreement on a common format for creating sounds with the popular “wavetable synthesis” technique. The agreement merges the best features of different specifications promoted by the MIDI Manufacturers Association (MMA) and Creative Technology Ltd. into a single format. This new format is called Downloadable Sounds Level 2 (DLS-2) and is known within MPEG as the MPEG-4 Structured Audio Sample Bank Format.

The DLS-2 format is an extension of the MMA’s DLS-1 format; it includes new features requested by MPEG, Creative, Microsoft, the MIT Media Laboratory, and MMA members. “The agreement on DLS-2 increases consumers’ access to high-quality interactive music content,” said Tom White, president and chief executive officer of the MMA. “Compatibility among multiple vendors was the compelling reason behind the MMA’s DLS Level 1 Specification, and the MMA is very pleased to have had this collaboration with MPEG in establishing a single new standard with an even higher level of performance.” The MMA is a nonprofit association which produces standards and recommended practices for musical instruments and digital music devices.

Wavetable synthesis is a popular and widely-used method for creating music in multimedia presentations, video games, and on the Internet. Short samples of recorded sound (wavetables) are accessed with MIDI instructions; this method provides the realism of recorded sound but at a significant savings in file size. Most multimedia PCs use this method for creating sound, and even more powerful versions are used by professional musicians to produce film scores and popular music recordings. The DLS format makes it possible for musicians composing for Internet or CD-ROM applications to use sounds of their own design, rather than limiting their compositions to the 128 General MIDI sounds that are typically available on multimedia computers.

“Microsoft is a firm believer in the value of open standards,” said Kevin Bachus, product manager for DirectX at Microsoft. “We made a commitment early on to provide support in the Windows operating system for the DLS format, and are pleased to have had MPEG’s collaboration in delivering a more advanced DLS standard to hardware manufacturers, software developers and composers. With the DLS support included in the DirectMusic application programming interface, musicians and programmers can easily add interactive music capabilities to applications developed for the Windows operating system.”

“At Creative, quality and flexibility of sound synthesis is paramount; we always give the strongest attention to the quality and flexibility possible in sound synthesis,” said Dave Rossum, founder of E-Mu Systems and chief scientist of Creative Technology Ltd. “The new wavetable format will provide the best features and sound quality for use by all PC-based and Internet musicians. The advanced capabilities of Creative’s popular SoundFonts 2.0 format are included in the new standard.”

The harmonized format is one component of a powerful and flexible suite of tools in MPEG-4 called Structured Audio, contributed for free to MPEG by the MIT Media Laboratory. In MPEG-4, wavetable synthesis can be used in conjunction with general-purpose software synthesis and mixed with compressed vocals or the sounds of natural musical instruments. The resulting soundtracks may be transmitted as part of a virtual-reality experience or used as accompaniment to interactive video presentations on the Internet.

MPEG-4 Structured Audio is based on a powerful sound-description language for very-low-bitrate coding of synthetic music and sound effects and “3-D” positional audio. MPEG-4 was ratified as a Final Draft of International Standard at the meeting in Atlantic City, and will be published in December. For more details on MPEG-4 Structured Audio, please visit http://sound.media.mit.edu/mpeg4.

“MPEG, as part of the international standardization community, is the ideal forum to bring together representatives from industry to arrive at common understandings and to agree on the best technical solutions,” said Leonardo Chiariglione, chairman and Convener of MPEG. “Everyone who cares about music synthesis should applaud the forward thinking of the companies involved. We are particularly grateful for the productive relationship with the MIDI Manufacturers Association.” MPEG is a subdivision of the International Organisation for Standardisation (ISO) chartered with the development of new standards for audiovisual compression and transmission.

The Downloadable Sounds Level 2 (DLS-2) specification is available from the MIDI Manufacturers Association.

SoundFonts is a registered trademark of Creative Technology Ltd. Downloadable Sounds and DLS are trademarks of the MIDI Manufacturers Association.

Manufacturers Endorse “3Dxp” Audio Acceleration API (7/97)

A “Who’s Who” group of 11 leading PC multimedia technology companies has endorsed “3Dxp”, a means for extending Microsoft’s DirectX 3.0 API to enable hardware acceleration of 3D audio in PC games. The 3Dxp DirectSound 3.0 API Extension is an open, royalty-free specification developed by members of the 3D Audio Working Group of the IASIG. The IASIG is comprised of 250 hardware and software developers as well as music composers and sound designers with a common interest in improving the quality of sound for interactive media, and is a project of the MIDI Manufacturers Association.

Without hardware acceleration, applications must use the host processor for 3D calculations via DirectSound 3D. The resulting high workload placed on the CPU is likely to result in lower quality audio processing or reduced visual frame rates. The solution is to use dedicated hardware to accelerate the 3D functions, yet each of the current 3D hardware implementations has its own interface. “It seemed to us that the largest obstacle developers faced in using 3D sound was picking between all the incompatible and competing technology suppliers”, said Tom White of the Interactive Audio Special Interest Group (IASIG) which released the specification. “What 3Dxp intends to do is level the playing field so that software developers will jump in and start using 3D audio in games.”

The 3Dxp specification has been made available to the public on the IASIG web site (http://www.iasig.org). Reference source code is available for developers from DiamondWare (at http://www.dw.com/DEV3D).

Downloadable Sounds Specification Approved (4/97)

The MIDI Manufacturers Association (MMA) announced a delay in releasing the Downloadable Sounds Level 1 (DLS-1) Specification approved by the MMA membership in January and previously expected to be available in April. “During our normal 60-day comment period we received requests for some modifications which would improve performance on existing products, and we felt a short delay was warranted,” said MMA president Tom White.

The DLS Level 1 specification has been eagerly anticipated by music and game software developers for achieving consistent and predictable playback of interactive sound tracks. Originally proposed by the MMA’s Interactive Audio Special Interest Group (IA-SIG), DLS-1 allows composers to deliver customized instrument sounds and sound effects to accompany MIDI data which can be played on DLS-compatible sound cards or software synthesizers.

DLS-1 was approved for adoption by the MMA membership in January 1997 during the MMA’s annual meeting held in conjunction with the NAMM International Music Products Industry trade show. The specification is now expected to be published on June 1. The MMA will also make available software developer kits which will include a tool for authoring DLS-1 files called “DLS Synth/Author”.

Downloadable Sounds Specification Established (5/96)

The MIDI Manufacturers Association (MMA) today announced the creation of a new advanced audio standard for multimedia hardware. Targeted for CD-ROM and Internet entertainment applications, the new specification will result in higher quality audio from wavetable synthesizers, without any incremental memory costs. The new industry standard Downloadable Sounds (DLS) format was developed in cooperation with members of the Interactive Audio Special Interest Group (IA-SIG) and by leading multimedia companies.

The Downloadable Sounds specification extends General MIDI by providing a means for game developers and composers to add their own sounds to the PC sound card, rather than relying on the fixed GM sound set. General MIDI is used in PC games for generating music scores, and is also very popular with musicians and hobbyists who use MIDI for composing or learning about music. With DLS, custom sounds can be created and existing instrument sounds can be augmented with special effects by simply downloading a new sample bank.

“Inconsistent and proprietary designs have stalled the widespread adoption of wavetable synthesis,” said Tom White, President, MIDI Manufacturers Association. “DLS 1.0 is the industry standard that will make wavetable and MIDI ubiquitous on mainstream consumer PCs. Consumers will experience enhanced interactive sounds beyond anything available today on the PC, and title composers can rest assured that the consumer will hear exactly what was intended.”

According to composer and recording artist Thomas Dolby, now President and CEO of Headspace, “Downloadable Sounds will give composers a universal delivery system for great-sounding music in computer games… I’m done apologizing for a string section that sounds like a squished bug!”

“We are excited about the MMA DLS 1.0 format,” said Eric Engstrom, DirectX Program Manager, Microsoft Corporation. “In our effort to provide the industry with a suite of high performance standards and compliant APIs for gaming and Internet applications, we intend to support DLS 1.0 in the DirectMusic API.”

Initial sound developer tools will be provided by Sonic Foundry, best known for “Sound Forge” sound editing software. Monty Schmidt, CEO, Sonic Foundry, said “We have been instrumental in driving this standard with the MMA. With our tools the incorporation of MIDI with DLS will be seamless and above all easy to use by developers.”

U.S. Copyright Office opinion equates MIDI files with CDs and audio cassettes

The MMA, with assistance from multimedia industry leaders such as Thomas Dolby/Headspace, Microsoft, Apple, Yamaha and Kurzweil, announced today at the Interactive Multimedia Association Expo that legal opinions from the U.S. Copyright Office state that MIDI files are subject to mechanical compulsory licenses when not accompanying a motion picture or other audiovisual work.

This groundbreaking and controversial decision will significantly lower publishers’ per-unit licensing fees for MIDI recordings of musical works. At the same time, it will allow for a substantial increase in the number of published MIDI files, increasing publishers’ overall revenues and bringing MIDI into the mainstream consumer audio market.

“MIDI technology can dramatically improve music education, games and Internet applications,” said Tom White, president of the MMA. “But until now, licensing for audio-only MIDI files has been difficult and expensive.”

According to Charlotte Douglass, principal legal advisor to the general counsel, United States Copyright Office: “The Office still considers the media upon which aural sequences are recorded (unaccompanied by visual images) to be phonorecords and that such media are subject to a mechanical license or compulsory license under Section 115. The output of Standard MIDI files are works of authorship copyrightable as sound recordings since the information in the file causes the sound device to render the pitch, timbre, speed, duration and volume of the musical notes in a certain order, as does a player piano in conjunction with a piano roll, or a compact disc player in conjunction with a compact disc.”

“This opinion clarifies for everyone that MIDI files are no different from other forms of audio,” said Brian Ward, special counsel to the MMA. “This has been the critical missing link for growth in consumer interactive audio applications.”

While removing barriers to the use of MIDI data in many areas, this opinion still leaves unresolved other creative control issues affected by a compulsory license. “Our intent is to continue our dialogue with publishers and songwriters to help create solutions which will allow everyone to benefit,” noted White.

Comprised of over 140 hardware and software companies from various industries, the MIDI Manufacturers Association (MMA) is dedicated to improving and standardizing the capabilities and marketability of MIDI-based products. Membership includes leading companies from every application of audio and MIDI technology, including stage and theater, music performance, home and studio recording, multimedia computing, film and broadcast, and others.

The MMA’s SMF Copyright and Licensing Committee was formed with the assistance of charter members Roland and Yamaha to communicate the interests of the music products industry and its customers to music publishers, artists, and copyright holders in hopes of developing a strong market for commercial MIDI files.

Manufacturers Unite to Promote Licensing of MIDI Files (1/96)

The MIDI Manufacturers Association (MMA) has announced an industry initiative to promote the licensing of commercial music in Standard MIDI file (SMF) format. The “SMF Copyright and Licensing Committee” was formed — with the assistance of charter members Roland and Yamaha — to communicate the interests of the music products industry and its customers to music publishers, artists, and copyright holders in hopes of developing a strong market for commercial MIDI files.

This effort is designed to replicate the market which exists in Japan and Europe, where floppy disks of music are treated just like audio CDs and cassettes, and are sold shrink-wrapped off store shelves for prices similar to CDs. In these markets there is also substantial secondary revenue from related services such as magazines and even on-line services devoted to hobbyists and casual listeners.

“To a great extent, the problem in the US and North America is that there is no standard set of laws and practices governing MIDI recordings (SMFs)”, said Tom White, MMA President and CEO. “This has severely hampered the development of several MIDI-related markets, including the sale of MIDI scores and instruments for rehearsal, live performance, karaoke, computer hobbyists, and end-user entertainment”.

Japan and parts of Europe enjoy healthy markets in the areas of MIDI music data sales, where a single mail order outlet can sell up to 10,000 disks a month. Desktop Music (DTM) sales in Japan hit $35 million in 1994 and were projected to reach $50 million in 1995. What’s more, 70% of the business is currently going to first-time buyers, and home computer sales are just starting to explode in Japan.

The MMA initiative will include lobbying for MIDI recordings to enjoy the same status as audio recordings for licensing and copyright protections. MMA Special Counsel for Intellectual Property, Brian Ward, is leading this effort. Equally important is an educational effort, aimed at record labels, music publishers, online content providers, and recording artists, to help them understand the growing interest in MIDI files and how this can be good for business. “The current confusion surrounding the application of MIDI in these markets is blocking its use, and in some cases, the licensing fees requested just don’t support a viable business model for these markets”, said Ward. “At the same time, the MMA must understand and address the legitimate concerns of rights-holders and develop recommended practices and new MIDI protocols if necessary to protect those rights”.

MMA publishes GM Survey and Developer Guidelines

The MMA Executive Board has completed a comprehensive survey of existing GM hardware and software in order to determine what level of consistency exists in current GM implementations. This data clarifies what is required to be GM compatible based on what products exist today.

“The GM specification wasn’t written as an instruction manual … it’s more like a road map” said Tom White, MMA President. “So we have a situation where companies don’t understand how to implement certain features, which aren’t detailed in the specification. We are responding by making specific recommendations which will help developers be compatible with the majority of products in use today.”

The survey covers synthesizers (receivers), sequencers (players), and scores (content). The data is compiled and reported along with objective and subjective evaluations, designed to identify potential problem areas and recommended actions for best compatibility. The GM Developer Guidelines and survey are available to MMA members and other interested parties.

NAMM & MMA agree to develop MIDI Education Program

NAMM has agreed to co-sponsor a program to be developed by the MMA, which will help grow the market for MIDI through dealer training and better understanding of end-user needs. The program intends to create new customers for MIDI products by demonstrating the benefits of MIDI without requiring the user to learn the technology. The dealer certification program will be aimed squarely at the millions of PC/multimedia system owners who are looking for more to do with their computers, and will show the salesperson how to demonstrate that MIDI technology can be applied to meet a customer’s musical interests. Training will include topics such as General MIDI, home recording, song writing, education and entertainment. Retailers will receive certificates identifying each graduate of the course on their staff as MMA Certified Instructors, and will be eligible to teach a similar course to their customers if desired.

MMA PARTNERS WITH THE IMA FOR EXPO ’96

The MIDI Manufacturers Association (MMA) will join with the Interactive Multimedia Association as a major participant in IMA Expo ’96, to be held in New York City’s Javits Convention Center, September 17-19, 1996. The Expo will provide a comprehensive forum for the diverse and rapidly growing $18 billion multimedia industry, which includes the CD-ROM, Internet delivery, on-line, broadband, and enterprise network systems markets.

“Computer and multimedia applications have become extremely important to a major segment of our membership,” said MMA President Tom White.

“We see a strong synergy between our organizations and major opportunity to combine forces for the benefit of both memberships,” said IMA President Philip V. W. Dodds. “The MMA’s participation in IMA Expo ’96 will significantly enhance our show. Music and audio are major elements in the multimedia equation, and the MMA, with its roots in the music industry, is rapidly becoming a key player within the multimedia industry.”

As part of the MMA’s participation, Tom White will join IMA Expo’s conference committee, working closely with the IMA to provide input on conference program content related to audio. White said “Together, our organizations will work to develop a program track which will educate and inform developers and producers on the tools available for audio production, as well as on new technologies and directions which promise to change how content is made and distributed.”

Position Statement on “XMIDI” (Source: MMA Technical Standards Board, c. 1996)

In response to magazine or online publications claiming that a new technology called “XMIDI” is poised to become a new standard for synthesizers, the MMA Technical Standards Board of Directors has released the following statement:

We genuinely applaud the effort of the developer for attempting to make MIDI into a different and, in their eyes, better technology. However, despite some very clever engineering on the part of the developer, extensive review and discussion by the MMA’s Technical Standards Board and many of our members indicates that XMIDI would create more problems than it would solve for the vast majority of current and future users of MIDI. What follows are four main reasons which have led to this conclusion:

1) MIDI is inexpensive and royalty free. These characteristics are considered vital to our membership and a prime reason for its acceptance and proliferation. A custom hardware solution from a single source would represent a 180-degree change in direction.

2) The non-orthogonality of the XMIDI interface makes it extremely difficult to write manageable software to parse it, and more importantly, to relate it to the user in a non-confusing manner. MIDI is now being evaluated for adoption in a number of high-volume markets where design simplicity is crucial. The Tech Board feels that introducing anything that risks increasing design difficulty and user confusion would compromise both the interests of the greater MMA membership and our customers.

3) The MIDI Specification is open for everyone to use. The requirement of secrecy agreements for each licensee of XMIDI is unacceptable. MIDI is based on the spirit of cooperation and consensus. Secrecy agreements would completely undermine this spirit.

4) The MMA membership has indicated many times that enhancements to MIDI should not increase the amount of data traffic on the 31.25 kbaud serial line. In our opinion XMIDI would clearly increase traffic a great deal, adding to the current problems of MIDI response time with dense controller activity.

In conclusion, we once again express our interest in any effort to design a low-cost, high-speed MIDI alternative that would be royalty and copyright free. We believe a design with such characteristics would be warmly welcomed by the MMA membership. We do not believe that XMIDI meets these requirements.

About MIDI, Part 1: Overview

MIDI (pronounced “mid-e”) is a technology that makes creating, playing, or just learning about music easier and more rewarding. Playing a musical instrument can provide a lifetime of enjoyment and friendship. Whether your goal is to play in a band, or you just want to perform privately in your home, or you want to develop your skills as a music composer or arranger, MIDI can help.

How Does MIDI Work?

There are many different kinds of devices that use MIDI, from cell phones to digital music instruments to personal computers. The one thing all MIDI devices have in common is that they speak the “language” of MIDI. This language describes the process of playing music in much the same manner as sheet music: there are MIDI Messages that describe what notes are to be played and for how long, as well as the tempo, which instruments are to be played, and at what relative volumes.

MIDI is not audio.

MIDI is not audio, so if someone says MIDI sounds bad, they really don’t understand how MIDI works. Imagine if you took sheet music of a work by Beethoven and handed it to someone who can read music, but has never played the violin. Then you put in their hands a very cheap violin. The music would probably sound bad. Now take that same piece of sheet music and hand it to the first chair of a symphony orchestra playing a Stradivarius and it will sound wonderful. So MIDI depends on the quality of the playback device and also on how well the description of the music fits that player.

MIDI is flexible

The fact that MIDI is a descriptive language provides tremendous flexibility. Because MIDI data is only performance instructions and not a digital version of a sound recording, it is actually possible to change the performance, whether that means changing just one note played incorrectly, or changing all of them to perform the song in an entirely new key or at a different tempo, or on different instruments.

MIDI data can be transmitted between MIDI-compatible musical instruments, or stored in a Standard MIDI File for later playback. In either case, the resulting performance will depend on how the receiving device interprets the performance instructions, just as it would in the case of a human performer reading sheet music. The ability to fix, change, add, remove, speed up or slow down any part of a musical performance is exactly why MIDI is so valuable for creating, playing and learning about music.
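As a simple illustration of that flexibility, the sketch below shows how a whole performance can be transposed to a new key or slowed down just by rewriting the event data. The MidiEvent shape here is a made-up, in-memory example, not any particular file format or API:

```typescript
// Illustrative only: a made-up in-memory event shape, not any particular
// file format or API, used to show why performance data is easy to edit.

interface MidiEvent {
  tick: number;    // time position in ticks
  status: number;  // status byte, e.g. 0x90 = Note On on channel 1
  data1: number;   // note number for note messages
  data2: number;   // velocity for note messages
}

// Move every note up or down by `semitones` (e.g. +2 shifts C major to D major).
function transpose(events: MidiEvent[], semitones: number): MidiEvent[] {
  return events.map((e) => {
    const type = e.status & 0xf0;                  // message type without channel
    const isNote = type === 0x90 || type === 0x80; // Note On / Note Off
    const note = Math.min(127, Math.max(0, e.data1 + semitones));
    return isNote ? { ...e, data1: note } : e;
  });
}

// Stretch the time axis to change tempo; notes, instruments and dynamics are untouched.
function changeTempo(events: MidiEvent[], factor: number): MidiEvent[] {
  return events.map((e) => ({ ...e, tick: Math.round(e.tick * factor) }));
}
```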

The Three Parts of MIDI

The original Musical Instrument Digital Interface (MIDI) specification defined a physical connector and message format for connecting devices and controlling them in “real time”. A few years later Standard MIDI Files were developed as a storage format so performance information could be recalled at a later date. The three parts of MIDI are often just referred to as “MIDI”, even though they are distinctly different parts with different characteristics.

1. The MIDI Messages – the software protocol

The MIDI Messages specification (or “MIDI Protocol”) is the most important part of MIDI. The protocol is made up of the MIDI messages that describe the music. There are note messages that tell MIDI devices what note to play, velocity messages that tell the device how loud to play the note, messages that define how bright, long or short a note will be, and Program Change messages that tell the device which instrument to play. So by studying and understanding MIDI messages you can learn how to completely describe a piece of music digitally. Look for information about MIDI messages in the “Control” section of Resources.
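For example, here is a rough sketch of what a few of these messages look like as raw bytes, following the MIDI 1.0 channel message layout; the helper function names are ours, not part of the specification:

```typescript
// Sketch of raw MIDI 1.0 channel messages; the helper names are ours, not part
// of the specification.

// Note On: status 0x90 + channel (0-15), then note number and velocity (0-127).
function noteOn(channel: number, note: number, velocity: number): Uint8Array {
  return new Uint8Array([0x90 | (channel & 0x0f), note & 0x7f, velocity & 0x7f]);
}

// Note Off: status 0x80 + channel, then note number and release velocity.
function noteOff(channel: number, note: number, velocity = 0): Uint8Array {
  return new Uint8Array([0x80 | (channel & 0x0f), note & 0x7f, velocity & 0x7f]);
}

// Program Change: status 0xC0 + channel, then the instrument (program) number.
function programChange(channel: number, program: number): Uint8Array {
  return new Uint8Array([0xc0 | (channel & 0x0f), program & 0x7f]);
}

// Middle C (note 60) played fairly loudly on channel 1, using program 0
// (General MIDI instrument #1, Acoustic Grand Piano):
const setPiano = programChange(0, 0);
const start = noteOn(0, 60, 100);
const stop = noteOff(0, 60);
```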

2. The physical transports for MIDI

Though originally intended just for use with the MIDI DIN transport as a means to connect two keyboards, MIDI messages are now used inside computers and cell phones to generate music, and transported over any number of professional and consumer interfaces (USB, Bluetooth, FireWire, etc.) to a wide variety of MIDI-equipped devices.

There are many different Cables & Connectors that are used to transport MIDI data between devices. Look for specific information in the “Connect” section of Resources.

MIDI is not slow

The “MIDI DIN” transport causes some confusion because it has specific characteristics which some people associate with “MIDI” itself — forgetting that the MIDI-DIN characteristics go away when using MIDI over other transports (and inside a computer). With computers, a High Speed Serial, USB or FireWire connection is more common, and USB MIDI is significantly faster than 5-pin DIN. Each transport has its own performance characteristics that might make some difference in specific applications, but in general the transport is the least important part of MIDI, as long as it allows you to connect all the devices you want to use!

3. The file formats for MIDI files

The final part of MIDI is made up of the Standard MIDI Files (and variants), which are used to distribute music playable on MIDI players of both the hardware and software variety. All popular computer platforms can play MIDI files (*.mid) and there are thousands of web sites offering files for sale or even for free. Anyone can make a MIDI file using commercial (or free) software that is readily available, and many people do. Whether or not you like a specific MIDI file can depend on how well it was created, and how accurately your synthesizer plays the file… not all synthesizers are the same, and unless yours is similar to that of the file composer, what you hear may not be at all what he or she intended. Look in the “Create” section of Resources for information about how to create and use different MIDI file formats.
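As a small illustration that a Standard MIDI File is just structured performance data, the sketch below reads the 14-byte “MThd” header chunk found at the start of every SMF. The field layout follows the published SMF format, but the names and error handling are our own:

```typescript
// Minimal sketch of reading the 14-byte "MThd" header chunk at the start of a
// Standard MIDI File; field layout follows the published SMF format, but the
// names and error handling here are our own.

interface SmfHeader {
  format: number;      // 0 = single track, 1 = multi-track, 2 = multiple songs
  trackCount: number;  // number of track chunks that follow
  division: number;    // timing resolution, usually ticks per quarter note
}

function readSmfHeader(bytes: Uint8Array): SmfHeader {
  const tag = String.fromCharCode(bytes[0], bytes[1], bytes[2], bytes[3]);
  if (tag !== "MThd") {
    throw new Error("Not a Standard MIDI File (missing MThd chunk)");
  }
  const view = new DataView(bytes.buffer, bytes.byteOffset, bytes.byteLength);
  if (view.getUint32(4) < 6) {
    throw new Error("Unexpected MThd chunk length");
  }
  return {
    format: view.getUint16(8),       // all multi-byte fields are big-endian
    trackCount: view.getUint16(10),
    division: view.getUint16(12),
  };
}
```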

Even More MIDI

Many people today see MIDI as a way to accomplish something, rather than as a protocol, cable, or file format. For example, many musicians will say they “use MIDI”, “compose in MIDI” or “create MIDI parts”, which means they are sequencing MIDI events for playback via a synthesizer, rather than recording the audio that the synthesizer creates.

About MIDI, Part 2: MIDI Cables & Connectors


Many different “transports” can be used for MIDI messages. The speed of the transport determines how much MIDI data can be carried, and how quickly it will be received.

5-Pin MIDI DIN

Using a 5-pin “DIN” connector, the MIDI DIN transport was developed back in 1983, so it is slow compared to common high-speed digital transports available today, like USB, FireWire, and Ethernet. But MIDI DIN is still used on a lot of MIDI-equipped devices because it adequately handles the communication speed for a single device. Also, if you want to connect one MIDI device to another (without a computer), MIDI cables are usually needed.

USB and FireWire

Computers are most often equipped with USB and possibly FireWire connectors, and these are now the most common means of connecting MIDI devices to computers (using appropriate adapters). Adapters can be as simple as a short cable with USB or FireWire connectors on one end and MIDI DIN connectors on the other, or as complex as a 19 inch rack mountable processor with dozens of MIDI and Audio In and Out ports. The best part is that USB and FireWire are “plug-and-play” interfaces which means they generally configure themselves. In most cases, all you need to do is plug in your USB or FireWire MIDI interface and boot up some MIDI software and off you go.

With USB technology, devices must connect to a host (PC), so it is not possible to connect two USB MIDI devices to each other as it is with two MIDI DIN devices. (This could change sometime in the future with new versions of USB). USB-MIDI devices require a “driver” on the PC that knows how the device sends/receives MIDI messages over USB. Most devices follow a specification (“class”) that was defined by the USB-IF; Windows and Mac PCs already come with “class compliant” drivers for devices that follow the USB-IF MIDI specification. For more details, see the article on USB in the Connection area of Resources. 

Most FireWire MIDI devices also connect directly to a PC with a host device driver, and the host handles communication between FireWire MIDI devices even if they use different drivers. But FireWire also supports “peer-to-peer” connections, so MMA (along with the 1394TA) produced a specification for transport of MIDI over IEEE-1394 (FireWire), which is available for download on this site (and also part of the IEC-61883 international standard).

Ethernet & WiFi (LAN)

Many people have multiple MIDI instruments and one or more computers (or a desktop computer and a mobile device like an iPad), and would like to connect them all over a local area network (LAN). However, Ethernet and WiFi LANs do not always guarantee on-time delivery of MIDI messages, so MMA has been reluctant to endorse LANs as a recommended alternative to MIDI DIN, USB, and FireWire. That said, there are many LAN-based solutions for MIDI, the most popular being the RTP-MIDI specification which was developed at the IETF in cooperation with MMA Members and the MMA Technical Standards Board. In anticipation of increased use of LANs for audio/video in the future, MMA is also working on recommendations for transporting MIDI using new solutions like the IEEE-1722 Transport Protocol for Time-Sensitive Streams.

Bluetooth

Everything is becoming “mobile”, and music creation is no exception. There are hundreds of music-making software applications for tablets and smart phones, many of which are equipped with Bluetooth “LE” (aka “Smart”) wireless connections. Though Bluetooth technology is similar to WiFi in that it cannot always guarantee timely delivery of MIDI data, in some devices Bluetooth takes less battery power to operate than WiFi, and in most cases will be less likely to encounter interference from other devices (because Bluetooth is designed for short distance communication). In 2015 the MMA adopted a recommended practice (specification) for MIDI over Bluetooth LE.

Deprecated Solutions

Sound Cards

It used to be that connecting a MIDI device to a computer meant installing a “sound card” or “MIDI interface” in order to have a MIDI DIN connector on the computer. Because of space limitations, most such cards did not have actual 5-Pin DIN connectors on the card, but provided a special cable with 5-Pin DINs (In and Out) on one end (often connected to the “joystick port”). All such cards need “driver” software to make the MIDI connection work, but there are a few standards that companies follow, including “MPU-401” and “SoundBlaster”. Even with those standards, however, making MIDI work could be a major task. Over a number of years the components of the typical sound card and MIDI interface (including the joystick port) became standard on the motherboard of most PCs, but this did not make configuring them any easier.

Serial, Parallel, and Joystick Ports

Before USB and FireWire, personal computers were all generally equipped with serial, parallel, and (possibly) joystick ports, all of which have been used for connecting MIDI-equipped instruments (through special adapters). Though not always faster than MIDI-DIN, these connectors were already available on computers and that made them an economical alternative to add-on cards, with the added benefit that in general they already worked and did not need special configuration. The High Speed Serial Ports such as the “mini-DIN” ports available on early Macintosh computers support communication speeds roughly 20 times faster than MIDI-DIN, making it also possible for companies to develop and market “multiport” MIDI interfaces that allowed connecting multiple MIDI-DINs to one computer. In this manner it became possible to have the computer address many different MIDI-equipped devices at the same time. Recent multi-port MIDI interfaces use even faster USB or FireWire ports to connect to the computer.

Animusic: MIDI-Driven Computer Animation

Animusic produces innovative music animation by leveraging MIDI data in creating “virtual concerts”. The animation of graphical instrument elements is generated using proprietary software called MIDImotion™. The technique is analytical, note-based, and involves a pre-process (as opposed to being reactive, sound-based, and real-time).
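The sketch below is purely illustrative (it is not Animusic’s MIDImotion code) of what a note-based pre-process can mean in practice: scanning the note events ahead of time so that, for example, a virtual mallet can begin its wind-up before the note actually sounds rather than reacting to audio in real time. The event shapes and the wind-up constant are assumptions for the example:

```typescript
// Purely illustrative (not Animusic's MIDImotion code): turn note events into
// animation cues ahead of time, so a virtual mallet can start its wind-up
// before the note sounds instead of reacting to audio in real time.

interface NoteEvent { timeSec: number; note: number; velocity: number; }
interface StrikeCue { startSec: number; hitSec: number; note: number; strength: number; }

const WINDUP_SEC = 0.25; // assumed lead time for the wind-up motion

function buildStrikeCues(notes: NoteEvent[]): StrikeCue[] {
  return notes.map((n) => ({
    startSec: Math.max(0, n.timeSec - WINDUP_SEC), // begin moving early
    hitSec: n.timeSec,                             // make contact exactly on the note
    note: n.note,
    strength: n.velocity / 127,                    // scale the swing by velocity
  }));
}
```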

Feature: Interview with Wayne Lytle, from Animusic

Animusic, as the name implies, is visualized or animated music; the brainchild of one Wayne Lytle, Columbia Music major turned animator. The Animusic DVDs feature computer animations of preposterous-looking instruments – part fairground organ, part Disney’s Fantasia – that ‘play’ various pieces of music from techno to classical. The brains behind the technology is MIDI. MIDI allows the music and the animations to work, control, and be controlled in sync. Initially Lytle used off-the-shelf animation and music applications, but in recent years the business has grown sufficiently for the company to invest in their own purpose-designed software, Animotion.

How did this get started?

WL: I got into synths before MIDI, back in the seventies, with things like the MiniMoog and all that stuff. By the early 1980s I had been vaguely aware of MIDI and I had already started drawing sketches in notebooks about how I wanted to drive animation from music. Then a friend told me about this thing called MIDI, and how you can have synths and computers talking to each other, and suddenly my two worlds merged together. It’s been a happy place ever since.

It wasn’t really until 1989 that I first started experimenting with animation being driven from music, and analyzing the MIDI files that I would read in from whichever sequencer I was using.

Are you a keyboard player by nature?

WL: I started with piano and drums. I actually studied classical piano in college but I wasn’t really very good. I was more interested in playing with my bands. I’m not particularly accurate so I’m very grateful for MIDI and sequencing. The keyboard doesn’t get in my way. I ‘think’ the notes and then figure out a way to get them in there. The keyboard may or may not be involved. Thank goodness I don’t have to play correctly; I just have to think it.

How many people work on projects?

WL: We have three people on our core production staff with a couple of other freelancers who do background projects, like skies. We have one full-time software developer, and then Dave and I are the core that cook up the ideas, and model them and write the music. On the one hand it’s hard to be small, but it’s also hard to be big. We don’t have any major communication issues, but at the same time with so few people it’s easy to get burned out.

How long does a project take?

WL: Each of the first two DVDs took three years. We’re working towards being able to do one a year, which is why we developed the software, to make a more streamlined pipeline. We want to get to a point where we’re spending more time playing with the instrument than writing and testing the code.

Are you an animator who plays or a musician who animates?

WL: That’s the tough question I’ve asked myself! In fact worse: Am I a musician who programs or a programmer who’s trying to be a musician? I guess for a long time I wondered how I was going to get good at any of these things. I felt that I needed to just pick one and focus on that. But I didn’t, and at this time they do seem to have merged nicely together.

It’s certainly a lot of fun and a wonderful thing to be able to do what’s your passion and for it to be able to support a company. We feel very fortunate.

Is there any underlying motive or message behind what you’re doing?

WL: That’s a great question I don’t get asked much. You’ve really touched on something. When we started out it was just a personal passion; what I wanted to do and what I thought about all day long. It certainly started with my own personal interest. But from there when the first DVD was released, and we could see people’s reaction to it, then our motives and purpose began to broaden quite a bit. It did really seem to bring joy to people, to make them smile, make them happy, mesmerize them even. It does seem to affect some people very deeply – kids, and older people, right across the spectrum. That changed our motivation somewhat. There’s even special education. We have Special Needs teachers using our work for education, helping with everything from musical timing, to math, to social interaction.

None of that was expected. It was kind of a happy surprise. Now our motivation is to go and kindle those things. We don’t really have an agenda that we’re out to educate per se, but we do want to contribute positively to the electronic media and entertainment world that, at the moment, we see filled with a lot of poor quality, or violent, or plain disgusting content. We’re trying to show a positive side by producing something that’s different and cool, without being silly or corny.

Which comes first – the visuals or the music?

WL: Yes and yes. We’ve done it both ways. With certain animations I have the entire music written, sequenced and mixed before we’ve even thought about animating anything. Other times we have designed and built and tested the graphical instruments before we write any music for them.

In the most ideal sense – and certainly the approach we’re taking with Animusic 3 – it’s really something we try to do in tandem, where we’re working on building the instrument at the same time as we’re learning how it will play best. What is it capable of, will it play better fast, or is it better at slow, plodding riffs and basslines? Are there too many notes and do we need to take some out and have it be this 8-note bass machine, or can it handle having 40 different notes? Then, as that evolves, the musical palette evolves and perhaps even stuff we’re doing in the music influences the design of the instrument. Ideally it is a much more integrated process rather than one coming completely before the other.

Have you worked specifically with any of the DAW manufacturers?

WL: Not yet. We try to just focus on the content. Our product is the DVDs themselves rather than the tools, although it’s not out of the question that at some point we’d do that. We’ve actually gone away from using commercial sequencers. Not that they’re not great – they get greater and greater as time goes on – but a year or two ago I finally got to the point where I decided to write my own sequencer [MIDIMotion] that could integrate directly with the music animation stuff so that animating and sequencing become a more unified process; where they’re almost one and the same – where you could say you were sequencing the animation or animating the music.

Is MIDIMotion available for sale yet?

WL: It’s been discussed but we haven’t done that yet. It’s quite an undertaking to release a piece of software and support it well, so up to this point it’s just been an in-house set of software tools.

We have this large animation sequencing program called Animusic Studio and that uses the MIDIMotion engine to do the music animation part of it.

What do you see as the strengths of plug-ins and software sounds as opposed to hardware synths?

WL: I for one am very happy about where we are now. I don’t have a lot of complaints. Nowadays you can have synths that sound fantastic, and that offer total recall, and can be used in as many instances as you like. In the old days you couldn’t just run out and buy six MiniMoogs. Now I can have, say, multiple Reason sessions open and just switch between whichever I’m working on at the moment. And there everything is, configured. Honestly I don’t find myself reminiscing about the old days too often. I’m happy about the new days.

I do however have fond memories of seeing ELP at Madison Square Garden with mountains of gear and Emerson sticking knives into his Hammonds and stuff. That was fun too.

How does MIDI fit into the modern idiom of music making?

WL: The fact that it’s become as transparent as it has is in itself an indication of its incredible success. Yes it’s powerful, but you don’t necessarily have to understand it. It’s not like the old modular synth days where, if you couldn’t figure out what to hook up to what, you were pretty much sunk. Now, a lot of it is going on in the background, invisibly. Even in computer setups a lot of the MIDI may never make it into a physical cable at all, but it’s still using the MIDI protocol, and a lot of people are completely unaware that that’s what’s there.

I don’t really see much of a problem with that, though it would be nice if there was a little more evolution, because clearly there are some things that are missing. I don’t see a whole lot of activity in pushing stuff forward. One thing I personally wish for sometimes is the ability to glide from one note to another in some controllable way, sort of like a ribbon controller, where you’re not snapping back to the old note. Using pitch bend you can bend up a fifth, say, but then you have to let it snap back and pick up from where you were. That actually presents some problems graphically: bending, then having to go back, and then going to a new note. I have to tell the system to ignore certain data and pretend it was someone gliding up to another note, like on a fretless bass. I have to fight it.

Are the basics of MIDI Still useful for people to learn?

WL: Probably, but how far down do you want to get knowledgeable about your tools? You can get really good at using your tools without necessarily knowing too much ‘about’ them. A painter may not know all about the wood his paintbrush is made from or where the bristles come from. That won’t prevent him from doing a great job of painting. But if you want, you can keep digging down another layer.

Obviously for the people who build music technology tools it’s critical that they understand MIDI, and not consider it to be frozen, never to be enhanced or pushed forward. And it never hurts anyone to understand what’s under the hood.

How much are you prepared to reveal about the processes you use?

WL: Right now we don’t really want to give away the recipe to our secret sauce, but at a certain point I think it’ll be important to share what goes on behind the scenes. It’s important for us to figure out how to do that so it’s clear, and doesn’t come across as “wow, that’s really complicated. Those guys are really smart.” That’s a reaction we can get sometimes and that’s not really the point. It’s cool what can be done, but it’s not as complex as it looks if you explain it right.

We’d like to share this in such a way that people can be empowered by MIDI. Maybe we’ll do that at the same time as we make the software available for people to use. Right now, though, our focus is still very much on the next DVD.

What do you think can be done by the ‘average’ person?

WL: As far as out-of-the-box goes, not a lot, probably. It’s like when synthesizers were operated by guys in lab coats at universities, plugging cords in… Nowadays everything can be done on a laptop. I think that’s how it should develop; it should become that way with music and animation, where people are just dragging and dropping, making their own instruments and pumping music into them. They don’t necessarily need to know the technology behind it. That’s a target: to put it in people’s hands as a simple and enjoyable process, as opposed to crashing and dorking around every few minutes in order to make cool things happen.

-END-

MIDI Electronic Light Scape Device (“eLSD”)

“Manipulate the state of your mind” using Light Scape sequences. Enjoy the psychedelic color experience that the eLSD induces. It gently pulses light and color in front of your eyes, influencing your brain waves and the state of your mind. You can also create your own Light Scape sequences (the eLSD is a USB-MIDI device).

We have seen lots of MIDI devices, but this has to be one of the most unique. 

Here is what the inventor of the MIDI goggles says about the benefits.

Benefits for you:
Increase your creative potential – super charge the creative power of your mind – intensify your “awake” state – be in control of your creative energy – turn ON your creative mind when you need it … and be the best you can be !

Be in total control of the Light Scape experience – take your electronic Light Scape Device anywhere – be part of the new wave of mind expanding electronic devices – make your own personal Sound and Light Scape – share your Sound and Light Scapes with others on  www.eLSD.com.au   …  and let others participate on your creation.

Enjoy the psychedelic color experience the eLSD is inducing – train your mind for a lucid dreaming experience – slow down your brain waves before going to bed for a better sleep – use the electronic Light Scape Device as a relaxation and meditation aid or use it as a potential learning aid for accelerated learning of languages and more.

by www.eLSD.com.au

It sounds pretty wacky, but there is actually a good deal of hard science behind brainwave entrainment.


...

Edmonton Neurotherapy: Non-Invasive Brain Stimulation/Neuromodulation Therapies

Information on non-invasive brain stimulation/neuromodulation technologies employed at Edmonton Neurotherapy to treat psychoneurophysiological disorders. Includes information on audio-visual entrainment (AVE), cranial electrotherapy stimulation (CES), transcranial direct-current stimulation (tDCS), and bio-acoustical utilization device (BAUD) brain stimulation technologies.

MIDI’s Not Scary — Unless it’s Halloween!

DC Cemetery will be updated for 2017!

For years now Brent Ross, creator and designer of the DC Cemetery, has been creating some of the best Halloween haunts using all kinds of technology, including MIDI. Here is an article from 2008, updated with the 2014 video and information on where you can see the 2017 version of the DC Cemetery.

It’s the week before Halloween, and a line stretches down around a corner. Muffled sounds of screams drown out the nervous laughter of the people in line as it creeps forward. As you round the corner and head towards the house, the light in the trees casts an eerie shadow on the ground. You find it hard to contain your own inner child’s excitement as you near the house. You can feel your pulse quicken and spine tingle as you see a mysterious fog floating low on the ground. You peer around the corner of the display only to witness a ghastly sight: a putrefied skeleton rises from his coffin as nearby a ghost floats above.

As you take in the scene, suddenly a face rises from a vat of bubbling green goo… You walk on a bit and stifle a scream as you see a man struggling to escape from a crypt, held back by its skeletal occupants. Suddenly, the music swells and a 14-foot-tall grim reaper stands up, waving his arms, and starts speaking… to you. You subconsciously take a step backwards, because it all seems terrifyingly real.

Are we at the latest haunted house at Disneyland? Or perhaps one of those terrifying dark rides at Universal Studios? No, we’re in the Silicon Valley suburban enclave of Mountain View, California, home of Brent Ross, who works as a financial advisor for most of the year. But come Halloween, he is the architect of an astonishingly realistic, fully animated display that creates massive lines snaking around the block and delights the thousands who attend. An Industrial Design major by training, Brent brings his considerable creativity to bear on his amazing Halloween displays, designing and building these spectacular haunts for the terror and enjoyment of the community.

Brent grew up, like many kids, excited about the notion of a good scare. He got his start making little scenes using a black light and a bowl of dry-ice fog on his porch, but soon it evolved into a black plastic maze, then a wood-framed cavern, and finally, the steel and wood monstrosity that covers the entire front yard of his parents’ house.

Although he delights in the reaction of the people who come to see his display, the real motivation for him is the challenge of creating an original display that is better and better each year. It’s a ton of work, and a huge strain on his family life, but the challenge of adding new aspects to the attraction – typically 1 or 2 new props and as many as 5 or 6 changes to the display – is simply too exciting to resist.

How impressive are his displays? Amazing enough to pocket the $50,000 first prize in a nation-wide Halloween display competition, that’s how. But even more amazing is the incredible way he goes about controlling this amazingly ambitious show… entirely from a single computer… using MIDI.

Most people think of MIDI as a music technology. And rightly so, since the “MI” in MIDI does stand for Musical Instrument, as in Musical Instrument Digital Interface. However, many people don’t realize that there are several other “flavors” and uses of MIDI that allow you to do all sorts of things, including controlling an entire show like this one. In the case of Brent’s Halloween extravaganza, MIDI was something he stumbled upon one year after trying to synchronize a number of pneumatic actuators to control the motion of one of his Halloween props. A neighbor, Dave Fredrichs, stopped by while he was setting up and said “You should use MIDI for that”. His initial response was “What’s MIDI?” but once he experimented a bit with it, he realized that there was simply no better way to accomplish his ghoulish goals: “Dave opened my eyes to the possibility of using MIDI, using real-time recording of musical notes and converting them into electrical signals that could then be used to turn pneumatic valves on and off. Before that day, I was hard-coding microcontrollers, something that was extremely difficult to do with multiple elements (props) running simultaneously, and then trying to sync audio to the mouth movements.”

The alternatives for Brent are far scarier than MIDI: to accomplish much the same functionality, he would have had to use dedicated industrial programmable controllers, which control the pneumatic solenoids using a somewhat cryptic (no pun intended) step-by-step programming protocol, entering the duration and delays of every event manually using unsynchronized timers to control the sequences, and then being left with the daunting task of locking all of this motion to the audio soundtrack that went along with the show.

The notion of using MIDI to control physical devices is actually quite well embraced in some markets. The MIDI Show Control protocol supports the automation of lighting, stage effects like fog machines, and other elements of stagecraft. In Japan, the MIDI protocol has even been adapted to control servo motors in robots. And it all makes sense, since MIDI is really just a protocol that describes performance events.

MIDI describes these performance events by breaking down an action into separate parts: when a key is pressed on a keyboard, a MIDI message is generated that says which key was pressed (the MIDI note number, in a Note On event) and how quickly it was struck (the MIDI velocity), and in some cases how hard the key is held down afterwards (a MIDI aftertouch message). Some time later when you release the key, a MIDI Note Off message is generated, and any clever piece of software can easily determine the interval between the time the key was pressed and the time it was released. Link a bunch of events together and you have a song… or a scare!
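To make that breakdown concrete, here is a minimal TypeScript sketch (not taken from any particular product) showing how a Note On / Note Off pair is encoded as raw MIDI bytes; the function names are illustrative only.

```typescript
// Encode a Note On / Note Off pair as raw MIDI bytes.
// channel is 0-15; note and velocity are 0-127.
function noteOn(channel: number, note: number, velocity: number): number[] {
  return [0x90 | (channel & 0x0f), note & 0x7f, velocity & 0x7f];
}

function noteOff(channel: number, note: number, velocity = 0): number[] {
  return [0x80 | (channel & 0x0f), note & 0x7f, velocity & 0x7f];
}

// Pressing and releasing middle C (note 60) on channel 1 (index 0):
const press = noteOn(0, 60, 100);   // [0x90, 60, 100]
const release = noteOff(0, 60);     // [0x80, 60, 0]
```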

This “performance description language” is what makes MIDI so incredibly popular amongst musicians – you’ve recorded all the “mechanical” aspects of the performance, allowing you to go back later and manipulate them, fixing notes, durations and velocities, or modifying the performance by adding real-time controllers, much like going through a rough draft and making edits with your word processor.

In the context of Halloween, MIDI is the perfect tool for controlling a performance of a different kind, allowing you to drive pneumatic solenoids with MIDI messages and to edit the “performance” of each movement until you have the most natural and realistic motion possible. In Brent’s case, he uses MIDI Note On messages to close a pneumatic solenoid valve, which then moves a cylinder by filling it with air from a compressor. When a Note Off message is received, the solenoid opens and the air is exhausted, causing the cylinder to retract. By the same token, MIDI messages can be used to turn smoke machines or strobe lights on and off… the possibilities really are endless, and entirely practical.
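Brent’s own software isn’t documented here, but conceptually a MIDI-to-switch rig simply maps each note number to one relay output. A hypothetical sketch, with the relay driver and note map invented purely for illustration:

```typescript
// Hypothetical: route incoming Note On/Off messages to relay outputs.
// setRelay and noteToOutput are assumptions, not any real product's API.
type RelayDriver = (output: number, on: boolean) => void;

function handleMidiMessage(
  bytes: number[],
  noteToOutput: Map<number, number>,
  setRelay: RelayDriver
): void {
  const status = bytes[0] & 0xf0;
  const note = bytes[1];
  const velocity = bytes[2];
  const output = noteToOutput.get(note);
  if (output === undefined) return;            // this note isn't mapped to a prop
  if (status === 0x90 && velocity > 0) {
    setRelay(output, true);                    // Note On: energize solenoid, cylinder moves
  } else if (status === 0x80 || (status === 0x90 && velocity === 0)) {
    setRelay(output, false);                   // Note Off (or velocity 0): release, cylinder retracts
  }
}
```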

There is another aspect that makes MIDI attractive beyond the paradigm of music performance, and that is the fact that MIDI technology and tools are well understood, easy to use, and remarkably affordable. In fact, it was the affordability of the solution that intrigued Brent once he understood what it could do for him: “The inspiration to convert to a MIDI-based control system came when I had a large Grim Reaper prop sitting in the garage for 2 years. I was saving up to buy a 50-output sequential microcontroller that costs over $2000 – and it didn’t have real-time programming! Dave was inspired by the mechanical aspects of the prop and asked if he could “tinker” with the prop to get it running. He came back a week later with a cart loaded with a PC, MIDI keyboard, and a MIDI-to-parallel converter board mounted to a piece of plywood. Within an hour or so of wiring and playing with note assignments, the Reaper came to life and I was sold.”

The benefit of MIDI to Brent is that it enables him to use affordable “off-the-shelf” tools to orchestrate all this magic in his award-winning shows. For example, the “central command” of his show is a garden-variety PC running Steinberg’s Cubase MIDI sequencing software.

Cubase, or any MIDI sequencer for that matter, is designed to record, play, and edit a MIDI performance with extensive editing capabilities. Modern sequencers, often called workstations, also handle recorded audio, turning the software into a comprehensive system for all aspects of a musical performance. Since MIDI messages can be used to control the movements of characters just as easily as to describe a musical performance, Brent is able to use the audio side of Cubase to manage the sound effects and soundtrack of the display, while using the MIDI side to control all the movements.

To control the hundreds of pneumatic cylinders throughout his display that make his creations move, Brent uses readily available “MIDI to Switch” technology, which lets you send MIDI messages to a relay board that closes switches in response to particular “Note On” or “Note Off” events. Although Brent uses control boards from a company called SD, other solutions are available from companies such as Doepfer of Germany and MIDI Solutions in Canada.

“I recommend the SD MIDI converter boards to the point where I now distribute their product for them on www.dcprops.com. I find the SD products are the most cost-efficient and expandable of the solutions out there, and their tech support for these types of applications is far superior to others. Add to that the fact they are designed and manufactured here in California, so shipping is a lot faster than the overseas alternatives. SD has also come up with a dimmer pack that expands the output so you can use it for dimming 110V lighting!”

Besides enabling him to use off-the-shelf technology to control his shows, the simplicity of MIDI technology means his entire show is controlled from one rack of gear, allowing him to easily move his “control central” from place to place and store it in the off-season.

Brent’s rig consists of an enclosed, wheeled server rack with a series of 1U rack-mount audio amplifiers (the blue boxes in the picture), multiple surge suppressors, and a rack-mount PC running Windows with two M-Audio Delta 1010 cards and one Delta 410 audio card, providing 24 mono audio outputs. The cards also provide the MIDI I/O to connect to the MIDI-to-switch converters that control the pneumatics via 24V solenoids.

A slide-mount drawer houses the MIDI-to-switch converter boards, which provide a total of 256 24V DC outputs to drive the pneumatic solenoids, all powered by a 20-amp 24V DC power supply and a battery backup. That backup was particularly important in the old days, when the display was blowing breakers every 20 minutes from overloading the electrical circuits with too much lighting and too many air compressors!

Even though Brent doesn’t use MIDI for making music, it certainly empowers his creativity while providing a reliable, cost-effective solution that delights the community year after year: “In my opinion there are many advantages to using MIDI, starting with giving you a centralized control hub and letting you easily perform a looped show. MIDI provides cost-effective expansion capabilities, and the excellent real-time control provided by sequencers like Cubase, combined with an easy-to-use interface, makes programming the show very easy. Add to that the fact I have control over all aspects of automation and the soundtrack, with support for multiple channels of audio, and you can see why MIDI is a great solution for me.”

There are challenges, though. “Since MIDI is designed around a musical keyboard, using regular switches to trigger MIDI events is difficult,” comments Brent, although he is working with SD Designs to create a “Switch to MIDI” converter to use in the show. Also, since the workstation sequencer runs on a computer, there are the standard issues that come with using a general-purpose computer instead of single-purpose industrial controllers: no computer is ever as stable or reliable as a dedicated hardware device, but in this case the versatility of MIDI plus audio makes up for it.

MIDI may be a little scary to some people, but with a little effort you can easily learn to exploit all the power and convenience this technology has to offer. However, for some people, MIDI will always be scary, especially when it is enjoyed in the form of a remarkable Halloween display in Northern California. Cue the lightning sounds… and be aware of what lurks in the vat of bubbling ooze….

RTP-MIDI or MIDI over Networks

RTP-MIDI (IETF RFC 6295) is a specification for sending and receiving standard MIDI 1.0 messages using standard networking protocols (the “Real-time Transport Protocol” and the “Internet Protocol”). RTP-MIDI includes a data recovery mechanism (MIDI event journaling) to address packet loss that can occur on networks, eliminating the need for packet retransmission (which would increase latency and reduce throughput). A thorough description may be found in the RFC itself.

As an IETF standard, RTP-MIDI is not proprietary technology exclusive to any specific company, and is intended for use by anyone without obtaining a license or paying any royalties.

How do I use RTP-MIDI?

RTP-MIDI allows a standard network (wired or wireless) connection to be used for carrying MIDI data.

On Mac OS X and iOS, Apple’s CoreMIDI technology automatically handles the transfer of MIDI data between software apps (sequencers, softsynths, etc.) and external MIDI devices connected via USB (USB-MIDI), FireWire (1394-MIDI), and networks (RTP-MIDI).

RTP-MIDI support for the Windows (desktop) MIDI API is available from Tobias Erichsen, and his implementation is compatible with Apple’s.

Windows RT (the OS used on Windows tablets) does not yet have a MIDI API (it is still in development as of December 2013), but code is available for developing apps that support RTP-MIDI.

RTP-MIDI can also easily be implemented in hardware such as keyboards, mixers, break-out boxes, etc., since the required memory footprint, processor usage, and hardware requirements are compatible with modern low-cost microcontrollers.

Is RTP-MIDI a replacement for MIDI 1.0 protocol?

RTP-MIDI uses, and does not replace, the MIDI protocol. RTP-MIDI allows the transport of MIDI 1.0 messages using standard networking hardware (as well as over the Internet), and is an alternative to MIDI DIN, USB-MIDI, and other transports, which likewise use but do not replace the MIDI protocol.

Compared to other transports for MIDI 1.0 messages, RTP-MIDI is especially useful for applications where MIDI devices are separated by long distances (for example in a studio, at a live event, or in a theme park) and anywhere that modern networking technologies are typically used.


...

RTP-MIDI – Wikipedia

RTP-MIDI is a protocol to transport MIDI messages within RTP (Real-time Protocol) packets over Ethernet and WiFi networks.


...

MIDI over Ethernet – Marvellous RTP-MIDI — iConnectivity

iConnectivity Ethernet-equipped MIDI interfaces use a system called RTP-MIDI (Real Time Protocol MIDI). This is a method of running MIDI over standard networks, like Ethernet or even Wi-fi! Yes, that’s right, you can even run MIDI over Wi-Fi!

rtpMIDI | Tobias Erichsen

The rtpMIDI-driver is a virtual MIDI-driver which allows DAW-applications to communicate via network with other computers.

Tutorial: MIDI and Music Synthesis

An explanation of music synthesis technology and how MIDI is used to generate and control sounds.

This document was originally published in 1995, at a time when MIDI had been used in electronic musical instruments for more than a decade but was still a relatively new technology in the computer industry. All these years later, the PC industry has changed, and some of the explanations herein are now dated (such as references to “internal sound cards”, which have largely been replaced by external audio interfaces). The explanations of MIDI technology and synthesis methods are still accurate.



Download: audio_midi (3.2 MB)

About Web MIDI

The Web MIDI API connects your MIDI gear directly to your browser. 
Your browser connects you to the rest of the world.

MIDI hardware support has been available for a long time in Windows, Mac OS, iOS and most computer/tablet/smart phone platforms through USB, WiFi and even Bluetooth interfaces. But until now, there has been no standard mechanism to use MIDI devices with a Web browser or browser-based Operating System.

The Web Audio Working Group of the W3C has designed the Web MIDI API to provide support for MIDI devices as a standard feature in Web browsers and operating systems across multiple hardware platforms.

Google has led the way in supporting the inclusion of MIDI in the Web platform, both by contributing to the specification and by shipping the first implementation of the Web MIDI API (in Chrome v.43 for Windows, OS X, and Linux), continuing to demonstrate the company’s interest in helping musicians interact with music more easily using the Web.

Being able to connect to local MIDI hardware will increase the creation and distribution of music-making applications for PCs, tablets and smart phones. It also means that popular MIDI hardware can be used to control any kind of software in the browser (using physical buttons and knobs instead of on-screen sliders, for example).

For hardware device makers, instrument control panels and editor/librarians which previously needed to be produced in multiple versions can now be implemented once in HTML5, and consumers can run them on any Web device (tablet, computer, or smart phone) and even “live” over the Web.

And finally, since the browser is connected to the Internet, musicians can more easily share data and even connect music devices over a network. 
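As a rough illustration of how little code is involved, here is a minimal sketch using the Web MIDI API as shipped in supporting browsers such as Chrome; the function name and the 500 ms note length are arbitrary choices, not part of the API.

```typescript
// Minimal Web MIDI sketch: list connected devices, then play one note
// on the first available output (only works in browsers that support Web MIDI).
async function playTestNote(): Promise<void> {
  const access = await navigator.requestMIDIAccess();      // may prompt for permission
  access.inputs.forEach(input => console.log("IN: ", input.name));
  access.outputs.forEach(output => console.log("OUT:", output.name));

  const output = access.outputs.values().next().value;
  if (!output) return;                                      // no MIDI output connected
  output.send([0x90, 60, 100]);                             // Note On: middle C, channel 1
  output.send([0x80, 60, 0], performance.now() + 500);      // Note Off scheduled 500 ms later
}
```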

Where will Web MIDI take us?

Web MIDI has the potential to be one of the most disruptive music technologies in a long time, maybe as disruptive as MIDI itself was back in 1983. Did Dave Smith and Ikutaro Kakehashi envision a world with 2.6 billion MIDI-enabled smartphones? Definitely not!

Here is an interesting video of someone using a commercially available MIDI controller to play a browser-based game. It just makes you think about all the possibilities in the future.

Here are some links to more Web MIDI resources.


...

BandLab: Music Starts Here

Develop your songs from inspiration to finished projects with our best-in-class Mix Editor. Pull from thousands of available beats and loops in our extensive library or connect an interface and record live audio.


...

Soundation — Make music online

Make music in your browser and collaborate with anyone on Soundation, a one-stop shop for audio samples, instruments, and effects.


...

Soundtrap Press – Images & Videos

Soundtrap is the first cloud-based audio recording platform to work across all operating systems, enabling users to co-create music anywhere in the world. The company is headquartered in Stockholm, Sweden. Soundtrap provides an easy-to-use music and audio creation platform for all levels of musical interest and ability and is used in the K-12 through higher-education markets. In December 2017, Soundtrap was acquired by Spotify. For more information, visit http://www.soundtrap.com

Soundmondo

Social Sound Sharing with Web MIDI for Yamaha Synths


...

Noteflight – Online Music Notation Software

Noteflight® is an online music writing application that lets you create, view, print and hear music notation with professional quality, right in your web browser.


...

Online piano lessons – Learn piano your way

Learn to play piano with online, interactive lessons and tutorials. Our in-depth courses will adapt and give you feedback. Play your first melody in minutes.


...

Play Drums Online – online rhythm game

Play drums online is an online rhythm game where you can learn to play along with the best songs. Set a new high score or practice your drum skills with your favorite artists and songs.


...

WebSynths : the browser-based microtonal midi instrument

websynths.com is a FREE, browser-based musical instrument, optimized for microtonal experimentation on multi-touch devices.

Here are some Web MIDI links for developers


...

Web MIDI API


Ryoya Kawai’s web music developers appspot site, with information in Japanese, English and Chinese.


...

Web MIDI: Music and Show Control in the Browser – TangibleJS

Chrome 43 officially introduces an amazing new feature: MIDI in the browser! For fans of physical computing, this is big news. Take advantage of it!


...

Keith McMillen combines Leap Motion and Web MIDI –

Keith McMillen Instruments shared this short demo of gestural mixing, using their K-Mix programmable mixer, a Leap Motion controller and Web MIDI. This article has links to all the great Web MIDI articles on the KMI site. 

2015 China Web Audio/MIDI Application & Innovation Contest

August 2015 — This contest is a joint partnership between CMIA, MMA, Google, and AMEI to promote web-based music and audio application development and innovation. By leveraging the latest Web Audio and Web MIDI APIs, participants can develop new products and services and create more business opportunities for the music industry.

1) The dates and arrangement of the contest:

  • From now to Sept. 18th: Learning and application development trial runs – we have put together some self-learning guides and code labs (see below);
  • Sept. 19th: Web audio and web MIDI application development hackathon;
  • Oct. 5th: Contest submission and judging;
  • Oct. 16th: Top winners of the app contest present their apps and demo them at the 2015 China Music Expo. Winners will receive the contest winning certificate and trophy, and a new MIDI device as the award.

2) Contest rules and selection criteria:

  • All applications must use the Web Audio and/or Web MIDI APIs;
  • Applications that leverage MIDI hardware are highly desirable;
  • Application areas can be anything in music (including installation art), education, games, entertainment, business applications, etc.;
  • Web-based applications must run inside Chrome;
  • No 3rd-party closed or proprietary technologies should be used.

3) Contest participation registration:

If you would like to participate in the contest, please enter your information below to register:

https://www.gdgdocs.org/a/google.com/forms/d/1AeWUwpmxVZKBlFiYc041_kx8EXLnQcOvW_QZw6dQ39M/viewform

More information about the contest and Hackathon will be sent to all who registered.

4) Learning resources for web audio and web MIDI application development:

1. As a collaboration between Yamaha and Google, there are two code labs for developers to use and learn from:

  • Web Audio code lab (in Chinese): https://codelab.b0.upaiyun.com/webaudio/index.html?zh-cn (the original English code lab: https://webmusicdevelopers.appspot.com/codelabs/webaudio/index.html?en-us)

  • Web MIDI code lab (in Chinese): https://codelab.b0.upaiyun.com/x-webmidi/index.html?zh-cn (the original English code lab: https://webmusicdevelopers.appspot.com/codelabs/x-webmidi/index.html?en-us)

  • Application code for the code labs on GitHub: https://github.com/hanguokai/codelabs (the original English version: https://webmusicdevelopers.appspot.com/codelabs/x-webmidi/index.html?en-us)

  • Various articles, demo code samples, and references from the Google Developer Relations team

2. Chris Wilson’s Web Audio API usage samples: http://webaudiodemos.appspot.com/

3. Eric Bidelman’s web audio article on HTML5Rocks (in Chinese): http://www.html5rocks.com/zh/tutorials/getusermedia/intro

4. Bill Luan’s web audio intro presentation deck from the 2014 China Music Expo: https://drive.google.com/file/d/0B7cCqqjbylzvSzhEUjdnZXNCVjQ/view?usp=sharing (at the end of the deck there are more demos and reference links)

5. Pages about MIDI on this website: products, tutorials.

5) Some selections of MIDI hardware for developers to consider:

Thanks!

Google Web Audio/MIDI Hackathon at Music China 2015

A special Web Audio/MIDI Hackathon was organized for web developers by Google at the Google Shanghai office. More than 70 developers from across China – from as far away as Beijing and Zhangjiakou in the north, and even one developer from the Netherlands – came to the event, which was led by Ryoya Kawai and Encai Liu of Yamaha, who trained the developers on the Web Audio and Web MIDI APIs. Bill Luan from Google’s Developer Relations team provided developers with MIDI devices to use (courtesy of Google, Yamaha, Korg, and CME). At the Hackathon, Mr. Kawai and Mr. Liu answered many questions from developers, and some attendees demonstrated existing applications – from music-making to art installations, all using MIDI and audio technologies – to inspire others. Attendees were eager to get their applications completed for the contest, which will be judged by music and web industry experts from Yamaha, MMA, and Google.

About MIDI-Part 4:MIDI Files

Standard MIDI Files (“SMF” or *.mid files)

 

Standard MIDI Files (“SMF” or *.mid files) are a popular source of music on the web, and for musicians performing in clubs who need a little extra accompaniment. The files contain all the MIDI instructions for notes, volumes, sounds, and even effects. The files are loaded into some form of ‘player’ (software or hardware), and the final sound is then produced by a sound-engine that is connected to or that forms part of the player.

One reason for the popularity of MIDI files is that, unlike digital audio files (.wav, .aiff, etc.) or even compact discs or cassettes, a MIDI file does not need to capture and store actual sounds. Instead, a MIDI file is just a list of events which describe the specific steps that a soundcard or other playback device must take to generate certain sounds. As a result, MIDI files are very much smaller than digital audio files, and the events are also editable, allowing the music to be rearranged, edited, even composed interactively, if desired.

All popular computer platforms can play MIDI files (*.mid) and there are thousands of web sites offering files for sale or even for free. Anyone can make a MIDI file using commercial (or free) software that is readily available, and many people do, with a wide variety of results.

Whether or not you like a specific MIDI file can depend on how well it was created, and how accurately your synthesizer plays the file… not all synthesizers are the same, and unless yours is similar to that of the file composer, what you hear may not be at all what he or she intended. General MIDI (GM) and GM2 both help address the issue of predictable playback from MIDI Files.

Formats 

The Standard MIDI File format is different from native MIDI protocol, because the events are time-stamped for playback in the proper sequence.

Standard MIDI Files come in two basic varieties: a Type 1 file, and a Type 0 file (a Type 2 was also specified originally but never really caught on, so we won’t spend any time discussing it here). In a Type 1 file individual parts are saved on different tracks within the sequence. In a Type 0 file everything is merged into a single track.  
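For the curious, the file type and track count live in the 14-byte “MThd” header chunk at the very start of every SMF. A minimal sketch of building that header (field layout per the SMF specification; the function name is illustrative):

```typescript
// Build the 14-byte Standard MIDI File header chunk ("MThd").
// format: 0 = single merged track, 1 = multiple parallel tracks.
function smfHeader(format: 0 | 1, trackCount: number, ticksPerQuarterNote: number): Uint8Array {
  const bytes = new Uint8Array(14);
  const view = new DataView(bytes.buffer);        // DataView defaults to big-endian, as SMF requires
  bytes.set([0x4d, 0x54, 0x68, 0x64]);            // ASCII "MThd"
  view.setUint32(4, 6);                           // header data length is always 6 bytes
  view.setUint16(8, format);                      // file type (0 or 1)
  view.setUint16(10, trackCount);                 // number of track chunks that follow
  view.setUint16(12, ticksPerQuarterNote);        // timing resolution (the "division" field)
  return bytes;
}

const header = smfHeader(1, 4, 480);              // Type 1 file, 4 tracks, 480 ticks per quarter note
```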

Making SMFs

Musical performances are not usually created as SMFs; rather, a composition is recorded using a sequencer such as Digital Performer, Cubase or Sonar that saves MIDI data in its own format. However, most if not all sequencers offer a ‘Save As’ or ‘Export’ to Standard MIDI File.

Compositions in SMF format can be created and played back using most DAW software (Cubase, Logic, Sonar, Performer, FL Studio, Ableton Live, GarageBand (Type 1 SMF), and other MIDI software applications). Many hardware products (digital pianos, synths and workstations) can also create and play back SMF files. Check the manual of the MIDI products you own to find out about their SMF capabilities.

Setup Data

An SMF not only contains regular MIDI performance data – channelized notes, lengths, pitch bend data, etc. – it also should have data (commonly referred to as a ‘header’) that contains additional set-up data (tempo, instrument selections per Channel, controller settings, etc.) as well as song information (copyright notices, composer, etc.).

How good – or how true to its originally created state – an SMF will sound can depend a lot on the header information. The header can exert control over the mix, effects, and even sound editing parameters in order to minimize inherent differences between one sound set and another. There is no standard set of data that you have to put in a header (indeed, such data can also be placed in a spare ‘set-up’ bar in the body of the file itself), but generally speaking the more information you provide for the receiving sound device, the more defined – and so, presumably, the more to your taste – the results will be.

Depending upon the application you are using to create the file in the first place, header information may automatically be saved from within parameters set in the application, or may need to be manually placed in a ‘set-up’ bar before the music data commences.

Information that should be considered (per MIDI Channel) includes:

  • Bank Select (0=GM) / Program Change #
  • Reset All Controllers (not all devices may recognize this command so you may prefer to zero out or reset individual controllers)
  • Initial Volume (CC7) (standard level = 100)
  • Expression (CC11) (initial level set to 127)
  • Hold pedal (0 = off)
  • Pan (Center = 64)
  • Modulation (0)
  • Pitch bend range
  • Reverb (0 = off)
  • Chorus level (0 = off)

All files should also begin with a GM/GS/XG Reset message (if appropriate) and any other System Exclusive data that might be necessary to setup the target synthesizer. If RPNs or more detailed controller messages are being employed in the file these should also be reset or normalized in the header.

If you are inputting header data yourself it is advisable not to clump all such information together, but rather to space it out at intervals of 5-10 ticks; a minimal sketch of that layout follows below. Certainly if a file is designed to be looped, having too much data play simultaneously will cause most playback devices to ‘choke’ and throw off your timing.
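Here is that sketch: a hypothetical helper that lays out a typical per-channel setup bar as (tick, message) pairs spaced a few ticks apart. The specific values and ordering are choices made for illustration, not requirements of the specification.

```typescript
// Hypothetical setup "bar": a list of timed MIDI messages for one channel,
// spaced a few ticks apart so slow playback devices are not flooded.
// Controller numbers follow the MIDI spec (7 = Volume, 11 = Expression,
// 10 = Pan, 1 = Modulation, 64 = Hold, 91 = Reverb send, 93 = Chorus send).
interface TimedMessage { tick: number; bytes: number[]; }

function setupBar(channel: number, program: number, spacing = 5): TimedMessage[] {
  const cc = (controller: number, value: number) =>
    [0xb0 | (channel & 0x0f), controller, value];
  const messages: number[][] = [
    cc(0, 0), cc(32, 0),              // Bank Select MSB/LSB (0 = GM bank)
    [0xc0 | (channel & 0x0f), program], // Program Change
    cc(121, 0),                       // Reset All Controllers
    cc(7, 100),                       // Initial Volume
    cc(11, 127),                      // Expression
    cc(64, 0),                        // Hold pedal off
    cc(10, 64),                       // Pan centered
    cc(1, 0),                         // Modulation off
    cc(91, 0),                        // Reverb send off
    cc(93, 0),                        // Chorus send off
  ];
  return messages.map((bytes, i) => ({ tick: i * spacing, bytes }));
}
```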

Download the MIDI 1.0 specification that includes Standard MIDI File details 


...

The Complete MIDI 1.0 Detailed Specification

THE MIDI ASSOCIATION, a global community of people who work, play and create with MIDI and the central repository of information about anything related to MIDI.

About MIDI-Part 3:MIDI Messages

Part 3: MIDI Messages

The MIDI Message specification (or “MIDI Protocol”) is probably the most important part of MIDI.

MIDI is a music description language in digital (binary) form. It was designed for use with keyboard-based musical instruments, so the message structure is oriented to performance events, such as picking a note and then striking it, or setting typical parameters available on electronic keyboards. For example, to sound a note in MIDI you send a “Note On” message, and then assign that note a “velocity”, which determines how loud it plays relative to other notes. You can also adjust the overall loudness of all the notes with a “Channel Volume” message. Other MIDI messages include selecting which instrument sounds to use, stereo panning, and more.

The first specification (1983) did not define every possible “word” that can be spoken in MIDI, nor did it define every musical instruction that might be desired in an electronic performance. So over the past 20 or more years, companies have enhanced the original MIDI specification by defining additional performance control messages and creating companion specifications, which include:

  • MIDI Machine Control
  • MIDI Show Control
  • MIDI Time Code
  • General MIDI
  • Downloadable Sounds
  • Scalable Polyphony MIDI

Alternate Applications 

MIDI Machine Control and MIDI Show Control are interesting extensions because instead of addressing musical instruments they address studio recording equipment (tape decks etc) and theatrical control (lights, smoke machines, etc.).

MIDI is also being used for control of devices where standard messages have not been defined by MMA, such as with audio mixing console automation.

Different Kinds of MIDI Messages

A MIDI message is made up of an eight-bit status byte which is generally followed by one or two data bytes. There are a number of different types of MIDI messages. At the highest level, MIDI messages are classified as being either Channel Messages or System Messages. Channel messages are those which apply to a specific Channel, and the Channel number is included in the status byte for these messages. System messages are not Channel specific, and no Channel number is indicated in their status bytes.

Channel Messages may be further classified as being either Channel Voice Messages, or Mode Messages. Channel Voice Messages carry musical performance data, and these messages comprise most of the traffic in a typical MIDI data stream. Channel Mode messages affect the way a receiving instrument will respond to the Channel Voice messages.
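As a small illustration (not from the specification itself), classifying an incoming status byte is just a matter of looking at its high nibble; for Channel messages, the low nibble carries the Channel number:

```typescript
// Classify a MIDI status byte. Channel messages carry the channel in their
// low nibble; System messages (0xF0-0xFF) carry no channel number.
function describeStatus(status: number): string {
  if (status < 0x80) return "data byte, not a status byte";
  if (status >= 0xf0) return "System message";
  const channel = (status & 0x0f) + 1;          // channels are numbered 1-16
  const kinds: Record<number, string> = {
    0x80: "Note Off", 0x90: "Note On", 0xa0: "Polyphonic Key Pressure",
    0xb0: "Control Change", 0xc0: "Program Change",
    0xd0: "Channel Pressure", 0xe0: "Pitch Bend Change",
  };
  return `${kinds[status & 0xf0]} on channel ${channel}`;
}

describeStatus(0x93);   // "Note On on channel 4"
```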

Channel Voice Messages

Channel Voice Messages are used to send musical performance information. The messages in this category are the Note On, Note Off, Polyphonic Key Pressure, Channel Pressure, Pitch Bend Change, Program Change, and the Control Change messages.

Note On / Note Off / Velocity

In MIDI systems, the activation of a particular note and the release of the same note are considered as two separate events. When a key is pressed on a MIDI keyboard instrument or MIDI keyboard controller, the keyboard sends a Note On message on the MIDI OUT port. The keyboard may be set to transmit on any one of the sixteen logical MIDI channels, and the status byte for the Note On message will indicate the selected Channel number. The Note On status byte is followed by two data bytes, which specify key number (indicating which key was pressed) and velocity (how hard the key was pressed).

The key number is used in the receiving synthesizer to select which note should be played, and the velocity is normally used to control the amplitude of the note. When the key is released, the keyboard instrument or controller will send a Note Off message. The Note Off message also includes data bytes for the key number and for the velocity with which the key was released. The Note Off velocity information is normally ignored.

Aftertouch

Some MIDI keyboard instruments have the ability to sense the amount of pressure which is being applied to the keys while they are depressed. This pressure information, commonly called “aftertouch”, may be used to control some aspects of the sound produced by the synthesizer (vibrato, for example). If the keyboard has a pressure sensor for each key, then the resulting “polyphonic aftertouch” information would be sent in the form of Polyphonic Key Pressure messages. These messages include separate data bytes for key number and pressure amount. It is currently more common for keyboard instruments to sense only a single pressure level for the entire keyboard. This “Channel aftertouch” information is sent using the Channel Pressure message, which needs only one data byte to specify the pressure value.

Pitch Bend

The Pitch Bend Change message is normally sent from a keyboard instrument in response to changes in position of the pitch bend wheel. The pitch bend information is used to modify the pitch of sounds being played on a given Channel. The Pitch Bend message includes two data bytes to specify the pitch bend value. Two bytes are required to allow fine enough resolution to make pitch changes resulting from movement of the pitch bend wheel seem to occur in a continuous manner rather than in steps.
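A small sketch of that encoding: the two 7-bit data bytes combine into a 14-bit value (0-16383), with 8192 representing the centered, no-bend position. The helper names here are illustrative.

```typescript
// Combine the two pitch bend data bytes (LSB first, then MSB) into a 14-bit value.
function pitchBendValue(lsb: number, msb: number): number {
  return ((msb & 0x7f) << 7) | (lsb & 0x7f);
}

// Build a Pitch Bend Change message (status 0xEn) from a 14-bit value.
function makePitchBend(channel: number, value: number): number[] {
  return [0xe0 | (channel & 0x0f), value & 0x7f, (value >> 7) & 0x7f];
}

pitchBendValue(0x00, 0x40);   // 8192: wheel at rest, no bend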

Program Change

The Program Change message is used to specify the type of instrument which should be used to play sounds on a given Channel. This message needs only one data byte which specifies the new program number.

Control Change

MIDI Control Change messages are used to control a wide variety of functions in a synthesizer. Control Change messages, like other MIDI Channel messages, should only affect the Channel number indicated in the status byte. The Control Change status byte is followed by one data byte indicating the “controller number”, and a second byte which specifies the “control value”. The controller number identifies which function of the synthesizer is to be controlled by the message. A complete list of assigned controllers is found in the MIDI 1.0 Detailed Specification.

– Bank Select

Controller number zero (with controller 32 as the LSB) is defined as Bank Select. The Bank Select function is used in some synthesizers in conjunction with the MIDI Program Change message to expand the number of different instrument sounds which may be specified (the Program Change message alone allows selection of only one of 128 possible program numbers). The additional sounds are selected by preceding the Program Change message with Control Change messages which specify new values for Controller 0 and Controller 32, allowing 16,384 banks of 128 sounds each.

Since the MIDI specification does not describe the manner in which a synthesizer’s banks are to be mapped to Bank Select messages, there is no standard way for a Bank Select message to select a specific synthesizer bank. Some manufacturers, such as Roland (with “GS”) and Yamaha (with “XG”) , have adopted their own practices to assure some standardization within their own product lines.
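As a hedged sketch, a typical “select this sound” sequence is therefore two Control Change messages followed by a Program Change; how the bank numbers map to actual sounds is up to the receiving device.

```typescript
// Select a sound by bank and program: Bank Select MSB (CC 0), Bank Select LSB (CC 32),
// then Program Change. Returns the three raw messages in order.
function selectSound(channel: number, bankMsb: number, bankLsb: number, program: number): number[][] {
  const ch = channel & 0x0f;
  return [
    [0xb0 | ch, 0, bankMsb & 0x7f],    // Control Change: Bank Select MSB
    [0xb0 | ch, 32, bankLsb & 0x7f],   // Control Change: Bank Select LSB
    [0xc0 | ch, program & 0x7f],       // Program Change
  ];
}
```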

– RPN / NRPN

Controller number 6 (Data Entry), in conjunction with Controller numbers 96 (Data Increment), 97 (Data Decrement), 98 (Non-Registered Parameter Number LSB), 99 (Non-Registered Parameter Number MSB), 100 (Registered Parameter Number LSB), and 101 (Registered Parameter Number MSB), extend the number of controllers available via MIDI. Parameter data is transferred by first selecting the parameter number to be edited using controllers 98 and 99 or 100 and 101, and then adjusting the data value for that parameter using controller number 6, 96, or 97.

RPN and NRPN are typically used to send parameter data to a synthesizer in order to edit sound patches or other data. Registered parameters are those which have been assigned some particular function by the MIDI Manufacturers Association (MMA) and the Japan MIDI Standards Committee (JMSC). For example, there are Registered Parameter numbers assigned to control pitch bend sensitivity and master tuning for a synthesizer. Non-Registered parameters have not been assigned specific functions, and may be used for different functions by different manufacturers. Here again, Roland and Yamaha, among others, have adopted their own practices to assure some standardization.
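For example, pitch bend sensitivity is Registered Parameter 0. A minimal sketch of setting it (message order follows common practice; the function name is illustrative):

```typescript
// Set pitch bend range via RPN 0 (Pitch Bend Sensitivity), then deselect the RPN
// so later Data Entry messages cannot accidentally change it.
function setPitchBendRange(channel: number, semitones: number): number[][] {
  const cc = (controller: number, value: number) =>
    [0xb0 | (channel & 0x0f), controller, value & 0x7f];
  return [
    cc(101, 0),          // RPN MSB
    cc(100, 0),          // RPN LSB  (RPN 0,0 = Pitch Bend Sensitivity)
    cc(6, semitones),    // Data Entry MSB: range in semitones
    cc(38, 0),           // Data Entry LSB: additional cents
    cc(101, 127),        // RPN null: deselect the parameter
    cc(100, 127),
  ];
}
```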

Channel Mode Messages

Channel Mode messages (MIDI controller numbers 121 through 127) affect the way a synthesizer responds to MIDI data. Controller number 121 is used to reset all controllers. Controller number 122 is used to enable or disable Local Control (in a MIDI synthesizer which has its own keyboard, the functions of the keyboard controller and the synthesizer can be isolated by turning Local Control off). Controller numbers 124 through 127 are used to select between Omni Mode On or Off, and to select between the Mono Mode or Poly Mode of operation.

When Omni mode is On, the synthesizer will respond to incoming MIDI data on all channels. When Omni mode is Off, the synthesizer will only respond to MIDI messages on one Channel. When Poly mode is selected, incoming Note On messages are played polyphonically. This means that when multiple Note On messages are received, each note is assigned its own voice (subject to the number of voices available in the synthesizer). The result is that multiple notes are played at the same time. When Mono mode is selected, a single voice is assigned per MIDI Channel. This means that only one note can be played on a given Channel at a given time.

Most modern MIDI synthesizers will default to Omni On/Poly mode of operation. In this mode, the synthesizer will play note messages received on any MIDI Channel, and notes received on each Channel are played polyphonically. In the Omni Off/Poly mode of operation, the synthesizer will receive on a single Channel and play the notes received on this Channel polyphonically. This mode could be useful when several synthesizers are daisy-chained using MIDI THRU. In this case each synthesizer in the chain can be set to play one part (the MIDI data on one Channel), and ignore the information related to the other parts.

Note that a MIDI instrument has one MIDI Channel which is designated as its “Basic Channel”. The Basic Channel assignment may be hard-wired, or it may be selectable. Mode messages can only be received by an instrument on the Basic Channel.

System Messages

MIDI System Messages are classified as being System Common Messages, System Real Time Messages, or System Exclusive Messages. System Common messages are intended for all receivers in the system. System Real Time messages are used for synchronization between clock-based MIDI components. System Exclusive messages include a Manufacturer’s Identification (ID) code, and are used to transfer any number of data bytes in a format specified by the referenced manufacturer.

System Common Messages

The System Common Messages which are currently defined include MTC Quarter Frame, Song Select, Song Position Pointer, Tune Request, and End Of Exclusive (EOX). The MTC Quarter Frame message is part of the MIDI Time Code information used for synchronization of MIDI equipment and other equipment, such as audio or video tape machines.

The Song Select message is used with MIDI equipment, such as sequencers or drum machines, which can store and recall a number of different songs. The Song Position Pointer is used to set a sequencer to start playback of a song at some point other than at the beginning. The Song Position Pointer value is related to the number of MIDI clocks which would have elapsed between the beginning of the song and the desired point in the song. This message can only be used with equipment which recognizes MIDI System Real Time Messages (MIDI Sync).

The Tune Request message is generally used to ask an analog synthesizer to retune its internal oscillators. This message is generally not needed with digital synthesizers.

The EOX message is used to flag the end of a System Exclusive message, which can include a variable number of data bytes.

System Real Time Messages

The MIDI System Real Time messages are used to synchronize all of the MIDI clock-based equipment within a system, such as sequencers and drum machines. Most of the System Real Time messages are normally ignored by keyboard instruments and synthesizers. To help ensure accurate timing, System Real Time messages are given priority over other messages, and these single-byte messages may occur anywhere in the data stream (a Real Time message may appear between the status byte and data byte of some other MIDI message).

The System Real Time messages are the Timing Clock, Start, Continue, Stop, Active Sensing, and the System Reset message. The Timing Clock message is the master clock which sets the tempo for playback of a sequence. The Timing Clock message is sent 24 times per quarter note. The Start, Continue, and Stop messages are used to control playback of the sequence.
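A quick worked example of that clock rate, as a small sketch: at 120 BPM a quarter note lasts 500 ms, so the 24 Timing Clock messages per quarter note arrive roughly every 20.8 ms.

```typescript
// Milliseconds between MIDI Timing Clock messages (0xF8) for a given tempo:
// one quarter note = 60000 / bpm ms, divided by 24 clocks per quarter note.
function msPerMidiClock(bpm: number): number {
  return 60000 / (bpm * 24);
}

msPerMidiClock(120);   // ≈ 20.83 ms between Timing Clock bytes
```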

The Active Sensing signal is used to help eliminate “stuck notes” which may occur if a MIDI cable is disconnected during playback of a MIDI sequence. Without Active Sensing, if a cable is disconnected during playback, then some notes may be left playing indefinitely because they have been activated by a Note On message, but the corresponding Note Off message will never be received.

The System Reset message, as the name implies, is used to reset and initialize any equipment which receives the message. This message is generally not sent automatically by transmitting devices, and must be initiated manually by a user.

System Exclusive Messages

System Exclusive messages may be used to send data such as patch parameters or sample data between MIDI devices. Manufacturers of MIDI equipment may define their own formats for System Exclusive data. Manufacturers are granted unique identification (ID) numbers by the MMA or the JMSC, and the manufacturer ID number is included as part of the System Exclusive message. The manufacturer’s ID is followed by any number of data bytes, and the data transmission is terminated with the EOX message. Manufacturers are required to publish the details of their System Exclusive data formats, and other manufacturers may freely utilize these formats, provided that they do not alter or utilize the format in a way which conflicts with the original manufacturer’s specifications.

Certain System Exclusive ID numbers are reserved for special protocols. Among these are the MIDI Sample Dump Standard, which is a System Exclusive data format defined in the MIDI specification for the transmission of sample data between MIDI devices, as well as MIDI Show Control and MIDI Machine Control.
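A minimal sketch of the framing (the manufacturer ID and payload here are placeholders, not any real device’s format):

```typescript
// General shape of a System Exclusive message: 0xF0, manufacturer ID,
// 7-bit data bytes, then 0xF7 (EOX).
function sysEx(manufacturerId: number, data: number[]): number[] {
  return [
    0xf0,                          // Start of System Exclusive
    manufacturerId & 0x7f,         // one-byte manufacturer ID (extended IDs use 3 bytes)
    ...data.map(b => b & 0x7f),    // payload: data bytes must have the high bit clear
    0xf7,                          // EOX: End of Exclusive
  ];
}

sysEx(0x7d, [0x01, 0x02, 0x03]);   // 0x7D is the "non-commercial" ID, payload is arbitrary
```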

Why MIDI Matters

MIDI is like air: it’s all around you, and most of the time you can take it for granted, but if you are a digital musician you probably couldn’t live without it. MIDI is inside musical instruments, computers, tablets, smart phones, stage lighting, audio mixers, and many other products from well-known international companies including Apple, Gibson, Google, Korg, Microsoft, Roland, Yamaha, and hundreds more. By the fall of 2015, it’s projected that there will be around 2.6 billion MIDI-enabled devices on the planet!

MIDI is almost always an integral part of the modern recording process for popular music and the music made for films, TV, and even video games. MIDI-equipped electronic keyboards are connected to computers and mobile devices running Digital Audio Workstation software that records both MIDI and audio to produce music. With advances in computing power making software-based instruments sound ever more realistic, the music you hear on TV, in radio ads and in movies is often a recording of MIDI instruments (both hardware instruments and virtual “soft synth” software) rather than of dozens of acoustic instruments. Even the huge orchestral score behind that big-screen blockbuster almost always starts off as a MIDI mockup arrangement, and the composer then uses MIDI to transcribe the parts into music notation.

More and more DJ gear has MIDI integrated into it, and the last few years have seen a dramatic rise in the number of unique DJ MIDI controllers and controllerists – artists like Moldover who use MIDI controllers to manipulate sound and loops.

Types of MIDI

MIDI technology has been around for more than 30 years, and first appeared in the form of a special 5-pin DIN connector on the back of a keyboard, used to connect to another keyboard or to a computer. Though 5-pin DIN connections are still used for making connections between standalone hardware digital instruments, over the years, as computer technology has developed and advanced, so has MIDI.

Your computer, tablet, or smart phone probably has MIDI capabilities, and starting in 2015, the web browser you are using to view this page may be capable of making music with MIDI. Google’s Chrome browser (and others to follow) can access local MIDI devices (hardware and software synthesizers, external keyboards, etc.) and use them for producing music and/or for controlling objects in the browser (such as a browser-based synthesizer). Web app developers are rapidly adding MIDI support to existing apps and creating new apps that are Web MIDI enabled.

If you own an iPad or iPhone you probably know that there are hundreds of programs you can get for making music (even on the go) that use Apple’s CoreMIDI technology to connect to MIDI devices. CoreMIDI also allows an iPad/iPhone to send and receive MIDI messages via Wi-Fi, letting the iPad/iPhone control and/or play sounds and sequences on the Mac, and vice versa. CoreMIDI also supports MIDI connections via USB, Ethernet, and FireWire (where equipped). Windows users can also connect to Macs and iOS devices via a third-party RTP-MIDI driver.

With Android M, Google has added robust MIDI support to the billions of Android devices in use, making MIDI one of the most ubiquitous technologies on the planet.

Of course, MIDI is not just for keyboards… other MIDI-equipped musical instruments include digital drums, guitars, wind instruments, and more. For electronic dance music (EDM) and DJs, there are specialized controllers that use MIDI to trigger beats and loops, and to control lighting. And new kinds of digital musical instruments and controllers are being invented all the time, all of which integrate perfectly with existing instruments and devices because of MIDI.

Besides music creation, MIDI has some other interesting and popular uses. MIDI Show Control is a set of MIDI messages used for controlling lights and rides at theme parks, as well as for operating themed events such as those found outside many Las Vegas casinos. And MIDI Machine Control provides remote transport control for many kinds of audio/video recording devices. Moreover, because MIDI is widely available and free to use, many people are able to develop unique DIY products that use MIDI to control and/or generate sound… in all kinds of shapes and forms.

If you’d like to learn more about the amazingly diverse world of MIDI, this is the place: the site has videos, tutorials, forums and stories about MIDI artists. So join the community of people who make music and art with MIDI, and learn how to get the most out of this incredibly flexible technology.

Craig Anderton’s Brief History Of MIDI

The MIDI specification first saw the light of day at the 1981 AES, when Dave Smith of Sequential Circuits presented a paper on the “Universal Synthesizer Interface.” It was co-developed with other companies (an effort driven principally by Roland’s Ikutaro Kakehashi, a true visionary of this industry), and made its prime time debut at the 1983 Los Angeles NAMM show, where a Sequential Circuits Prophet-600 talked to a Roland keyboard over a small, 5-pin cable. I saw Dave Smith walking around the show and asked him about it. “It worked!” he said, clearly elated—but I think I detected some surprise in there as well.

The Prophet 600 and the Jupiter 6 at the 1983 Winter NAMM show

“It” was the Musical Instrument Digital Interface, known as MIDI. Back in those days, polyphonic synthesizers cost thousands of dollars (and “polyphonic” meant 8 voices, if you were lucky and of course, wealthy). The hot computer was a Commodore-64, with a whopping 64 kilobytes of memory—unheard of in a consumer machine (although a few years before, an upstart recording engineer named Roger Nichols was stuffing 1MB memory boards in a CompuPro S-100 computer to sample drum sounds). The cute little Macintosh hadn’t made its debut, and as impossible as it may seem today, the PC was a second-class citizen, licking its wounds after the disastrous introduction of IBM’s PCjr.

Tom Oberheim had introduced his brilliant System, which allowed a drum machine, sequencer, and synthesizer to talk together over a fast parallel bus. Tom feared that MIDI would be too slow. And I remember talking about MIDI at a Chinese restaurant with Dave Rossum of E-mu Systems, who said, “Why not just use Ethernet? It’s fast, it exists, and it’s only about $10 to implement.”

But Dave Smith had something else in mind: An interface so simple, inexpensive, and foolproof to implement that no manufacturer could refuse. Its virtues would be low cost, adequate performance, and ubiquity in not just the pro market, but the consumer one as well.

Bingo.

But it didn’t look like success was assured at the time; MIDI was derided by many pros who felt it was too slow, too limited, and just a passing fancy. 30 years later, though, MIDI has gone far beyond what anyone had envisioned, particularly with respect to the studio. No one foresaw MIDI being part of just about every computer (e.g., the General MIDI instrument sets). This trend actually originated on the Atari ST—the first computer with built-in MIDI ports as a standard item (see “Background: When Amy Met MIDI” toward the end of this article).

Evolution of a spec

Oddly, the MIDI spec officially remains at version 1.0, despite significant enhancements over the years: the Standard MIDI File format, MIDI Show Control (which runs the lights and other effects at Broadway shows like Miss Saigon and Tommy), MIDI Time Code to allow MIDI data to be time-stamped with SMPTE timing information, MIDI Machine Control for integration with studio gear, microtonal tuning standards, and a lot more. And the activity continues, as issues arise such as how best to transfer MIDI over USB, with smart phones, and over wireless.

The MIDI Manufacturers Association

The guardian of the spec, the MIDI Manufacturers Association (MMA), has stayed a steady course over the past several decades, holding together a coalition of mostly competing manufacturers with a degree of success that most organizations would find impossible to pull off. The early days of MIDI were a miracle: in an industry where trade secrets are jealously guarded, manufacturers who were intense rivals came together because they realized that if MIDI was successful, it would drive the industry to greater success. And they were right. The MMA has also helped educate users about MIDI, through books and online materials such as “An Introduction to MIDI.”

I had an assignment at the time from a computer magazine to write a story about MIDI. After turning it in, I received a call from the editor. He said the article was okay, but it seemed awfully partial to MIDI, and was unfair because it didn’t give equal time to competing protocols. I tried to explain that there were no competing protocols; even companies that had other systems, like Oberheim and Roland, dropped them in favor of MIDI. The poor editor had a really hard time wrapping his head around the concept of an entire industry willingly adopting a single specification. “But surely there must be alternatives.” All I could do was keep replying, “No, MIDI is it.” Even when we got off the phone, I’m convinced he was sure I was holding back information on MIDI’s competition.

MIDI HERE, MIDI THERE, MIDI EVERYWHERE

Now MIDI is everywhere. It’s on the least expensive home keyboards, and the most sophisticated studio gear. It’s a part of signal processors, guitars, keyboards, lighting rigs, smoke machines, audio interfaces…you name it. It has gone way beyond its original idea of allowing a separation of controller and sound generator, so people didn’t have to buy a keyboard every time they wanted a different sound.

SO WHERE’S IT GOING?

“Always in motion, the future…” Well, Yoda does have a point. But the key point about MIDI is that it’s a hardware/software protocol, not just one or the other. Already, the two occasionally take separate vacations. The MIDI data in your DAW that drives a soft synth doesn’t go through opto-isolators or cables, but flies around inside your computer.

One reason why MIDI has lasted so long is because it’s a language that expresses musical parameters, and these haven’t changed much in several centuries. Notes are still notes, tempo is still tempo, and music continues to have dynamics. Songs start and end, and instruments use vibrato. As long as music is made the way it’s being made, the MIDI “language” will remain relevant, regardless of the “container” used to carry that data. However, MIDI is not resting on its laurels, and neither is the MMA—you can find out what they’re working on for the future here.

Background: When Amy Met MIDI

After MIDI took off, many people credited Atari with amazing foresight for making MIDI ports standard on their ST series of computers. But the inclusion of MIDI was actually a matter of practicality. Commodore was riding high with the C-64, in large part because of the SID (Sound Interface Device) custom IC, a very advanced audio chip for its time. (Incidentally, Bob Yannes, one of Ensoniq’s founders and also the driving force behind the Mirage sampler, played the dominant role in SID’s development.)

Atari knew that if it wanted to encroach on Commodore’s turf, they needed something better than SID. They designed an extremely ambitious sound chip, code-named Amy, that was supposed to be a “Commodore killer.” But Amy was a temperamental girl, and Atari was never able to get good enough yields to manufacture the chips economically.

An engineer suggested putting a MIDI port on the machine, so it could drive an external sound generator; then they wouldn’t have to worry about an onboard sound chip. Although this solved the immediate Amy problem, it also turned out to be a fortuitous decision: Atari dominated the European music-making market for years, and a significant chunk of the US market as well. To this day, a hardy band of musicians still use their aging ST and TT series Atari computers because of the exceptionally tight MIDI timing – a result of integrating MIDI into the core of the operating system.

MIDI History: Chapter 4 - Synths Come of Age 1900-1963

The first electronic musical instruments

As electricity became more widely available, the early 20th century saw the invention of electronic musical instruments including the Telharmonium, Trautonium, Ondes Martenot, the Theremin and the Hammond organ.

What is interesting in looking at these early devices is how much they foreshadow the future of modern music production.  

The Telharmonium foreshadows both the Hammond organ and music streaming services.

The Theremin foreshadows many BLE MIDI controllers of today by allowing gestural control of music without wires. 

The design goals RCA had for its room-sized modular synthesizer foreshadow current advances in AI technology, orchestral software synthesizers, and the recent resurgence in the popularity of modular synthesis.

But the main thing these early electronic musical instruments created was a path that would lead to the development of a digital standard for musical instruments: MIDI.

For information on other early electronic musical instruments like the Trautonium and Ondes Martenot and for more details on the Telharmonium, Theremin and Hammond organ, please visit the website below. 


Special thanks to 120 years

120years is an amazing website run as a labor of love by Simon Crab, and it is a fantastic resource for information on early electronic instruments. We encourage you to visit the site and donate to support it. Many of the images and information below are from Simon’s website, and we wanted to make sure he got proper credit.

Lee de Forest’s ‘Audion’ triode vacuum tube valve of 1906 – the first ‘electronic’ audio oscillator.

120 Years of Electronic Music – The history of electronic musical instruments from 1800 to 2019


The Telharmonium - 1897

The switchboard and tone circuits of the Telharmonium MkII - Courtesy of 120Years

The Telharmonium was huge. You can get an idea of its size from the picture of a small boy next to one of the Telharmonium’s tone wheels. It had to be that big, and use incredible amounts of electrical power, because the amplifier had not been invented yet.

A single tone wheel generator with eight alternators.

The switchboard and tone circuits of the MkII

The world’s first music streaming service

The Telharmonium was not only one of the first electronic musical instruments, it was also the first music streaming service: it actually streamed its synthesized music over telephone wires. This would eventually lead to its downfall.

Courtesy of Scientific American and 120Years
THE TELHARMONIUM – AN APPARATUS FOR THE ELECTRICAL GENERATION AND TRANSMISSION OF MUSIC.
Dr. Thaddeus Cahill’s system of generating music at a central station in the form of electrical oscillations, and of transmitting these oscillations by means of wires to any desired point, where they are rendered audible by means of an ordinary telephone receiver or a speaking arc, is now embodied in a working plant situated in the heart of New York.

Briefly summed up, Dr. Cahill’s wonderful invention consists in generating electrical oscillations corresponding in period with the acoustic vibrations of the various elemental tones desired, in synthesizing from these electrical vibrations the different notes and chords required, and in rendering the synthesized electrical vibrations audible by a translating device.

by Scientific American Vol 96 #10 9th March 1907

The reasons for the failure of Cahill’s project were numerous.

The machine itself was impossibly expensive; about $1 million was spent from 1897 to 1914 on the project (approximately $30 million in today’s value) and it was unlikely that a subscriber business model would have ever covered the costs, let alone make a profit – and, even if Cahill had found enough subscribers, he’d have to have built a power plant to generate enough power.

The output power of the Telharmonium caused great disruption of the New York Telephone network, angering telephone subscribers and even interrupting the Stock Exchange, which resulted eventually in AT&T refusing to co-operate in supporting Cahill’s instrument.

The Telharmonium was also a victim of an age of rapid technical advances; Lee De Forest began experimenting with radio transmissions as early as 1906 (which included transmissions of the Telharmonium) and by around 1914 wireless radio broadcasts had spelled the end for wire broadcasting.

by Simon Crab, 120 years

The Telharmonic Hall, New York City circa 1906

The Theremin

Leon Termen plays the ‘Theremin’ or ‘Thereminvox’. Paris, 1927 - Courtesy of 120years

The Theremin in pop music and culture

The Theremin was marketed and distributed in the USA by RCA during the 1930s in DIY kit form or as a finished instrument (later aficionados of the instrument included Robert Moog, who made and sold transistorised Theremins in the 1950s).

The heterodyning vacuum tube oscillator became the standard method of producing electronic sound until the advent of the transistor in the 1960s, and was widely used in electronic musical instrument designs of the period.

The Theremin became known in the USA as a home ‘novelty instrument’ and featured in many film soundtracks of the 1940s-50s; it also appeared in several pop records of the 1960s but never overcame its novelty appeal. Used for effect rather than as a ‘serious instrument’, most recordings employ the Theremin as a substitute string instrument rather than exploiting the microtonal and pitch characteristics of the instrument.

by Simon Crab, 120years


The ‘Rhythmicon’ Henry Cowell & Leon Termen. USA, 1930

In 1932, Leon Theremin developed another innovative electronic musical instrument, the Rhythmicon, for composer Henry Cowell. The device could produce sixteen different rhythms, which were also transposable. It was the world’s first drum machine.

It used spinning disks that work on much the same principle as the pinned cylinders of music boxes or the perforations of piano rolls.

Demonstration of the third version of the Rhythmicon, built by Leon Theremin at the Moscow State Conservatory in the early 1960s. The first Rhythmicon was developed by Leon Theremin for Henry Cowell in 1932. It was the first rhythm machine ever built.

The Hammond Organ - 1935

In 1935, Laurens Hammond, who had been a clockmaker, came up with a design and patented an “electrical musical instrument.” It was based on the same tone-wheel technology as the Telharmonium, but it could be much smaller because of the invention of the amplifier.

Thaddeus Cahill, developer of the Telharmonium, was . . . endowed with the ability to think big.

Although not strictly electronic (it predated the invention of the vacuum tube by about a decade), the Telharmonium embodied many basic principles that have been used in the electronic music medium: the generation of pitched tones from alternating electricity, the addition of harmonics to determine tone color (additive synthesis), and a touch sensitive keyboard to shape the sounds and control their strengths.

This first polyphonic, touch sensitive music synthesizer remained in service in New York for only a few years . . . The basic idea was resurrected again in the 1930s in an instrument that was somewhat more of a commercial success: the Hammond Organ

by Robert A. Moog, “Electronic Music,” Journal of the Audio Engineering Society, October/November 1977

The Hammond organ was a commercial success, first in churches and then in jazz clubs, where musicians like Jimmy Smith made it popular.

Hammond B3 with Leslie 122
Hammond was inspired to create the tonewheel or “phonic wheel” by listening to the moving gears of his electric clocks and the tones produced by them.

by Wikipedia


The RCA Synthesizer I & II - 1951

The RCA Mark II Synthesizer at the Columbia-Princeton Electronic Music Center at Columbia’s Prentis Hall on West 125th Street in 1958.
In the 1950’s RCA was one of the largest entertainment conglomerates in the United States; business interests included manufacturing record players, radio and electronic equipment (military and domestic – including the US version of the Theremin) as well as recording music and manufacturing records.

In the early 50’s RCA initiated an unusual research project whose aim was to auto-generate pop ‘hits’ by analysing thousands of music recordings; the plan being that if they could work out what made a hit a hit, they could re-use the formula and generate their own hit pop music.

The project’s side benefit also explored the possibility of cutting the costs of recording sessions by automating arrangements and using electronically generated sounds rather than expensive (and unionised) orchestras; basically, creating music straight from score to disc without error or re-takes.

by Simon Crab 120years

It’s interesting to view the quote from 120years in the context of recent advances in AI technology, DAWs, and soft synth technologies. The MIDI Association has covered many of the efforts to auto-generate music by analysing recordings, mainly because many companies use MIDI data as the input to train their AI models. On top of that, with the advent of VSTs and orchestral libraries, more and more recordings are created using virtual instruments.

So MIDI has in many ways made RCA’s vision of the future come true. 

The RCA synthesizer is recognized as the first programmable polyphonic synthesizer. It was programmed with coded punched-paper rolls that look strangely similar to the paper rolls from player pianos and to the piano roll editors found in almost every modern digital audio workstation. Below is a schematic of the modular elements of the RCA synthesizer, which took up an entire room in Columbia’s Prentis Hall.


The Clavivox - 1952

Raymond Scott with his invention the Clavivox

No discussion of electronic music would be complete without mentioning Raymond Scott.  

He was born in 1908, and in 1934 he was hired as the staff pianist for the CBS Radio Orchestra. Soon after, he had his first hit as a composer, “Christmas Night in Harlem.”

In 1942, he broke the color barrier by forming the first racially-mixed network radio orchestra (including Ben Webster, Emmett Berry, Charlie Shavers, Cozy Cole, and others).

He is important in the history of electronic music for many, many reasons.

He is credited with developing: 

  • The world’s first multi-track tape recorders (one with 7 tracks and one with 14 tracks) in 1952
  • The world’s first electro-mechanical musical “sequencer” in 1953
  • The Clavivox, a “Keyboard Operated Electrical Musical Instrument” that allowed you to slide between notes, for which he was granted a patent in 1956. Pretty amazing that this was 60 years before MPE!

Raymond Scott was hired by Berry Gordy of Motown in 1970 as Director of Electronic Music Research and Development, and his jazz compositions are quoted or used outright in many of Warner Bros. cartoon themes.

He also inspired a young Bob Moog. Check out this article from the Raymond Scott Website. 

https://www.raymondscott.net/features/bob-moog/

After his retirement, Scott used MIDI technology to continue composing until 1987, when he suffered the first of several debilitating strokes. Raymond Scott died in 1994.

by Simon Crab, 120years


The Wurlitzer Sideman Drum Machine - 1959

The Wurlitzer Sideman generated sounds the same way the music boxes and the Rhythmicon did, by using a rotating disc. There was also a slider to control the tempo (between 34 and 150 beats per minute). Another innovation of the Sideman that became a standard feature of drum machines was that the sounds could also be triggered individually by buttons on the front panel.


1963 marks the start of the next part of music production and MIDI history

In 1963, Robert Moog began to change the face of synthesis forever by taking the synthesizer out of the university laboratory and putting it into the hands of musicians.

In the next chapter of the history of MIDI, we will look at the sonic pioneers who explored how to build modular programmable synthesizers, drum machines, and sequencers and started the modern music production revolution that is still very much alive today. 


The MIDI Association Forms Prestigious Advisory Groups

The MIDI Association (TMA), a global community of people who use MIDI to create music and art, has been founded with the goals of providing education for existing users, as well as creating new music makers by promoting the creative possibilities of connecting digital musical instruments, MIDI controllers, smart phones, tablets and computers. To further these goals, TMA has established an advisory team made up of music industry veterans who bring diverse expertise to the organization. They will participate in various panels dedicated to setting The MIDI Association’s overall direction, developing marketing and social media initiatives, and creating funding opportunities.

The TMA advisory team will set overall direction for The MIDI Association, develop marketing and social media initiatives, and interface with private and institutional revenue sources.

The advisory team includes:

  • Craig Anderton, Executive Vice President, Evangelist at Gibson
  • Athan Billias, Director of Strategic Product Planning at Yamaha
  • Roy Elkins, CEO at Broadjam
  • Jon Haber, CEO at Alto Music and former NAMM board member
  • Dendy Jarrett, Director at Harmony Central
  • Gene Joly, former Guitar Center executive and past NAMM Board member
  • Daniel Keller, CEO at PR firm Get It In Writing
  • Robin Kelly, Director of Channel Management at Roland
  • Kevin LaManna, Principal at the digital marketing agency SocialRaise
  • Bryan Lanser, Director of Marketing at Muse Research
  • Paul Lehrman, Director of Music Engineering at Tufts University
  • Lawrence Levine, Principal at Comet Capital
  • Gerson Rosenbloom, Vice President of Strategic Management at Sweetwater and former NAMM Chairman

“Over 30 years after the industry came together to create MIDI, it’s encouraging that it’s coming together again to help consumers as well as musicians take advantage of all that MIDI has to offer,” remarked Craig Anderton.

“The new MIDI website (www.midi.org) has been completely revamped,” offered Roy Elkins. “It’s now mobile friendly and features video streaming, interactive forums, and easy searches of the hundreds of articles on MIDI available on the site. It’s a great site that will support the whole global MIDI community.”

“In a world with billions of MIDI-enabled mobile computing devices, our goal is to create more music makers by promoting the vast MIDI capabilities offered when those devices are interfaced with musical instruments,” observed Gerson Rosenbloom. “We look forward to the active participation and support of companies and foundations in our industry in helping us to bring awareness to the masses.”

©2015 MIDI Association

MIDI History: Chapter 3 - Orchestrions

From mechanical to digital to virtual….and back!

The relationship between mechanical musical machines and MIDI gets even more intriguing with orchestrions and “fairground organs”. 

Orchestrions have multiple mechanical instruments in them and are designed to sound like a complete orchestra (hence the name).

Orchestrions are incredibly complex mechanical machines driven by pneumatic engines. They were used to attract visitors to fairs, theaters, and bars between 1850 and 1930.

The same companies that retrofit player pianos with MIDI adapters replace the paper rolls of orchestrions with MIDI interfaces that drive the pneumatic pumps.

Look at this article from Collectors Weekly about the 4,000-pound grandfather of the iPod!

“All through history, people have wanted to be able to have music when they didn’t necessarily have musicians around. An iPod’s the same thing as a giant two-ton orchestrion 100 years ago, except you can put it in your pocket and enjoy whatever you want to hear.”

Collectors Weekly



In 2010, Pat Metheny, a 20-time Grammy Award winner and member of the DownBeat Hall of Fame, decided to put two of his life-long passions together.

Pat had always loved playing MIDI guitar and had always had a fascination with Orchestrions. So he enlisted Eric Singer (a member of The MIDI Association’s educational advisory board) to help him develop a modern Orchestrion.


 Animusic is a company that develops virtual animated Orchestrions.


A few years later, in a perfect example of life imitating art, Intel decided to take the virtual Orchestrion and recreate it in real life using their processors and sensors.


Has this man lost his marbles? 

“It’s all about the grid,” Molin tells Michael Rundle of Wired UK. “I grew up making music on MIDI [a computer language for writing music], and everyone makes music on a grid nowadays, on computers. Even before digital they made fantastic, programmable music instruments. In bell towers and church towers that play a melody they always have a programming wheel exactly like the one that is on the marble machine.”

by SmithsonianMag


 And of course what drives all these fantastic artistic creations is MIDI!


 

MIDI History: Chapter 2 - Player Pianos 1850-1930


The golden age of mechanical music machines really came in the late 19th century and early 20th century with player pianos and orchestrions.  A player piano is defined as any actual acoustic piano that is played by a pneumatic or electro-mechanical mechanism that operates the piano action via pre-programmed music.

By Daderot (Own work) [Public domain], via Wikimedia Commons

Between 1910 and 1930, player pianos were the largest segment of the music industry in the United States. These instruments were mostly used for playing back preprogrammed music via piano rolls. Remember, back then there was no radio, TV, or movies. If you wanted to hear music, you either had to play it yourself or have it played for you by an automated music machine.


A piano roll is a continuous roll of paper with perforations (holes) punched into it that represent note control data. As the roll moves over a ‘tracker bar’, each musical note is triggered when a perforation crosses the bar and is read.

Notice the words on the right side of the roll (read from bottom to top). Player pianos were actually the very first karaoke machines, another function they have in common with MIDI!


This layout should look familiar to anyone who has a DAW. The piano roll view of MIDI notes is a direct descendant of these 100-year-old piano rolls, because MIDI does digitally exactly what a player piano does mechanically.
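
To make the analogy concrete, here is a small hypothetical sketch (TypeScript, reusing a Web MIDI output as in the earlier example; the RollNote type and timing values are illustrative) that treats a single piano-roll perforation as a note with a start time, length, and pitch, and turns it into the Note On and Note Off messages a MIDI sequencer would send:

```typescript
// A piano-roll "hole" is just a note with a start time, a length and a pitch.
interface RollNote {
  startMs: number;   // when the perforation reaches the tracker bar
  lengthMs: number;  // how long the perforation (and therefore the note) lasts
  midiNote: number;  // which key it triggers (60 = middle C)
  velocity: number;  // how hard the note is struck (1-127)
}

function scheduleNote(output: MIDIOutput, note: RollNote): void {
  setTimeout(() => output.send([0x90, note.midiNote, note.velocity]),
             note.startMs);                                   // Note On
  setTimeout(() => output.send([0x80, note.midiNote, 0]),
             note.startMs + note.lengthMs);                   // Note Off
}

// Example: a half-second middle C starting one second in.
// scheduleNote(output, { startMs: 1000, lengthMs: 500, midiNote: 60, velocity: 90 });
```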


A new full-scale roll format, playing all 88 notes, was agreed at an industry conference in Buffalo in 1908, the so-called Buffalo Convention. Any player made anywhere in the world could now play any make of roll. Understanding the need for compatibility was the defining moment of the player industry.

The consensus was key to avoiding a costly format war, which plagued almost every other form of entertainment media that followed roll music (for example the video format wars of the 1980s).

https://en.wikipedia.org/wiki/Player_piano


Every entertainment medium except for MIDI!

The whole point of MIDI is to allow different manufacturers to work together, and it is the only media technology from 1983 that is still in wide use today.


Reproducing Pianos

In Germany, the first reproducing player piano, called the “Mignon,” was developed by Welte in 1904. A reproducing player piano can play back a recorded performance with all the subtleties of dynamics and timing, exactly as the original pianist played the piece. When World War I came in 1914, German patents were seized in the US, and Ampico (the American Piano Company) and the Duo-Art systems became the big players. Because these rolls could capture the subtle nuances of the performers in the rolls themselves (just like MIDI), famous performers and composers including Gustav Mahler, Camille Saint-Saëns, Edvard Grieg, Claude Debussy, Scott Joplin, Sergei Rachmaninoff, Jelly Roll Morton and George Gershwin all recorded piano roll performances.

Most player pianos are now more than 100 years old, and the paper piano rolls are difficult to maintain and deteriorating quickly. So there are companies that retrofit these classic 100-year-old pianos with MIDI interfaces.


More articles about pre-MIDI automated music machines:

https://midi.org/conlon-nancarrow-and-black-midi

https://midi.org/midi-historychapter-3-orchestrions

MIDI History: Chapter 1 - 850 AD to 1850 AD

To really understand the origins of MIDI, you need to go all the way back to before there were digitally controlled synthesizers and computers. In fact, you need to go back before there was even electricity, to the very first mechanical music machines.

The very first mechanical musical instruments were documented in the Book of Ingenious Devices, published in 850 AD by three Iranian brothers known collectively as the Banu Musa. The book describes 100 mechanical devices, including two automated musical instruments, one of which was a hydro-powered organ that played music from interchangeable cylinders with music patterns on them.

A hydro powered organ that played music based on interchangeable cylinders that had music patterns on them.
 
“This cylinder with raised pins on the surface remained the basic device to produce and reproduce music mechanically until the second half of the nineteenth century.”

 

Charles B. Fowler (Music Educators Journal, 1967)

The first computer?

The Banu Musa also invented an automatic flute player that actually may have been the first programmable machine or computer. The flute sounds were produced by steam and you could modify settings to create different sounds and patterns. It almost sounds like what MIDI does today! These automated music machines were used for entertainment at parties just as today we stream music from our smartphones.

 


Mechanical Music Box

Muro Box Music Box

 

The same basic system of cylinders with music patterns was used for centuries and is still used today in carillons and music boxes.


https://www.midi.org/articles/midi-and-player-pianos
