PR firm Rock Paper Scissors, Inc. explores the new terrain at the epicenter of music and technology through their PR and marketing services as well as the Music Tectonics Conference, podcast, online events, and in-person meetups.
Everything Music Tectonics does explores the seismic shifts that shake up music and technology the way the earth’s tectonic plates cause quakes and make mountains.
Rock, Paper, Scissors
On October 23, 2024, Rock, Paper, Scissors hosted a creators fair at the Annenberg Beach House in Santa Monica. Rock, Paper, Scissors has been working with The MIDI Association for the last few years and at NAMM 2024 hosted an innovation meetup at The MIDI Association booth.
There were a lot of MIDI Association companies at the Creators Fair event.
We also got to connect with a whole bunch of students from the Riverside City College Music Industry Club, including club president Harley Glenn. Professor Jenny Amaya provides tons of students (last year we had 60!) to help with our MIDI Association NAMM booth. One of the outcomes of the event was getting Artiphon, Audio Modeling, Caedence, Geoshred and Robkoo to provide gear for the RCC Music Industry Club so they can start rehearsing for an all-MIDI band performance at NAMM.
Music Industry Club (MIC)
The purpose of this organization shall be to offer its members the opportunity to gain knowledge, insights, experience, and artistic development in the music industry and community.
Riverside City College
Roland
Roland had a booth showing off their newest samplers and had a Roland Showcase – Sampling Techniques with Jay Ybarra ft. the P-6 & the SP-404MKII.
We were just in Shanghai with Buchla artist TamiX and then got to meet up with Peter Nyboer from Buchla at the Creators Fair.
Buchla
Blipblox
MIDI Innovation Award entrant Blipblox was there.
Adam also did an interview with Seids, who works with a lot of MIDI Association companies including Apple Logic and Focusrite.
Dawn Audio
Diego Pinzon from Dawn Audio was there, and we also got a chance to meet co-founder Mandy Ortiz. We spent a lot of time catching up on the DAW working group activities and the MIDI In Music Education Initiative.
Eternal Research
Another MIDI Association member and Innovation Awards entrant, Eternal Research, was displaying their Demon Box at the Creators Fair.
Demon Box Demo at Music Tectonics 2024
Art and Logic and Whirled Notes
We also got to catch up with MIDI Association members Andrew Sherbrooke from Art and Logic and Scott Barkley from Whirled Notes.
Crazy Pants, MIDI Hoodie and MIDI Hat
At the very end of the event, we ran into the organizer, Dmitri Vietz from Rock, Paper, Scissors, and one of the artists who gave a talk at the event, Ricky Tinez. Ricky was wearing his MIDI hoodie from his website.
Already looking forward to next year’s Music Tectonics
This was a really well-produced event with tons of great content, lots of music tech startups and, of course, a lot of creators who love MIDI.
We are planning on an event with Rock, Paper, Scissors at NAMM 2025 as they are in The MIDI Showcase.
We have even started planning for next year’s Music Tectonics event because it was such a great event in a stunning location.
Music China and the Chinese Musical Instrument Association (CMIA) have really stepped up their support for MIDI and digital music making and right next to The MIDI Association booth was the X Stage where we did a number of presentations over the course of 4 days.
Brian Hardgroove and Sennheiser Immersive Audio Presentation
Because of a long-standing relationship, Sennheiser supported Brian Hardgroove’s appearance at Music China’s opening performance. He also did the first presentation on the X Stage next to The MIDI Association. Brian, along with IMAC advisory board member Nevin Domer (who ran a punk record label in China and speaks fluent Mandarin) and Zhao Yajun, engineering specialist for Sennheiser, presented a talk on the X Stage. Brian focused on the process of cross-border, cross-cultural communication in working on songs with Bian and Kong, and Zhao Yajun focused on Sennheiser products, particularly the HD490 Pro headphones.
The MIDI Association had worked with Music China and Sennheiser so people could come to The MIDI Association booth, register to receive a copy of a song written by Brian (featuring Bian on erhu and Kong on piano), and enter a drawing to win a pair of headphones.
This really helped us increase traffic at the MIDI Association booth.
MIDI Innovation Awards and Chinese Users Choice Awards Presentation
Finalists in the MIDI Innovation Awards were invited to participate in Synthfest UK 2024, Music China 2024 and NAMM 2025. Here is an article we did on the Synthfest UK event and a picture from the event itself.
The Music China 2024 edition was an extraordinary and opportunity-filled experience for us at Audio Modeling. In collaboration with the MIDI Association and in the MusicX area, we had the chance to explore the Chinese music market and strengthen our presence in this key sector. Discovering the Chinese Music Market During the event, we had…
At Music China we had a great panel and participation by several of the winners and finalists. Here is the promotional video we created to promote the event at Music China.
Finalists in the 2024 MIDI Innovation Awards were also entered into the Chinese Users Choice Awards where Music China promoted the products in China and users were able to vote for their favorites.
Audio Modeling won both the MIDI Innovation Award in the non-commercial software category and the Chinese Users Choice Award. We covered this in our article The MIDI Association At Music China 2024.
This picture shows the ranking of the Chinese Users Choice Award winners.
There were a lot of MIDI Innovation Award finalists and winners at Music China including Audio Modeling, Arcana Strum, Party Maker and Flex Accordion.
Music Accessibility Presentation
Above is the promotional video we created to promote the event at Music China.
The Music Accessibility presentation at the X Stage featured two 2024 MIDI Innovation Award winners. Arcana Instruments won in the Commercial Hardware category for the Arcana Strum, and Audio Modeling won in the non-commercial software category for UniMIDIHub.
In addition to the MIDI Innovation Award winners, we invited two of our brand ambassadors and a special guest, Liang Ge from Kong Audio. He showed up several minutes before the start of the presentation because he was interested in music accessibility. His company develops virtual instruments covering modern and ancient Chinese instruments as well as western orchestral instruments.
His English is great, so we roped him into being the translator for the Music Accessibility presentation at the very last minute. Thanks to Kong Audio for their support.
MIDI In Music Education Presentation
With all the support we got last year from Xinghai Conservatory of Music and this year from Shanghai Conservatory of Music, we know that MIDI In Music Education is going to be an important topic in China in the future.
Zhao Yitian and Athan Billias from the MIDI Association executive board gave visitors to Music China an update on our plans to release a MIDI curriculum created by SAE Mexico and encouraged Chinese universities to translate our documents into Chinese.
In our next article, we’ll cover the 4 hour MIDI Forum that Music China arranged for Saturday, October 12, 2024.
The Music Accessibility Standard Special Interest Group in The MIDI Association has been meeting regularly every other Wednesday at 8 am Pacific Time for over a year now. We thought it was time to introduce the members who regularly attend these meetings so you can get to know their backgrounds and why they are passionate about making music making accessible to everyone.
Juho Tomanien – MASSIG Chair
Juho Tomanien is a student from Finland who had a vision for a Music Accessibility Standard. Just because you are blind doesn’t mean that you can’t have a vision. Take Stevie Wonder, whose Innervisions album is considered one of the greatest records of all time.
Juho reached out to the MIDI Association after someone on KVR Audio suggested that The MIDI Association might be a good place to start a conversation about accessibility.
It turned out that there were already a number of MIDI Association member companies who had been working with accessibility consultants like UK producers/recording engineers Scott Chesworth and Jason Dasent and had already been adding accessibility features to their products. JUCE, which runs the Audio Developer Conference, was also already on board and had been hosting accessibility workshops at ADC for several years.
In 2023 at the ADC conference, Jay Pocknell from the Royal Institute for the Blind hosted a packed session of developers trying to learn how to use JUCE to add accessibility features to their products.
MIDI Association members from Audio Modeling, Arturia, Native Instruments, Roland and more have joined together with The Royal Institute for the Blind and visually impaired producers and musicians like Juho, Scott Chesworth, Jason Dasent and more to discuss how MIDI 2.0 might enable Music Accessibility and make music making accessible to everyone.
Jay Pocknell – SoundWithoutSight.org and the Royal Institute For The Blind
As a sight-impaired musician, I first considered plans for a project in 2018. I was achieving my dream of working in commercial recording studios but also found that there were significant barriers to accessing the equipment and culture. I became aware that my fully-sighted peers could be more agile within the industry. Rather than becoming frustrated, I wanted to use my experience as motivation to inspire change. I refused to believe that I was the only person to have struggled.
Vanessa – Ph.D. student at the Music Informatics Laboratory at the University of Milan, in collaboration with Audio Modeling
From a young age, I was captivated by the transformative power of music. However, as I delved deeper into the world of digital instruments, I began to notice a glaring gap in accessibility. The tools that are supposed to empower creativity often come with barriers that exclude many from fully engaging in music-making. This realization fueled my desire to create digital instruments that are not only innovative but also inclusive, designed to be accessible and usable by everyone, regardless of their physical abilities or technical expertise.
Through my research, I aim to challenge the status quo by designing instruments that are adaptable and intuitive, ensuring that the joy of music is within reach for all. To deepen my understanding and to extend my vision of accessibility, I joined MASSIG (the Music and Accessibility Special Interest Group). Being part of this community has allowed me to connect with musicians who face real challenges in their everyday interactions with digital instruments. These interactions have been invaluable in shaping my research, providing me with insights into the practical needs and problems that must be addressed.
My journey is driven by the belief that music should be a universal language, and to achieve that, the tools we create must be as inclusive as the art form itself. As I continue my work, I am committed to breaking down barriers and ensuring that every musician, regardless of their background or ability, has the opportunity to express themselves through digital music. Short bio: I’m Vanessa, and I am currently pursuing a Ph.D. at the Music Informatics Laboratory at the University of Milan, in collaboration with Audio Modeling. My passion for music and technology has led me to a research project that sits at the intersection of both fields: the ideation and development of accessible digital music instruments.
Emanuele Parravicini, co-founder and CTO of Audio Modeling
Emanuele Parravicini is the co-founder and CTO of Audio Modeling. He holds a degree in Telecommunications Engineering and a Master’s in Information Technology. With a deep passion for music and music technology, Emanuele is a strong advocate for inclusivity, particularly in the realm of musical technology.
At Audio Modeling, he leads the development of innovative products, including the UniMIDI Hub, a software platform designed to integrate various accessible devices. This platform enables real-time control of digital instruments or music production software by multiple users simultaneously, whether they are individuals with disabilities, students, amateurs, or professionals.
Emanuele’s commitment to making music technology accessible to everyone is the driving force behind his involvement in the MASSIG – Music Accessibility Standard Special Interest Group. His expertise and dedication to inclusivity in the field align perfectly with the group’s mission.
I am Emanuele Parravicini, co-founder and CTO of Audio Modeling. I hold a degree in Telecommunications Engineering and a Master’s in Information Technology. With a deep passion for both music and music technology, I am a strong advocate for inclusivity, especially within the realm of musical technology.
In my role at Audio Modeling, I oversee the development of our products, including one of our latest projects, the UniMIDI Hub. This software platform aims to integrate various accessible devices, allowing real-time control of digital instruments or music production software by multiple users at once, whether they are individuals with disabilities, students, amateurs, or professionals.
My commitment to making music technology accessible to everyone is what led me to join the MASSIG – Music Accessibility Standard Special Interest Group. I believe that inclusive technology can empower people to create and enjoy music, and I am dedicated to contributing to this important mission.
Emanuele Parravicini
Haim Kairy – CEO – Arcana Instruments
I am an engineer, software developer, maker, entrepreneur, and musician, currently serving as co-founder and CEO of Arcana Instruments. Throughout my professional career in technology startups and consulting for banks, energy, and insurance companies, I never found a true sense of purpose. Music has always been central to my life, playing, recording, producing, and performing. Without it, I would surely be lost.
Eight years ago, everything changed when my friend Boaz, a music-school teacher, showed me a video of Gil, a 12-year-old girl with cerebral palsy, trying to play the guitar. Unfamiliar with CP, I was moved by her passion and frustration as she struggled to control her hands and fingers. Realizing that no suitable instrument existed for her, we were heartbroken. However, seeing Gil controlling her motorized wheelchair with a joystick inspired us to create an accessible musical instrument she could play.
What began as a hackathon project for one girl evolved into a year of research, working with individuals aged 5 to 95 to develop the Arcana Strum, an inclusive, adaptable musical instrument. We raised capital from angel investors and government grants, left our day jobs, and fully committed to this mission.
Meeting hundreds of people with disabilities and making music accessible to them has made a huge impact on me. I’ve come to understand that a significant portion of the population is systemically excluded from learning and playing music, a body-brain-developing, spirit-lifting activity I once took for granted.
As part of the MIDI Association Special Interest Group, Arcana gains invaluable insights from musicians and professionals about music accessibility needs. We aim to contribute our resources, knowledge, and experience to create an inclusive and accessible music landscape for everyone.
Michael Strickland, Specialized Faculty in Music Technology at Florida State University
Michael Strickland is a passionate advocate for music education and accessibility, leveraging his expertise in music technology to create innovative learning and performance environments. As a member of the MIDI Association, he participates in the Music Education and Accessibility Special Interest Groups, where he promotes the use of MIDI as a tool for enhancing musical expression and inclusion. He is also a skilled performer, engineer, and researcher, with a diverse background in physics, music, and improvisation. He is honored to serve as a judge for the MIDI Innovation Awards and celebrate the creative potential of MIDI.
I have previously worked in the broadcast industry, delivering control and monitoring interfaces for visually impaired users and improving access to various production tools.
I know that improvements to a product’s accessibility improve workflows for all of its users, and that considering access to all elements of a product, for all users, at the earliest stage of design is the best way to deliver a product that works best for the widest audience. This is not just a social benefit; it is good business practice. Whilst engaging where necessary with product modifiers (such as screen readers), I much prefer to design products that work for each user’s ability. I consider what interface the users can use and how they can access the features of a product. That includes forgetting about the GUI for totally blind users, reconsidering tactile controls for users without physical reach or dexterity, designing workflows with appropriate levels of complexity for users with different cognitive abilities, etc.
All of these considerations, and more, should be fitted into as many of our products as we can. The core learning is that user interfaces may need to be significantly different for different users of the same product in order to provide a similar user experience.
I have done this in the past and strive to do so with the projects I work on now and in the future. It is this experience, skillset and aspiration to improve musicians’ experience that attracts me to MASSIG.
Design for all and all benefit.
Sam Prouse – Music Technologist
Samuel Prouse is a passionate music technologist whose journey with sound began in the vibrant 1980s, inspired by the iconic synths that defined the era. Growing up, he channelled this passion into building rigs, DJing, and live performances, initially as a hobby. In 2008, a deep love for music technology led to a bold career shift, prompting him to pursue formal education in the field. However, just before beginning these studies, life took an unexpected turn with the sudden loss of sight in his left eye, later leading to a diagnosis of multiple sclerosis. By the end of his BTEC studies, he was registered visually impaired, which brought unique challenges to his educational journey and professional aspirations.
Despite these obstacles, Samuel demonstrated remarkable resilience, earning a foundation degree, a BA Hons, and an MSc in Music Technology. Throughout this academic journey, Samuel explored the creative implications of device interfaces, both hardware and software, with a particular focus on visual accessibility. In 2021, when JUCE released version 6.1 with enhanced accessibility features, he developed skills in C++ and interface design, driven by a commitment to improving accessibility in music technology.
After graduating with distinction, Samuel’s PhD bursary proposal was accepted, allowing him to focus his research on universal design concepts within the music technology industry. As an active member of the Music Accessibility Standard Special Interest Group, Samuel has found a collaborative space to share innovative ideas and concepts, further strengthening his research. Being part of such a committed group has provided constant inspiration to develop new ways of interacting with music technology. Today, he is dedicated to pioneering new ways of interacting with music technology, ensuring that the industry evolves to be more inclusive and accessible for all.
Sam Prouse Highlights Music Accessibility on YouTube
Music Accessibility at Music China and ADC
We are planning a number of upcoming Music Accessibility events – one at Music China in October and another led by Jay Pocknell at the Audio Developer Conference in November.
Here is a promo video for the event at Music China and then a list of recent posts.
Ikutaro Kakehashi’s impact on music production is undeniable. He founded Roland and oversaw some of the most seminal MIDI products in history. When he passed away on April 1, 2017, we created an article about his life and times.
Today marks the seventh anniversary of his passing.
Just a week ago, Athan Billias, MIDI Association Exec Board member, was in Tokyo after meetings in Hamamatsu with the AMEI Piano Profile working group. He was able to catch up with Kakehashi’s son, Ikuo.
Ikuo updated The MIDI Association on what is going on with the Kakehashi Foundation.
The Kakehashi Foundation is a non-profit organization that contributes to the promotion and spread of art that applies electronic technology.
Message from the Chair
The Kakehashi Arts and Culture Foundation is a public interest incorporated foundation named after the surname of its founder, Ikutaro Kakehashi (founder of Roland Co., Ltd.). It was re-established with the desire to become a “kakehashi” (a bridge) for arts and culture that gives dreams and hopes to people in Japan and around the world.
Unfortunately, the promotion of art and culture in Japan lags behind other developed countries because of a culture that prioritizes the economy and short-term results. I would like to create a society where culture leads the economy. Our foundation will continue to bring joy and excitement to as many people as possible through electronic technology and electronic musical instruments, in accordance with our basic philosophy of promoting and disseminating artistic culture that applies electronic technology.
Specifically, we hold performances and lectures using electronic technology, conduct appropriate and effective support activities in the field of art and culture, and run educational development projects such as serving as the Japanese representative office for the Royal Musical Examinations. Through all of this, we would like to contribute to the spread of this technology. We will continue to carry out even more fulfilling activities in the hope of realizing a rich and creative society. We look forward to your continued support and cooperation.
Akiko Santo, Chairman of the Kakehashi Arts and Culture Foundation
Artware Hub in Shinjuku
MIDI Association member AVID has partnered with the Kakehashi Foundation and created a unique space for electronic music performances.
Shinjuku has a long history as one of Tokyo’s most vibrant areas, and Waseda is arguably the district’s cultural and creative nexus. The area is home to the renowned Waseda University, as well as numerous museums, theaters, night clubs, and performance spaces.
Waseda’s newest venue, the Artware hub, opened its doors in late 2019. Commissioned by the Kakehashi Foundation (Arts and Cultural Foundation), the space is the realization of a lifelong dream of Roland Corporation founder and visionary Ikutaro Kakehashi.
What appears at first glance to be a fairly typical live performance venue is, in fact, much more. Designed as an experimental acoustic space, the facility incorporates an unparalleled 36.8-channel immersive audio system built around an Avid VENUE | S6L live sound system and FLUX::Immersive’s Spat Revolution software engine.
We found a few favorites from MIDI Association members and friends we wanted to share with everyone.
The cover image above is from Jeff Rona who worked for Roland when the 808 first came out and was the very first President of the MIDI Association.
The YouTube video is from MIDI Association member Melodics and the photo and link at the bottom is of course from Roland and links to their Roland Cloud 808 page.
To find out more about this sweet ride, you’ll have to go over to Peter Kirn’s excellent 808 Day roundup to read about 1500 Academy’s auction for this one-of-a-kind Roland 808-inspired BMX bike.
One of the things that has always made MIDI unique in the world of standards is that no one owns MIDI and the MIDI Associations (AMEI in Japan and The MIDI Association in the rest of the world) don’t sell anything.
We (AMEI and The MIDI Association) get companies to volunteer their staff to work on really complex problems (like MIDI 2.0), work together to solve those problems and once those problems are solved, we give away the solutions and specifications for free so anyone can use them to make MIDI products.
MIDI is also unique because it enables all kinds of instruments – keyboards, drums, wind controllers, pad controllers, lights, and in fact anything that can be controlled digitally can be controlled by MIDI.
There really is no other standard that works quite the same way, and that is what makes MIDI so very special.
MIDI 2.0 meeting in Japan, March 27–29, 2023
A great example of this collaboration is the meeting that happened in Japan recently.
All of the major OS companies gathered to talk about details of handling MIDI 2.0 in operating systems. We understand that people are waiting (some not so patiently) for MIDI 2.0, but there was one thing that has been clear to both AMEI and The MIDI Association since the very beginning of discussions about MIDI 2.0: we couldn’t break MIDI 1.0!
The spirit of cooperation between these OS companies and musical instrument manufacturers was awe inspiring.
Companies set aside their differences and competitive natures; the goal of the meeting was to cooperate for the greater benefit of musicians around the world.
In a remarkable coincidence, the presentation that the OS companies made at the Audio Developer Conference 2022 was put up on YouTube by ADC on the second day of the meeting in Japan.
Audio Developer Conference 2022 video featuring Pete Brown, Phil Burk, Torrey Holbrook Walker & Mike Kent available
Engineers from Apple, Google, and Microsoft will present the current state of MIDI 2.0 implementations in their operating systems.
We’ll describe the API changes required for MIDI 2.0 for each platform as well as discuss the philosophy and reasoning behind various design decisions. We’ll also present the status of transports, such as USB and Ethernet.
If you’re a developer who is interested in the practical implementations of MIDI 2.0, this is the session for you.
Pete Brown
Pete works in the Windows + Devices org in Microsoft, primarily focusing on partners, apps, and technology for musicians. He’s the lead for the Windows MIDI Services project which is bringing an updated MIDI stack to Windows, and adding full MIDI 2.0 support.
He also serves as the current Chair of the Executive Board of the MIDI Association.
When not working, he enjoys synthesizers, electronics, woodworking, astrophotography, 3D printing, and CNC projects at his home on the east coast of the US.
Phil Burk
Music and audio software developer. Interested in compositional tools and techniques, synthesis, and real-time performance on Android. Worked on HMSL, JForth, 3DO, PortAudio, JSyn, WebDrum, ListenUp, Sony PS3, Syntona, ME3000, Android MIDI, AAudio, Oboe and MIDI 2.0.
Torrey Holbrook Walker
I am a senior software framework engineer on the Core Audio team at Apple and a frequent MIDI specification contributor and prototyper with the MIDI Association. I have been passionate about creating music production technologies that delight audio software developers, musicians, and music producers over my 16-year career with Apple.
You should talk to me about:
– MIDI 2.0 software implementations or USB MIDI 2.0 hardware support.
– CoreMIDI APIs, USB MIDI, or BLE MIDI.
– The best food in London.
– Analog high-end audio and vinyl.
– Autechre, Aphex Twin, Squarepusher, and any wonky music with a shedload of bass.
Mike Kent
Mike Kent is the Co-Founder and Chief Strategy Officer of AmeNote Inc. Mike is a world leader in technology for musical instruments and professional audio/video.
Mike is the Chair of the MIDI 2.0 Working Group of the MIDI Association. He is a co-author of USB MIDI 1.0, the principal architect of USB MIDI 2.0, and Chair of the MIDI Working Group of the USB Implementers Forum.
The MIDI Association and MIDI Association members were at CES in Las Vegas
The MIDI Association has had a long relationship with CES as an Affiliated Association.
CES helps to support organizations that maintain technical standards like MIDI.
At the 2023 CES show, there were a number of MIDI Association companies that also participated in the show.
In fact, PTZ Optics and Roland both won best in show prizes. Here is a quick wrap up of MIDI at the CES Show 2023.
Major Trends – Games In Cars
At the last Game Developers Conference, we hosted a panel on the possibilities of games in automobiles and at this CES, Reuters confirmed that this was indeed a growing industry trend.
At our MIDI Association booth, we featured a wide variety of MIDI controllers which drew a lot of attention.
BLE controllers like the Genki ring, the Oddball and the SOM-1 attracted interest because they showcase how small, yet powerful MIDI devices can be.
The Novation Launchpad and Embodme Erae Touch always intrigue people because of their unique designs and colorful flashing displays.
But perhaps the product that got the most attention was the Protozoa because of its Open Source design based on an affordable and available Pico chipset and of course because of its MIDI 2.0 capabilities.
During the show we were able to meet with tons of people who used MIDI every day and many developers who expressed interest in participating in the MIDI Innovation Awards 2023.
Analog Devices
Analog Devices MIDI 2.0 A2B Demo at CES 2023
Analog Devices was showing working prototypes of MIDI 2.0 over A2B. A four-piece band was connected together for both audio and MIDI by inexpensive cabling.
Embodme
Embodme was not only showing their Erae Touch MIDI Controller (which is already running MIDI 2.0 UMP messages internally), but a brand new technology called Super Iris targeted at the growing digital advertising market.
You can now interact safely with contact-free hovering technology. This is ideal for input devices used in medical, industrial, and public facilities applications such as automatic check-in machines, automatic ticket dispensers, vending machines, ATMs, and elevators. Prevent the spread of viruses while engaging customers to use Self Check-in Kiosks.
by Embodme
PTZ Optics
PTZOptics was showing off their MIDI controlled Pan Tilt Zoom cameras and also won a Best In Show award from VideoMaker for their PTZOptics Studio Pro.
Roland had a large, impressive booth at the very front of the Central Hall. They displayed a wide range of products, but we’ll focus on a few of the standouts.
Roland’s 50th Anniversary Concept Piano
Roland showed a 50th Anniversary Concept Piano complete with Drone Speakers!
Take online gaming sound to the next level with BRIDGE CAST, your all-in-one solution for premium livestream audio. This customizable desktop hub is packed with secret weapons to take out the competition, including dual sound mixes, vocal transformer effects, music playback, sound effects, and support for a broadcast-grade mic and headphones. In the heat of the battle, BRIDGE CAST ensures that your audio is always as epic as your gameplay.
by Roland
The Bridge Cast also won a Best In Show award from Videomaker.
Dave Smith was born in San Francisco in 1950 and, like Dave Rossum, grew up in the Bay Area in the 1950s.
He took piano lessons as a child and started playing bass and guitar in rock bands in high school because it was, after all, the 1960s in San Francisco.
When the record Switched-On Bach came out in 1968, Dave bought a copy and was intrigued by the sounds coming from the Moog modular synth.
Just like Don Buchla 10 years earlier, he went to college at the University of California, Berkeley, where he earned a degree in computer science and electrical engineering.
One of his college projects was a very primitive program to write music on a printer plotter.
After graduating he got a job in the Aerospace industry.
Yes, I was working in the aerospace industry. This was a time when nobody wanted to hire engineers. I was in what was to become Silicon Valley, but it was not quite Silicon Valley yet, so it was very early on in the technical revolution, I suppose you might say.
So I worked at Lockheed doing stupid work, because that was the only place I could get a job, and a friend told me he saw this synthesizer thing in a music store, and I said, "Oh, that sounds interesting."
So I went to look at it, and it was a Minimoog, and I had no idea what it did or how it worked. It just looked cool, and it was as kind of a perfect combination of my music background and technical background.
So the next day I went to the Lockheed Credit Union and got a loan and went back and bought it, and here I am.
by Dave Smith in a 2014 interview with Red Bull Academy
Interconnections -John Bowen, Bob Moog and Dave Smith
John Bowen and Bob Moog in Tokyo, Japan in 1973
John Bowen went to UC Berkeley (as did Don Buchla and Dave Smith) where he was introduced to Moog Synthesizers.
In 1972, he rented a Minimoog to learn synths from Pat Gleason of Different Fur Trading Company. He then went to a CES show and convinced Bob Moog that he was the right person to demonstrate Moog synthesizers, and so John moved to Buffalo, New York and became the first official Moog clinician in 1973.
In 1976 he met Dave Smith, and started working with Dave to promote his Model 800 sequencer, and then helped specify the Model 700 Programmer.
So through John Bowen there is a direct connection between Bob Moog and Dave Smith.
This would later be important in the development of MIDI.
John Bowen worked for Moog Music as a product clinician demoing music equipment for the majority of the 1970s. He later went to work with Dave Smith, helping with early Sequential Circuits products that provided limited programming and sequencing for the Minimoog, and making suggestions that influenced the design of the Prophet 5.
Sequential Circuits
MODEL 600 ANALOG SEQUENCER
1974
Dave bought his first synthesizer for $1500 (a Minimoog) in 1972 and immediately started to think about designing peripheral products to get more out of the Minimoog.
He bought books about electronic circuitry and microprocessors and studied the analog sequencers that Moog and Buchla had designed for their modular synths. Soon his hobby was becoming a business, and in 1974 he formed Sequential Circuits, a name that described exactly what he was building, and released the Model 600 Analog Sequencer, a 16-step sequencer using analog control voltages.
Sequential Circuits
MODEL 800 DIGITAL SEQUENCER
1975
Model 800 from Sequential website
Dave was gaining more and more experience with microprocessors and the Model 800 had the ability to record 16 banks of 16 sequences. You could input the steps in real time or in step time. The Model 800 didn’t make any sound so you needed a voltage controlled synthesizer to connect it to.
It was similar to the Oberheim DS2 Digital Sequencer that Tom Oberheim had released to control the ARP 2600 in 1972.
Sequential Circuits MODEL 700 PROGRAMMER
1976
Model 700 from Sequential website
Sequential Circuit’s next product was remarkable for a number of reasons.
First, it was all about controlling other products (like MIDI would do a few years later). The Model 700 didn’t make any sounds; it added programmability to either the Minimoog or the ARP 2600.
It was one of the first products that stored presets. It could store 64 programs (8 banks of 8 programs).
You could store the settings for attack, decay, sustain and release. There was also a built-in sequencer.
Also it’s important to note that the Model 700 was really starting to look like a Sequential Circuits product with the buttons, the knobs and design elements (like the white border around the core programming area) that would soon become famous with the release of the Prophet 5.
Sequential Circuits
Prophet 5
1977
Prophet 5 from GreatSynthesizers.com
Perhaps no other synthesizer had as much impact on the professional synthesizer business around the world as the Sequential Circuits Prophet 5, released in 1977.
It didn’t sell the most units: estimates range between 6,000 and 8,000 units across the three Prophet 5 variations created between 1977 and 1984. The Korg M1 holds the record for the best-selling synth of all time with over 300,000 sold, and the Yamaha DX7 comes in second.
But the Prophet 5 was the first product where a number of important factors came together.
It had 5 voices of polyphony and each voice had 2 VCOs, a VCF with ADSR and a VCA with ADSR.
Rev. 1 and Rev. 2 models had the SSM2040 filter chip. Rev. 3 (and higher) used the CEM3320 filter.
But what really set the Prophet 5 apart was that the whole synth was operated by a Z-80 microcomputer that controlled the keyboard scanning and voice assignment (under a patent licensed from Dave Rossum of EMU fame), the storage of sound presets (40 memories, and later 120) and the oscillator calibration that kept the oscillators in tune.
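For readers curious what “voice assignment” means in practice, here is a minimal, purely illustrative Python sketch of a generic round-robin allocator with oldest-note stealing, the kind of bookkeeping a polyphonic synth’s firmware performs. It is not the actual Prophet 5 code or Dave Rossum’s patented method, and all names in it are invented for the example.

```python
# Illustrative sketch only: a generic round-robin voice allocator of the kind a
# polyphonic synth's firmware might use. It is NOT the actual (patented)
# Prophet-5 algorithm; names and behavior here are assumptions for clarity.

from collections import OrderedDict

class VoiceAllocator:
    def __init__(self, num_voices=5):           # the Prophet 5 had 5 voices
        self.free = list(range(num_voices))      # voices not currently sounding
        self.active = OrderedDict()              # note -> voice, oldest first

    def note_on(self, note):
        if note in self.active:                  # re-trigger the same note on its voice
            voice = self.active.pop(note)
        elif self.free:
            voice = self.free.pop(0)             # take the next free voice
        else:
            _, voice = self.active.popitem(last=False)  # steal the oldest note's voice
        self.active[note] = voice
        return voice                             # firmware would now gate this voice's envelopes

    def note_off(self, note):
        voice = self.active.pop(note, None)
        if voice is not None:
            self.free.append(voice)              # release the voice for reuse
        return voice

# Example: playing a 6-note chord on a 5-voice synth steals the oldest voice.
alloc = VoiceAllocator()
for n in [60, 64, 67, 71, 74, 77]:
    print(n, "->", alloc.note_on(n))
```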
The Polymod section was designed by Sequential’s John Bowen who created all the Prophet 5 factory Presets and who was also instrumental in MIDI’s early development.
Prophet 5 Polymod section from Greatsynthesizers.com
If there was a single feature that defined the Prophet sound, it was the poly-mod section, which enabled you to use the filter envelope and OSC 2 to modulate the frequency of OSC 1, the pulse-width of OSC 1, and/or the filter cutoff frequency. These modulation routings, combined with OSC 1’s sync function, produced the trademark (and at one time hopelessly overused) oscillator sweeping sync sound, usually variations of what was originally factory preset 33.
The PolyMod sound is instantly identifiable on The Cars song “Let’s Go.”
The extended version of George Clinton’s Atomic Dog also shows off the Prophet 5 PolyMod sound. There are also two synth bass parts: one is a Minimoog and the other is a Prophet 5. The drum sound is a Roland TR-606 played in reverse!
Before the Prophet-5, synthesizers required users to adjust cables and knobs to change sounds, with no guarantee of exactly recreating a sound. The Prophet-5, with its ability to save sounds to patch memory, facilitated a move from synthesizers creating unpredictable sounds to producing “a standard package of familiar sounds”. According to MusicRadar, the Prophet-5 “changed the world – simple as that”. The Prophet-5 became a market leader and industry standard. The Cars keyboardist Greg Hawkes used the Prophet-5 for the band’s hits “Let’s Go” (1979) and “Shake It Up” (1981). Kraftwerk used it on their 1981 “Computer World” tour. David Sylvian used it on Japan’s 1982 hit single “Ghosts”, and Richard Barbieri of the same band has used it frequently. Michael Jackson used it extensively on Thriller (1982), and Madonna used it on Like a Virgin (1984). Peter Gabriel considered the Prophet-5 his “old warhorse” synthesizer, using it for many sounds on his 1986 album So. Brad Fiedel used a Prophet-10 to record the soundtrack for The Terminator (1984), and the filmmaker John Carpenter used both the Prophet-5 and Prophet-10 extensively for his soundtracks. The Greek composer Vangelis used the Prophet-5 and the Prophet-10, the latter for example in the soundtrack of Blade Runner (1982). The Prophet-5 was widely used by 1980s synth-pop acts such as Orchestral Manoeuvres in the Dark, Tears for Fears, Thompson Twins, Thomas Dolby, Devo, Eurythmics, Soft Cell, Vince Clarke and Pet Shop Boys. Radiohead used the Prophet-5 on their 2000 album Kid A, such as on the song “Everything In Its Right Place”. Other users include Giorgio Moroder, Tony Banks, Phil Collins, Tangerine Dream, Jean-Michel Jarre, Dr. Dre, Richard Wright, Rick Wakeman, Pendulum, BT, and John Harrison.
by Wikipedia
The intro to the video for Hall and Oates’ November 1981 release “I Can’t Go For That” looks like a Prophet 5 demo reel, with Hall playing all the intro parts on a Prophet 5 and even changing programs in real time.
It also features a drum part programmed by Daryl Hall on a Roland CompuRhythm CR-78.
This article focuses on the seminal products that Dave Smith, John Bowen and Sequential Circuits created before 1983.
The Prophet 5 was an important part of the build-up to MIDI, and we will see how the digital sequencers that Sequential Circuits was working on would lead directly to the need for the universal digital musical interface that we now call MIDI.
In Chapters 6 and 7 of the History of MIDI, we look at how Sequential Circuits, Kawai, Korg, Roland and Yamaha came together to create MIDI and the roles that Dave, John and both Tom Oberheim and Bob Moog had in creating important personal connections for MIDI.
Prophet 600
The first MIDI synthesizer
December 1983
Prophet 600 from Perfect Circuit Audio
Sequential Corporate History
Sequential Circuits
1974-1987
Sequential Circuits released the products listed above and also the following products-
Max (1984)
Six-Trak (1984)
Drumtraks (1984)
Multitrak (1985) (replaced the Six-Trak)
Split-8 (1985)
TOM (1985)
Prophet 2000 (1985–87)
Prophet-VS (1986–87)
Studio 440 (1987)
Dave Smith Division
Yamaha
1987-1989
Dave Smith was President of the Dave Smith Division of Yamaha based in San Jose.
Sequential Circuits ran into financial difficulties because they had invested heavily in both a product roadmap for sequencing software and the Prophet 2000 sampling technologies.
They met with Bryan Lanser (former MIDI Association Exec Board member) who was working for Otari at the time. Sequential thought that the Prophet 2000 sampling technology would be a really good fit for Otari to get into the hard disk recording business.
Pro Tools founders Evan Brooks and Peter Gotcher had expanded from just making EPROMs for Emu’s Drumulator and developed their Sound Designer program for the Macintosh, which worked with many other sampling keyboards, such as the E-mu Emax, Akai S900, Sequential Prophet 2000, Korg DSS-1, and Ensoniq Mirage. Thanks to the universal file specification subsequently developed by Brooks for version 1.5, Sound Designer files could be transferred via MIDI between sampling keyboards of different manufacturers.
But when the second meeting with Otari was scheduled, it was people from Yamaha who showed up instead. John Bowen recounts that when Dave and John first went to Hamamatsu to visit Yamaha in 1987 and saw stacks of Yamaha TX16W samplers, they both realized that Yamaha was probably not interested in Sequential’s sampler technology.
Dave and John (along with a team that included Alex Limberis from Ensoniq) worked for two years on physical modeling and softsynth projects, but never came out with a product under the Yamaha brand name.
Korg R&D
San Jose, Ca
1989-1994
In May 1989, Dave started the Korg R&D group in California, which went on to produce the innovative and commercially successful Wavestation synthesizer and other technology. We will tell the story of the creation of Korg R&D and the Wavestation in another installment of MIDI History.
Korg R&D released a number of spinoffs of the Wavestation and also started to work on the core technology for the Korg Oasys.
Seer Systems
San Jose
1994-2001
Smith reunited with Stanley Jungleib, who had worked with him at Sequential, and served as president of Seer Systems, which developed the world’s first software-based synthesizer running on a PC.
This synth was commissioned by Intel to prove the power of Intel CPUs. The second generation of Seer Systems software was licensed to Creative Labs in 1996 and used in Creative Labs’ AWE 64 line of soundcards, which were developed by Dave Rossum from Emu (Emu having been acquired by Creative Labs in 1993).
The third generation of Seer Systems software synthesizers was called Reality and was released in 1997.
Dave Smith Instruments
San Jose
2002-2015
In 2002, Smith launched Dave Smith Instruments, a manufacturer of electronic musical instruments.
Dave Smith Instruments released the following products:
Evolver (2002)
Poly Evolver (2005)
Mono Evolver (2006)
Prophet 08 (2007–16)
Mopho (2008)
Tetra (2009)
Tempest (2011) co-created with Roger Linn
Prophet 12 (2013)
Pro 2 (2014)
Sequential Circuits
2015-Present
In 2015, Smith regained the rights to the Sequential name from Yamaha, and released the Prophet-6 under that name.
Ikutaro Kakehashi, who had worked with Smith to create MIDI and was in failing health, reached out directly to the President of Yamaha, Tak Nakata.
Kakehashi said: “I feel that it’s important to get rid of unnecessary conflict among electronic musical instrument companies. That is exactly the spirit of MIDI. For this reason, I personally recommended that the President of Yamaha, Mr. Nakata, return the rights to the Sequential name to Dave Smith.”
Kakehashi passed away at age 87 in 2017.
Dave Smith Instruments was rebranded as Sequential in 2018.
On April 27, 2021, Sequential announced that it had been acquired by the British audio technology company Focusrite.
Dave Smith passed away on May 31, 2022, just a few days before his friend of many years, Tom Oberheim, launched the OB-X8 at the June 2022 NAMM show.
Sequential Products released since 2015
Prophet-6 (2015–present)
OB-6 (2015–present) (co-created with Tom Oberheim)
Prophet Rev2 (2017–present)
Prophet X (2018–present)
Pro 3 (2020–present)
Take 5 (2021–present)
Trigon (2022–present)
Sequential OB-6
Tom designed the VCO and VCF sections and Dave provided the arpeggiator/step sequencer, effects and production capabilities. After 30 years, two of the pioneers of modern synthesis were working together to design new products.
2022 – Oberheim Electronics reopened and Oberheim OB-X8 released
At the 2022 June NAMM show, Oberheim Electronics showcased the new OB-X8. What was supposed to be a joyous celebration was dampened by the news that just days before the June NAMM show, Dave Smith had passed away.
This is a picture from 2019 of Dave Smith, Tom Oberheim, Marcus Ryle (an engineer at Oberheim when still in his teens and founder of Line 6) and Roger Linn.
For more information about Dave Smith and Sequential, please see these excellent resources.
Sequential was founded and led by legendary instrument designer and Grammy-winner Dave Smith. In 1977 Dave designed the Sequential Circuits Prophet-5, the world’s first fully programmable polyphonic synth and the first musical instrument with an embedded microprocessor. Sequential released many innovative instruments and drum machines over the next 10 years.
Today, Sequential’s talented and dedicated team of designers and synth fanatics continue Dave’s legacy in accordance with the spirit of innovation and ingenuity Dave embodied and imparted during his lifetime.
Roger Linn was born in Whittier, California in 1955.
Roger learned to play guitar growing up in the 1960s, and when he was in high school, he started messing around with electronics.
While in high school I modified a fuzz tone product called the Foxx Tone Machine with some simple filters to make it sound more like real guitar amp distortion. I sold one to a little-known girl band at the time called Fanny. It apparently failed on stage, causing a news station to come through the P.A. Also during that time I had an after school job installing pickups and electronics in guitars, and had more switches in my guitar than an electronic voting machine in a California governor recall election.
by Roger Linn in an interview for Sonic State
Billboard promoting a Fanny concert at the Whiskey in the late 1970s
Career as a Songwriter and Guitarist
In 1976 at age 21, Roger went out on tour as a guitarist with the pianist/songwriter Leon Russell. That’s Roger on the left.
Roger Linn also wrote several hit songs including “Promises” for Eric Clapton in 1979 and “Quittin’ Time” penned with Robb Royer from Bread and covered by a number of people including Mary Chapin Carpenter.
The first sampled drum machine – the LM-1
Roger was doing well as a songwriter, but had a simple problem. He wanted to have realistic drum parts for his songwriting demos and there was nothing that sounded like a real drummer back then.
Plus he was on the road with Leon Russell and wanted to be able to work on songs in his hotel room.
So at age 22, he started working on the LM-1.
Two years later he was ready to announce the first drum machine to use samples: the LM-1. It is rumored that it was Steve Porcaro from Toto who suggested to Roger that he should look at using real samples in a drum machine.
He started a company called Moffett Electronics with Alex Moffett. The LM in the product name is for Linn and Moffett.
LM-1 Drum Computer
When I would do demos of the early LM–1 prototype, people’s jaws would drop. They were amazed to hear the sound of a real drum when they hit a button. So I knew I was on to something.
by Roger Linn in an interview with Reverb
Life and Love Leon Russell 1979
Leon Russell is probably not the first name you think of when mentioning synthesists of the 1970s.
But as we documented in our article about Tom Oberheim, Leon Russell had purchased an ARP 2600 from Tom when Oberheim was the ARP dealer in LA.
On his album “Life and Love” released in 1979, Leon was the first artist to use a sampled drum machine on a record.
In fact, Roger Linn not only played guitar on the record, he is credited as a co-producer because he wrote the drum parts on the LM-1. There is no drummer on the record at all; every drum part was done by the LM-1.
Credits for the album Life and Love.
Leon Russell – guitar, keyboards, piano, vocals – Producer – Written-By
Roger Linn – Co-producer – drums – Moffett Electronics – Engineer – Guitar
Marty Grebb – guitar, saxophone
Jody Payne, Roger Linn – guitar
Here is the title track from Leon Russell’s 1979 album.
Russell is also given credit by Roger Linn for adding two important features to the LM-1 that would become staples of almost all digital drum machines in the future: handclaps and a swing feature.
At first Roger didn’t think handclaps were necessary because you could record them so easily. Russell convinced Roger that after three choruses of hand claps, your hands would be pretty beat up so it would be better to have a consistent recorded sample.
In his NAMM Oral History interview below, Leon Russell talks about how he convinced Roger to add handclaps to the LM-1. Russell also talks about his admiration for Hal Chamberlin, who developed the Mellotron, and how Ikutaro Kakehashi, President of Roland, came to visit him in Tulsa, Oklahoma. It’s a surprising look at how into technology Leon Russell was.
It was Leon who taught me about swing timing, which he called ‘shuffle’. He explained that one of the big factors in a drummer’s feel was the degree of shuffle timing in his playing. Some drummers played straight sixteenths with a hint of shuffle.
I added the code to delay, by a variable amount, the alternate 1/8 or 1/16 notes, thereby turning a straight beat into a shuffle/swing 1/8 or 1/16 beat by an adjustable amount. This allowed me to dial in the exact groove I wanted.”
by Roger Linn in his Reverb interview
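To make Roger’s description concrete, here is a minimal sketch, in Python rather than 8-bit firmware, of the swing idea he describes: delaying every other 1/16-note step by an adjustable fraction of a step pair. The percentage convention (50% = straight, higher = more shuffle) is a common one on Linn-style machines, but the specific numbers and function name below are illustrative assumptions, not the LM-1 code.

```python
# A minimal sketch of the swing/shuffle idea Roger Linn describes: delay every
# other 1/16-note step by an adjustable fraction of the step pair. The 50% =
# straight / 66% = triplet-feel convention is assumed here for illustration.

def swung_step_times(num_steps, step_ms, swing_pct=54.0):
    """Return the playback time (in ms) of each 1/16-note step in a bar."""
    pair_ms = 2 * step_ms                       # a pair of steps: on-beat + off-beat
    times = []
    for i in range(num_steps):
        pair_start = (i // 2) * pair_ms         # start of this on-beat/off-beat pair
        if i % 2 == 0:
            times.append(pair_start)            # on-beat steps are never moved
        else:
            times.append(pair_start + pair_ms * swing_pct / 100.0)  # delay the off-beat
    return times

# 16 steps at 125 ms per 1/16 note (120 BPM). 50% is straight; higher values swing.
print(swung_step_times(16, 125.0, swing_pct=50.0)[:4])   # [0.0, 125.0, 250.0, 375.0]
print(swung_step_times(16, 125.0, swing_pct=62.0)[:4])   # [0.0, 155.0, 250.0, 405.0]
```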
Getting the LM-1 off the ground
Things weren’t easy at the beginning because Roger didn’t have experience producing products; he was a songwriter and guitarist. He borrowed $20,000 from his father to help fund the development, and when that ran out, he paid the person who designed the printed circuit boards for the LM-1 by giving him his 12-year-old Porsche.
He had a prototype designed, but the exterior case wasn’t ready yet, so Linn would haul the LM-1 around in a cardboard box to Hollywood parties, and he got a number of people, including Peter Gabriel, Fleetwood Mac, and Stevie Wonder, to put down 50% deposits ($2,500) on an LM-1.
Who recorded the samples for the LM-1
There were many stories about who recorded the samples for the LM-1, but in an August 15, 2017 interview for Reverb by Lou Carlozo, Roger Linn put all the rumours to rest.
It was a drummer named Art Wood, a good friend of mine with whom I had been in bands. It seems that no matter how many times I tell people that Art played the original drum samples, no one believes it, and they love to create myths. I’ve heard that the original samples were played by Steve Gadd, Jeff Porcaro, Dave Garibaldi, and others, but it was all Art.
by Roger Linn in an interview for Reverb
Art Wood played with Bette Midler, Tina Turner, Cher, James Brown, Gary Wright, Peter Frampton and Michael Penn
The LinnDrum (LM-2), released in 1982, was the less expensive version that had the most commercial success.
Roger Linn Corporate History
Linn Electronics – 1980-1986
Linn Electronics released the following products.
The LM-1 – 1980
The LM-2 LinnDrum – 1982
The Linn 9000 – Sampling drum machine with a 32-track MIDI hardware sequencer – 1984
The LinnSequencer – Rackmount 32-track MIDI hardware sequencer derived from the Linn 9000 – 1985
The LinnDrum MIDI Studio – A sampling drum machine, 32-track MIDI hardware sequencer and a pad controller – 1986
LinnDrum MIDI Studio Ad from 1986
AKAI MPC 60 and MPC3000
After Linn Electronics, Linn was contacted by the Japanese company Akai and worked to design the MPC60, an integrated digital sampling drum machine and MIDI sequencer released in 1988.
The MPC60 was followed by the MPC60 MkII and the MPC3000.
All the MPC products had a major influence on the development of hip hop and electronic music and the 4×4 grid of pads was adopted by numerous manufacturers.
Linn left Akai when the company went out of business in 2005.
AKAI MPC60 by Roger Linn – 1988
AKAI MPC3000 by Roger Linn – 1994
Roger Linn Design
In 2001 Roger Linn founded a new company, Roger Linn Design. With the help of Dave Smith and Tom Oberheim, Roger developed the company’s first product offering – the AdrenaLinn, a digital multi-effects unit combined with a drum machine and amp modeler.
The AdrenaLinn Series of Products
Select one of the 200 presets, play simple chords or arpeggios in time to one of the drumbeats, and listen as AdrenaLinn III transforms them into looped rhythmic patterns of spiked, swooped, swirled, chopped and mangled tonal variations, inspiring you to new and unexpected compositional ideas. Or simply use a tremolo, flanger or delay that moves in perfect sync to the beat. All the modulation effects you know and love now sync to you. The built-in compression keeps the filter peaks in check, and reverb tops it off.
It’s a sequenceable filter, it’s an amp modeller, and it’s a drum machine… it’s the AdrenaLinn, the new guitar processor from famed designer Roger Linn, best known for his classic drum machines and sequencing workstations.
Roger Linn and Dave Smith had always remained friends, often sharing booth space and traveling to trade shows together. In 2011, they worked together to release the Tempest, a product that combined Dave Smith’s vast knowledge of analog synthesis with Roger’s passion for drum machines, beat-synced effects and sequencing to bring the world a new take on what a drum machine could be.
The LinnStrument – Roger Linn and Geert Bevin – 2015
The Linnstrument
At the Winter NAMM show in 2014, Roger introduced his newest design, simply called the LinnStrument.
Roger had teamed up with Geert Bevin, who has been a major proponent of MPE (MIDI Polyphonic Expression) and is also a member of the MIDI Association’s Technical Standards Board.
Roger was a guitarist and had always been frustrated with the limitations of MIDI and the fact that MIDI did not seem capable of expressing all the subtleties of expression that were possible with six strings. Geert had already worked on many expressive controllers, like the Eigenharp.
Together their goal was nothing short of completely re-inventing the MIDI controller.
MPE created a way to get around some of MIDI 1.0’s limitations by using channel rotation so that each note could be assigned its own channel, allowing channel messages like Pitch Bend and Cutoff to be controlled independently for each voice on the LinnStrument.
In fact, Geert has described MPE as the bridge between MIDI 1.0 and MIDI 2.0 (MIDI 2.0 has Per Note Controllers and Per Note Pitch Bend).
The LinnStrument is a velocity-sensitive, grid-based MIDI controller that senses three dimensions per finger, polyphonically. It is available in two sizes: 8 by 25 (the original LinnStrument) and 8 by 16 (the slightly smaller LinnStrument 128).
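To illustrate how MPE’s channel rotation works in practice, here is a minimal Python sketch of a sender that gives each new note its own member channel and addresses per-note Pitch Bend and CC74 messages to that channel. It illustrates the general MPE technique, not the LinnStrument firmware; the class and function names are invented for the example, and the channel layout assumes a single MPE lower zone.

```python
# A minimal sketch (not LinnStrument firmware) of the MPE channel-rotation idea:
# each new note is assigned its own MIDI channel from the "member" channel pool
# (channels 2-16 in an assumed MPE lower zone), so per-note Pitch Bend and CC74
# ("timbre"/cutoff) messages affect only that note.

class MPESender:
    MEMBER_CHANNELS = list(range(1, 16))         # 0-based channels 1..15 = MIDI ch 2..16

    def __init__(self, send):
        self.send = send                         # callable taking raw MIDI bytes
        self.next_idx = 0
        self.note_channel = {}                   # note -> assigned member channel

    def note_on(self, note, velocity):
        ch = self.MEMBER_CHANNELS[self.next_idx]
        self.next_idx = (self.next_idx + 1) % len(self.MEMBER_CHANNELS)
        self.note_channel[note] = ch
        self.send(bytes([0x90 | ch, note, velocity]))          # Note On, this note's channel

    def bend(self, note, value14):               # value14: 0..16383, 8192 = center
        ch = self.note_channel[note]
        self.send(bytes([0xE0 | ch, value14 & 0x7F, value14 >> 7]))  # per-note Pitch Bend

    def timbre(self, note, value):               # CC74, the MPE "third dimension"
        ch = self.note_channel[note]
        self.send(bytes([0xB0 | ch, 74, value]))

    def note_off(self, note):
        ch = self.note_channel.pop(note)
        self.send(bytes([0x80 | ch, note, 0]))

# Example: two overlapping notes, each bent independently on its own channel.
out = MPESender(send=lambda msg: print(msg.hex(" ")))
out.note_on(60, 100)
out.note_on(64, 90)
out.bend(60, 9000)                               # bends only the first note
out.note_off(60)
out.note_off(64)
```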
Linnstrument firmware engineer and MIDI Association TSB member Geert Bevin
For more information about Roger Linn and Roger Linn Design, please see these excellent resources.
On May 26, we held the very first MIDI Live! chat with a panel of MPE specialists. We recorded the session and it is here as a podcast. Roger Linn demoed the Linnstrument for us.
Roger Linn forever changed the way people dance! As the inventor of the electronic Linn drum machine, he ushered in the new wave of electronic dance music beginning in the 1980s. The Linn drum machine also brought new meaning to the term “re-mix” and opened up a new era of sampling for club dj’s around the world. Roger worked closely with David Smith and others in the early efforts of MIDI in the 1980s. Roger has since created a host of successful products, including the AdrenaLinn guitar synth, which uses modern technology to bring some of his early concepts into the digital age.
Leon Russell was the noted musician and songwriter who contributed greatly to popular and rock music during his long career. As a studio musician, Leon was active in the development of the Linn Drum Machine having provided Roger Linn with several ideas to create new sounds, such as the hand clap and longer loops. He recorded extensively with other artists and on his own as a solo performer. He was also an award-winning songwriter who penned classic hits for himself as well as many other artists.
Roger Linn’s name may be forever associated with the birth of the first serious, sample-based drum machine in 1979, but now, after years away from instrument design, he’s back with a new guitar-oriented product offering rhythmic filter effects.
Previously dismissed as toys, drum machines soon had sticksmen running scared after the arrival of these two credible, powerful instruments in the early 80s
Tom Oberheim was born in Manhattan, Kansas in 1936.
In junior high school, he started building HiFi amplifiers for friends probably based on the same articles in Popular Mechanics that his contemporaries Bob Moog (1934) and Don Buchla (1937) were reading.
He was also listening to a lot of Jazz music and when he read an ad in Downbeat Magazine about an LA jazz club you could get into at no charge, he made the decision to save enough money to go to LA.
He arrived in LA in July of 1956 with $10 in his pocket.
Of course, he needed to find a job so he applied to be an apprentice draftsman at National Cash Register (NCR). NCR was a very large company at the time (but would soon become the Blockbuster Video of its era).
NCR was experimenting with a brand new technology: computers. Of course, at the time a computer was not something you put on your lap; it was a room full of large, bulky devices. But computers fascinated Tom and he continued to find jobs in the fledgling computer industry in Los Angeles. Tom Oberheim held a variety of part-time jobs in LA between 1956 and 1969.
In 1966 he found a job where they would let him work part time and go to school, so he enrolled in UCLA.
At UCLA, he started studying physics (like Don Buchla) and he would also spend time in the music department (like Dave Rossum).
That is where he met two people who would change the course of his life.
Joseph Byrd and Dorothy Moskowitz were in a band called “The United States of America” which had made a couple of records. When one of the band members quit and took his Ring Modulator with him, Dorothy asked Tom to make the band a new one.
The experience of designing a Ring Modulator and seeing how musicians used it to create unique tones led to Tom’s fascination with manipulating sound.
Music Modulator & RM-1
The first commercial product that Tom produced was the Music Modulator which was the result of combining Tom’s experiments with ring modulators with other circuitry to create a device that could be used in live performance. This product was used by a number of contemporary musicians including innovative trumpeter Don Ellis, and experimental composer Richard Grayson, who played the units with Tom in improvised live performance concerts. Tom was asked by the acclaimed film composer Leonard Rosenman to provide a ring modulator for an upcoming movie Beneath the Planet of the Apes (1970).
The Music Modulator was also sold by the Chicago Musical Instrument Company (later Norlin Corporation) under the Maestro name as the RM-1.
by Tom Oberheim Website
By 1969, Tom Oberheim was doing contract work designing effects processors for Maestro like the Maestro PS-1A Phase Shifter which was inspired by the Leslie speaker for Hammond organs.
Tom Oberheim Designed Maestro Phase Shifter
Having gotten the bug for the musical instrument industry, at the 1971 NAMM show Tom went to the ARP booth and asked Alan Pearlman if he could become the company’s first Los Angeles dealer for the ARP 2600.
Tom became very familiar with the ARP 2600 and 2500 synthesizers selling them to musicians like Leon Russell, Robert Lamm, and Frank Zappa. It is amazing how all the founders of the modern music production environment were interconnected and we will see more of those interconnections soon.
Tom realized that the ARP 2500, though more difficult to use than the 2600, could play two notes at the same time, which the ARP 2600 could not do, so he designed a modification to give the ARP 2600 two-note polyphony. In those days of modular synthesis, two notes of polyphony was a really big deal.
Tom Oberheim’s practical experience with computer programming led to his next breakthrough product.
Oberheim DS2 Digital Sequencer
The DS2 Digital Sequencer was released in 1973 and was one of the very first digital music sequencers. As it didn’t make any sounds, it was a peripheral to existing modular synths, in particular the ARP 2600 that Tom Oberheim was distributing in LA.
Tom Oberheim’s next product the SEM (Synthesizer Expansion Module) would be significant for several reasons.
It was the first Oberheim product to make its own sounds (not process sounds from another source).
It was also the first product to bear the Oberheim name.
Finally it was the first product that was a collaboration with another of the founders of the modern music production environment, Dave Rossum.
All of the pieces were starting to come together.
Using Dave Rossum’s technology for polyphonic keyboard scanning and Tom’s computer programming experience, Oberheim could now build portable polyphonic synthesizers with different modules for different purposes that would work together.
By 1977, Oberheim had released the Oberheim Two Voice, the Oberheim Four Voice and the optional Programmer unit.
You could buy a Two Voice or Four Voice synth with a 2-channel voltage-controlled sequencer, and if you needed preset storage for use on stage, you could add a Programmer module that would digitally store 16 different sound settings.
Finally there was a company offering keyboard players a keyboard, a programmable polyphonic synthesizer with preset program storage and a sequencer, all in one product.
But the design was bulky, and each expander module had to be programmed separately unless you had the Programmer unit, which could only store 16 sounds.
The price was also pretty steep, coming in at $5700 ($28,000 in today’s dollars!) with the Programmer.
Plus you needed a mixer, because each of the expander modules had a separate audio out.
Soon the competition (Sequential Circuits) would catch up and leapfrog everyone with the microprocessor-driven, 5-voice polyphonic, programmable Prophet-5.
In ten short years, that same concept would add MIDI, ROM based samples and effects and be called the Korg M1 Music Workstation, but that is a story for another day.
Pictured below is an Oberheim Four Voice with the Programmer module from the Horniman Museum.
By Oberheim_Four_Voice_at_Horniman_Museum.jpg: Robert Brookderivative work: Shoulder-synth (talk) – Oberheim_Four_Voice_at_Horniman_Museum.jpg, CC BY 2.0, https://commons.wikimedia.org/w/index.php?curid=7509844
The Oberheim System (a precursor to MIDI)
In Chapter 5 of the History of MIDI, we look at the precursors to MIDI including Control Voltages and Gates, Roland’s Digital Control Bus (DCB), and the Oberheim Parallel Bus introduced in 1980.
The Oberheim 37-pin D-SUB connector that allowed the OB-8, the DMX Drum Machine and the DSX Sequencer to connect together into the Oberheim System was one of the inspirations for MIDI, and Tom Oberheim was the first person that Ikutaro Kakehashi from Roland reached out to about his dream of a universal digital musical instrument interface.
Only three short years after the Oberheim System was launched, Sequential Circuits and Roland would demonstrate MIDI at the 1983 NAMM show.
Oberheim Corporate History
Oberheim Electronics -1969-1985
Oberheim Electronics released the products listed above as well as the Matrix 12 and Matrix 6 in the early 1980s.
ECC Oberheim & Gibson buys ECC Oberheim -1985
By May 1985, Oberheim Electronics was struggling and the company was taken over by Oberheim’s ex-lawyer, who promptly sold the company to the Gibson Guitar Company.
Two years later, Tom Oberheim departed the company he had founded and filed a lawsuit against his ex-lawyer for legal malpractice.
Marion Systems -1987-2000
In 1987, Oberheim formed Marion Systems (named after his daughter Emily Marion) and did consulting work for Roland and Akai. The company also released the Marion Systems MSR-2, a modular synthesizer concept.
Many musicians recording with their computer find that they need to buy — and wire up — separate mic preamps, guitar preamps and monitor mixers as well as their soundcard. Martin Walker tries out a product that aims to combine all of these elements in one system.
Tom Oberheim returned to hand-building and selling updated SEM synthesizers, but at the time the Oberheim name was still owned by Gibson.
2016- Dave Smith and Tom Oberheim collaborate on the OB-6
Sequential OB-6
Tom designed the VCO and VCF sections and Dave provided the arpeggiator/step sequencer, effects and production capabilities. After 30 years, two of the pioneers of modern synthesis were working together to design new products.
2019 - Gibson returns the Oberheim Electronics name and other intellectual properties to Tom Oberheim
2022 -Oberheim Electronics reopened and Oberheim OB-X8 released
At the 2022 June NAMM show, Oberheim Electronics showcased the new OB-X8. What was supposed to be a joyous celebration was dampened by the news that just days before the NAMM show, Dave Smith had passed away. This is a picture from 2019 of Dave Smith, Tom Oberheim, Marcus Ryle (an engineer at Oberheim when still in his teens and founder of Line 6) and Roger Linn.
For more information about Tom Oberheim and Oberheim Electronics, please see these excellent resources.
Tom Oberheim is the inventor of the first polyphonic music synthesizer, who played a vital role in the establishment of MIDI standards back in the early 1980s. The Oberheim Company created a long list of innovative products, which remain sought-after as vintage instruments including the Oberheim 4 and 8 Voice instruments. In 1987, after the sale of the Oberheim Company, Tom established Marion Systems and later Seasound. In recent years Tom reissued one of his first products, an analog synthesizer he calls SEM, the Synthesizer Expander Module.
Marcus Ryle began his career in the industry as an engineer for Oberheim while he was still a teenager. In those pre MIDI days, Marcus designed a series of sound controllers as well as providing several modifications to the company’s synthesizers. He later co-founded Fast Forward in 1985 with Michel Doidic, which provided engineering for over 40 of Alesis’ products including ADAT. At the same time, Marcus and his team worked on their own guitar electronics, which eventually became the foundation of their new company, Line 6.
Acknowledgement of the people who made these articles possible
Before we dive into the history of the creation of MIDI, we wanted to acknowledge the key people who made this official history of the birth of MIDI possible.
These are people who were directly involved with the creation of MIDI in its early days. Some of these people have never been appropriately acknowledged for their contributions and that is one of the reasons for creating this detailed history of how MIDI came about.
It also became clear in the two years of intense research that went into this article that MIDI was always about connections, and not just connections between products, but more importantly connections between people. There is the well-known phrase “it takes a village” and MIDI is a great example of how that is true. MIDI was not developed by two people or two companies; it was a group of individuals with different backgrounds and motivations who came together to do something for the greater good.
That said, we would be remiss if we did not acknowledge several key people who contributed first person resources and interviews to this article.
John Bowen– Head of sound design for Sequential Circuits, Korg R and D and now President of John Bowen Synth Design, maker of the Solaris synthesizer.
John provided insight into the interconnections between Moog and Sequential as well as multiple interviews on the early days of MIDI.
Hideki Izuchi– Roland Engineer
Izuchi-san created a report about the birth of MIDI that was presented in 2018 at the Japan National Museum of Science in Ueno Park, Tokyo. His incredibly detailed 72 page report entitled “Technical Systematic Survey on MIDI” provided much of the information for this article and confirmed many details from the Japanese side about the development of MIDI.
Jeff Rona– Composer and Founder and first President of the MIDI Manufacturers Association (MMA, now known simply as the MIDI Association)
Jeff provided the MIDI Association access to the very earliest communications of the MIDI Association in its infancy, and of course as the very first President of the MIDI Manufacturers Association his contribution (along with Chris Meyer, who worked at both Sequential Circuits and Roland and was the first chair of the MMA Technical Standards Board) guided MIDI through its formative and frankly most turbulent early years.
In researching this article, we realized that the simple story that Roland and Sequential connected a Prophet 600 and a Jupiter 6 together at the 1983 NAMM show and that instantly MIDI became an overnight success was very far from the reality of what happened between January 1983 and May of 1985, when the MIDI Manufacturers Association was formally created as a non-profit trade association.
Brian Vincek– Vice President at Hewlett-Packard and co-founder of the International MIDI Association
Brian provided access to all of the early files of the International MIDI Association (IMA). Brian and John worked closely in the very early days of MIDI and the IMA was responsible for the distribution of the initial MIDI specification to both MMA companies and individuals. His contributions to the very early days of MIDI were substantial and the files he provided for this article were invaluable.
Many other people and organizations including NAMM and AMEI provided access to their files and we thank everyone who contributed.
Thanks to those contributions, we believe that this article is the official definitive history of how MIDI got started.
Dave Smith, Bob Moog, Ikutaro Kakehachi, Tom Oberheim
NAMM International Music & Sound Expo
Chicago, Illinois
June 27-30 1981
Ikutaro Kakehashi, President of Roland, first approached Tom Oberheim at the June 1981 Chicago NAMM show about the idea for an international industry standard for communication between synthesizers, because Oberheim already had the “Oberheim System”, a proprietary system for connecting the Oberheim DSX sequencer, Oberheim DMX drum machine and either OB-8 or OB-Xa synthesizers.
Tom recommends that the Japanese contact Dave Smith from Sequential Circuits as well.
Sequential and Oberheim meet
Los Angeles
late summer 1981
Dave Smith first came down to Oberheim’s office in late summer of 1981 to preview his AES paper to Tom and me and discuss if it could be something Oberheim was interested in.
Although we had a parallel interface solution, we were interested in something that could be more affordable and more universal.
The next meeting Dave and I had was with the Japanese manufacturers during the Gakki fair in Tokyo in mid-October.
This was also before the AES show, and the USI proposal had not yet been released, so our meeting was in secret.
At that meeting there were good discussions about the speed (several of us of the opinion that 19.2Kbaud was too slow, etc.), the data messaging format, connector choices (1/4″ vs DIN vs XLR), and ground loop solutions.
by Marcus Ryle
Gakki Fair
Tokyo, Japan
October 15 1981
Dave Smith’s Universal Synthesizer Interface (USI) proposal is discussed at Japan’s Gakki Fair.
Dave Smith, Tom Oberheim and Marcus Ryle met and discussed the proposal with Roland, Yamaha, Korg and Kawai and shared ideas and suggestions.
The Japanese companies agreed to meet about once a month to respond to and improve on Dave Smith’s initial design.
The 2nd Synthesizer Interface Conference
Tokyo, Japan
October 24, 1981
Mieda-san from Korg responds on behalf of the Japanese companies to the meetings at the 1981 Gakki Fair with Sequential and Oberheim and confirms the discussions they had at the Gakki Fair:
19.2kbps is too slow
¼” Jacks will have ground loop problems
There is no concept of synchronization, clock or the ability to start and stop sequences
The response is sent to Dave Smith of Sequential and forwarded to Tom Oberheim and Marcus Ryle.
Audio Engineering Society
Los Angeles
October 30, 1981
Dave Smith and Chet Wood’s AES Proposal for a Universal Synthesizer Interface
The Universal Synthesizer Interface is a specification designed to enable inter-connecting synthesizers, sequencers and home computers with an industry-wide standard interface. This is a preliminary specification; comments, criticism, and alternative proposals are welcome. This interface specification has not been tested and would need to be retrofitted to any equipment presently in the field. The interface is basically specified as one-to-one between two units; ie, a synthesizer and a sequencer. Under certain circumstances, however, more units may be placed on a single line.
Authors: Smith, Dave; Wood, Chet. Affiliation: Sequential Circuits, Inc., San Jose, CA. AES Convention: 70 (October 1981). Paper Number: 1845. Publication Date: October 1, 1981. Subject: Electronic Music and Musical Instruments
by Smith, Dave; Wood, Chet
Dave Smith and Chet Wood present a Universal Synthesizer Interface running at 19.2 kBaud, using regular 1/4″ phone jacks, but there are already ongoing discussions about data speed, data format and different choices for connectors.
The 3rd Synthesizer Interface Conference
Tokyo, Japan
December 24, 1981
The Japanese companies propose some significant changes to USI.
Using a 5 PIN DIN cable or XLR cables (a locking cable was better for on stage use)
Adding a UART with grounding to prevent noise (design by Karl Hirano from Yamaha)
The Yamaha proposal shown in Fig. 3.6 was submitted by Katsuhiko Hirano, who was then Chief of LM Design at Nippon Gakki Co., Ltd. Tetsuo Nishimoto from the same department was in charge of drafting the plan.
The idea is to use an isolator to isolate the ground. A photocoupler is used for the circuit.
Figure 3.6
Roland proposed a slight modification of the hardware design and the use of a 5-pin DIN connector, which Roland was already using for DIN Sync, although at the time 5-pin DIN connectors were still rare in the US.
Fig. 3.9 Roland’s hardware proposal
Tadao Kikumoto, General Manager of the Roland Osaka Technical Center made suggestions for Tempo, Start, Stop, Forward and Backward messages and also introduced the concepts of Running Status, and Status Bytes and Data Bytes.
All of these were significant additions to the evolving MIDI Specification.
Figure 3.10
Roland’s proposal for timing information and channel messages
An alternative specification was presented by some of the Japanese companies which had grown out of their own research.
Whereas the USI was basically content to specify note on/off codes, this new proposal went on to define many more complex operations. It also offered a different data structure, with status and data bytes being flagged by bit 7 (1=status, 0=data). This greatly simplified the protocol by eliminating all the checks which were otherwise needed to distinguish the data category. With the most significant bit now defined as a “flag,” data is thereby limited to 7 bits, but this is sufficient for most synth data, and when not, can simply be sent as multiple 4-bit nibbles.
by Stanley Jungleib in the Preliminary Prophet 600 Manual
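A small Python sketch, illustrative only, shows both of these ideas together: bit 7 marks a byte as status or data, and Running Status lets a sender drop repeated status bytes from the stream. The helper function names are our own, not anything from the specification.

```python
# Illustrative sketch: bit 7 distinguishes status bytes (1xxxxxxx) from data
# bytes (0xxxxxxx), and Running Status omits a status byte when it repeats.

def is_status(byte):
    return (byte & 0x80) != 0

def encode_with_running_status(messages):
    """messages: list of (status, data1, data2) tuples -> flat byte stream."""
    stream, last_status = [], None
    for status, d1, d2 in messages:
        if status != last_status:         # only send the status byte when it changes
            stream.append(status)
            last_status = status
        stream.extend([d1, d2])
    return stream

# Three Note On messages on channel 1 (status 0x90) need only one status byte:
stream = encode_with_running_status([(0x90, 60, 100), (0x90, 64, 100), (0x90, 67, 100)])
print([hex(b) for b in stream])        # ['0x90', '0x3c', '0x64', '0x40', '0x64', '0x43', '0x64']
print([is_status(b) for b in stream])  # [True, False, False, False, False, False, False]
```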
MIDI Gets Its Name
Musical Instrument Digital Interface
At around the same time, there were discussions about the name of the new standard. There was concern that Universal Synthesizer Interface might cause some antitrust problems. The Japanese suggested UMI, Universal Music Interface (pronounced “you-me”), and Dave Smith countered with the name we all know today: MIDI, or Musical Instrument Digital Interface.
Winter NAMM
Anaheim, California
Feb 5-7 1982
At the NAMM show in January 1982, a meeting was held with a number of synth manufacturers (Sequential, Roland, Oberheim, CBS/Rhodes, Yamaha, E-mu, Korg, Music Technology Inc., Kawai, Octave Plateau, Passport Designs, Syntauri and some others), but the companies could not agree on anything. The meeting did not go well. It was the American and European manufacturers who couldn’t agree: some wanted expensive, high-data-rate connectivity, others didn’t even see the point of an interface, and so no consensus was reached and the meeting ended.
But after the meeting, Kakehashi-san from Roland sent Sakai-san to talk to Dave Smith encouraging him not to give up.
Dave explained it all in this 1997 video.
After incorporating changes in response to comments from AES, Smith sent a questionnaire to all manufacturers and industry consultants he could find, asking for their suggestions and any special requirements. There was a strong response to this initiative; some saying, for example, that it would not be possible to do it serially, that a parallel interface was necessary. Others thought the proposed serial speed too fast for operation with home computers. Many other issues were raised.
All respondents were invited to a conference in coincidence with the January, 1982 Western National Association of Music Merchants (NAMM) convention in Anaheim. This meeting was attended by representatives from SCI, Roland, Oberheim, CBS/Rhodes, Yamaha, E-mu, Unicord (Korg), Music Technology Inc., Kawai, Octave Plateau, Passport Designs and Syntauri.
Other manufacturers seemed to be maintaining a “wait-and-see” policy.
by Stanley Jungleib, Prophet 600 Preliminary Manual
Fax Communications between Sequential and Japanese companies
July 23, 1982
Sequential was planning on polyphonic synths with more polyphony. The original USI spec had 8 channels and at the time a channel equaled a monophonic voice. So originally the MIDI spec would have only allowed for 8 notes of polyphony.
Dave didn’t really want to tell the Japanese exactly what he was working on, but made some suggestions for improving the spec.
On July 23, 1982, a fax was sent from Roland (serving as liaison for the Japanese companies) to Sequential Circuits agreeing to:
Distinguish between polyphonic and monophonic content
Increase the number of channels to 16
The Japanese companies agree to every channel being either monophonic or polyphonic, and the number of channels is increased to 16. MIDI is finally ready for prime time.
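One way to see why the number landed on 16 is that a channel-voice status byte only has four bits to spare for the channel once bit 7 and the message-type bits are used. Here is a short illustrative Python sketch (our own, not taken from any specification document):

```python
# Illustrative sketch: a channel-voice status byte packs the message type in the
# high nibble and the channel in the low nibble, so 4 bits = 16 possible channels.

NOTE_ON = 0x90

def channel_status(message_type, channel):
    """channel is given as 1-16, as musicians count; stored as 0-15 on the wire."""
    assert 1 <= channel <= 16
    return message_type | (channel - 1)

print(hex(channel_status(NOTE_ON, 1)))    # 0x90 -> Note On, channel 1
print(hex(channel_status(NOTE_ON, 16)))   # 0x9f -> Note On, channel 16
```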
Jeff Rona starts working for Roland on MIDI
Summer of 1982
I was at a local music store in Hollywood and struck up a casual conversation with a couple of guys from Roland who happened to be there at the time. When I told them what I was doing with synths and desktop computers, they got very excited. Within a couple days I found myself in the office of Tom Beckman, the president of Roland US, explaining my work and background. When he asked me if I wanted a job and could I write code for music software, I lied, basically, and said yes. I became a programmer and instrument designer for Roland that day.
by Jeff Rona, MIDI from the Inside
Brian Vincik connects with Sequential
Summer of 1982
Brian Vincik, an engineer from Hewlett-Packard, starts discussions with John Bowen and Sequential about MIDI and starts to form ideas about the International MIDI Association, an organization to promote the idea of MIDI to companies and the public.
Bob Moog publicly announced MIDI in Keyboard Magazine
November, 1982
In a November 1982 cover interview with Keyboard magazine, Robert Moog announced publicly that Smith and Wood had been working on an invention: MIDI.
What is MIDI? Where did it come from? We offer an overview of MIDI’s early history and its evolution, including discussions of ZIPI, MPDL, OSC, MPE, and MIDI 2.0.
Brian Vincik provided the MIDI Association with many files, and one of the most interesting is the Prophet 600 Preliminary Manual written in December of 1982. It included a MIDI History that was deleted from the actual manual that was shipped with the product.
It is included here both as a PDF and as text which is easier to read.
MIDI HISTORY
Stanley Jungleib, SCI
Introduction
The Musical Instrument Digital Interface (MIDI) is a specification which enables manufacturers to design equipment that is basically compatible. This is most beneficial for the owner, whose equipment is thereby protected from obsolescence. As MIDI-compatible equipment is introduced, one will be able to freely choose keyboards, sequencers, and rhythm units from a variety of manufacturers with confidence that they will work together as one programmable system through which complete pieces can be composed and realized.
The problem of instrument compatibility is not new. It can be probably said of any two keyboards, that someone has desired if not actually tried to interconnect them. Keyboard couplers were developed for both pipe organs and harpsichords. In the heyday of electric organ technology this interest occasionally led to the installation of thick cables for wiring keyboards in parallel. The first synthesizers were easier to interface, because of the nature of modular equipment. However modules from different manufacturers might have incompatible control voltage, trigger, gate, and output levels or polarities. These differences have been promulgated in scores of synthesizer, keyboard, and effect devices, ultimately giving rise to an entire industry devoted to modifications and interfacing. And though they provide the best opportunity for interface so far, even microcomputer-based synthesizer equipment has been developed along independent, incompatible lines.
Like many other defacto “standards,” the MIDI has arisen primarily from the activities of those concerned that the incompatibility of current equipment discourages wider availability of the kinds of complex systems which can be envisioned utilizing even current technology. (The S-100 microcomputer buss evolved for similar reasons.) It is more than anything else the advent of the home computer which has forced music manufacturers to finally address the issue of compatibility. For the musician, the keyboard interface to the computer terminal offers the possibility of multi-track sequencing and editing, score display and printing. In this light the usefulness and need for a standard computer keyboard interface is obvious. Only with some such standard can these musical tools be developed.
The following explains how the MIDI specification resulted from this industry-wide consensus. The MIDI specification neither possesses nor claims any authority over equipment design. Rather, it is merely an informal agreement on some simple interface circuitry and the “grammar” of a non-proprietary language which can carry meaningful information between instruments. The incorporation or support of the MIDI facility in a product remains entirely a decision for each manufacturer.
GENERAL
The SCI Digital Interface
SCI first became interested in microcomputer interfacing in conjunction with the design of the Prophet-10 polyphonic synthesizer and its internal polyphonic sequencer. The Prophet and its sequencer each were based on Z-80 microcomputers. To record, as notes were played, every few milliseconds (at a rate set by the sequencer clock), the Prophet would send its complete keyboard “status” to the sequencer. The sequencer had to figure out which notes were going on and off, and record these events in reference to the clock count. On playback, the sequencer computer also sent the complete keyboard status every clock pulse, with events as counted out by the clock. The Prophet would play these notes just as if they came from its own keyboard. Later, this sequencer was made available as an accessory for the Prophet-5. The Prophet-5 Remote Keyboard was also developed which used this interface. SCI published the data protocol upon which this interface was based, in the hopes that the programming public would be encouraged to develop their own interfaces for the Prophet-5.
This did not occur, apparently because in being conceived for a specific application, the interface was very fast but too clumsy for general-purpose use. It was criticized as requiring too much programming “overhead,” in the constant transmission of meaningless keyboard information. As a result of this experience, SCI resolved to pursue a more streamlined interface that would be easier for programmers to work with.
The Universal Synthesizer Interface
In the meantime, occasional discussions between the presidents of Sequential Circuits (SCI), Oberheim Electronics, and Roland (Dave Smith, Tom Oberheim and Ikutaro Kakehashi) also revealed a shared interest in the interface problem and development of an interface widely acceptable to the industry.
Smith then outlined a specification for a “Universal Synthesizer Interface” (USI). It was developed with the assistance of SCI’s Chet Wood and presented at the Fall, 1981 convention of the Audio Engineering Society (AES).
The USI differed markedly from the earlier SCI Digital interface in that rather than being polled at the sequencer clock rate, information was only sent when an event actually occurred–for example, a note going on or off. The USI was proposed to be serial, operating at 19.2 kBaud, with TTL levels, and connected through phone jacks.
After incorporating changes in response to comments from AES, Smith sent a questionnaire to all manufacturers and industry consultants he could find, asking for their suggestions and any special requirements. There was a strong response to this initiative; some saying, for example, that it would not be possible to do it serially, that a parallel interface was necessary. Others thought the proposed serial speed too fast for operation with home computers. Many other issues were raised.
All respondents were invited to a conference in coincidence with the January, 1982 Western National Association of Music Merchants (NAMM) convention in Anaheim. This meeting was attended by representatives from SCI, Roland, Oberheim, CBS/Rhodes, Yamaha, E-mu, Unicord (Korg), Music Technology Inc., Kawai, Octave Plateau, Passport Designs and Syntauri.
Other manufacturers seemed to be maintaining a “wait-and-see” policy.
At this meeting the chief changes which occurred to the USI were to add optoisolation to prevent audio ground loops, and to increase the speed to 31.25 kBaud.
The Japanese Interface Proposal
Following the USI discussion at Anaheim, an alternative specification was presented by some of the Japanese companies which had grown out of their own research. Whereas the USI was basically content to specify note on/off codes, this new proposal went on to define many more complex operations. It also offered a different data structure, with status and data bytes being flagged by bit 7 (1=status, 0=data). This greatly simplified the protocol by eliminating all the checks which were otherwise needed to distinguish the data category. With the most significant bit now defined as a “flag,” data is thereby limited to 7 bits, but this is sufficient for most synth data, and when not, can simply be sent as multiple 4-bit nibbles.
The MIDI
After the Anaheim meeting, Smith and Wood integrated the USI and Japanese proposals, forming the first MIDI specification. This was sent to all of the meeting participants but, curiously, provoked no further comment from this continent.
The final document was therefore arrived at after several exchanges between SCI and Roland, which is serving as liaison with Yamaha, Korg, and Kawai.
The development of MIDI was first made public by Robert Moog, in his October, 1982 column in KEYBOARD magazine.
In December of 1982, SCI began shipping the Prophet-600, the first commercially available instrument to include the MIDI.
Winter NAMM
Anaheim, California
Jan 21-23 1983
Here is the iconic picture of the first demonstration of MIDI between a Sequential Circuits Prophet-600 and a Roland Jupiter 6.
John Bowen is the person with his back to the camera wearing the red Sequential Circuits jacket.
Dave Smith said that they were not sure if the MIDI demo would work at all, but it did.
He also said he was late to the Roland booth. Security guards didn’t want to let him in because he didn’t have a NAMM property pass.
MIDI is not yet a standard and there is no official specification to share and no standards organization in either Japan or the rest of the world.
But MIDI was about to shake up the world for the next 40 years.
In our next chapter, we will look at the first tumultuous years of MIDI between the winter of 1983 and the Summer NAMM in 1985 when the MIDI Manufacturers Association was established.
Interview with John Bowen
Head of Sound Design for Sequential Circuits
In the last chapter of the history of MIDI, we covered the early history of electronic musical instruments, the period from 1900 to 1963.
By the mid-1960s, thanks to the work of Bob Moog, Alan Pearlman and Don Buchla, the concept of electronic synthesizers, drum machines and music sequencers was well established.
A synthesizer is an electronic musical instrument that generates audio signals.
A music sequencer is a device or application software that can record, edit, or play back music, by handling note and performance information.
A drum machine is an electronic musical instrument that creates percussion sounds, drum beats, and patterns.
To really understand MIDI, we need to explain the way electronic musical instruments were connected together before MIDI.
Analog connections with CV/GATE
Before MIDI, the most common way to interface analog synthesizers was analog connections using CV (Control Voltage) and Gate.
With analog synthesizers, the pitch of the sound is determined by the height of the voltage. This voltage is called CV, for control voltage. Whether or not any sound is emitted is controlled by a voltage called Gate.
By Ashley Pomeroy – Own work, CC BY-SA 4.0, https://commons.wikimedia.org/w/index.php?curid=83875018
In early modular synthesizers, each synthesizer component (e.g., low frequency oscillation (LFO), voltage controlled filter (VCF), etc.) can be connected to another component by means of a patch cable that transmits voltage.
Changes in that voltage cause changes to one or more parameters of the component.
This frequently involved a keyboard transmitting two types of data (CV and gate), or control modules such as LFOs and envelope generators transmitting CV data:
Control voltage (CV) indicates which note (event) to play: a different voltage for each key pressed; those voltages are typically connected to one or more oscillators, thus producing the different pitches required. Such a method implies that the synthesizer is monophonic. CV can also control parameters such as rate, depth and duration of a control module.
Trigger indicates when a note should start, a pulse that is used to trigger an event, typically an ADSR envelope. In the case of triggering a drum machine, a clock signal or LFO square wave could be employed to signal the next beat. The trigger can be a specific part of an electronic pulse, such as the rising slope of an electronic signal.
Gate is related to a Trigger, but sustains the signal throughout the event. It turns on when the signal goes high, and turns off when the signal goes low.
by Wikipedia
Analog Sequencers
Moog, Buchla and ARP had always included Analog Sequencer Modules in their modular designs.
Within a few years other companies including Roland and Korg were making similar products.
ARP Clocked Sequential Control credit: Alex Ball and jondent
Korg SQ10 Analog Sequencer
Roland System 100 Sequencer
The Challenge with Control Voltages
The problem with Control Voltage was that from the very start there were two different standards.
Volts per octave was popularized by Bob Moog in the 1960s.
One volt represents one octave, so the pitch produced by a voltage of 3 V is one octave lower than that produced by a voltage of 4 V.
Each 1 V octave is divided linearly into 12 semi-tones.
Companies using this CV method included Roland, Moog, Sequential Circuits, Oberheim and ARP, and it was later adopted by the Eurorack standard from Doepfer, which now includes more than 7000 modules from at least 316 manufacturers.
This convention typically had control modules carry the source voltage (B+, 5 V) on the ring of a TRS jack, with the processed voltage returning on the tip.
However, other manufacturers have used different implementations with voltages including –5 V to 5 V, 0 V to 5 V, 0 V to 10 V with the B+ possibly on the tip.
Hertz per volt, used by most but not all Korg and Yamaha synthesizers, represents an octave of pitch by doubling the voltage, so the pitch represented by 2 V is one octave lower than that represented by 4 V, and one octave higher than that represented by 1 V.
The two implementations are not completely incompatible.
Connecting a Hz/volt keyboard to a volts/octave synthesizer will produce sound, but it will be completely out of tune.
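The difference between the two conventions is easy to see in a few lines of code. This Python sketch is illustrative only; the reference voltage is an assumption for the example, and real instruments varied.

```python
# Illustrative comparison of the two pre-MIDI pitch CV conventions.
# The reference voltage (1 V for the reference note) is an assumption.

def volts_per_octave(semitones_above_ref, ref_volts=1.0):
    # Linear: each semitone adds 1/12 V, so each octave adds exactly 1 V.
    return ref_volts + semitones_above_ref / 12.0

def hertz_per_volt(semitones_above_ref, ref_volts=1.0):
    # Exponential: the voltage doubles for every octave of pitch.
    return ref_volts * 2 ** (semitones_above_ref / 12.0)

for semitones in (0, 12, 24):             # reference note, +1 octave, +2 octaves
    print(f"+{semitones:2d} semitones: "
          f"V/oct = {volts_per_octave(semitones):.2f} V, "
          f"Hz/V  = {hertz_per_volt(semitones):.2f} V")
# V/oct climbs 1.0 -> 2.0 -> 3.0 V, while Hz/V climbs 1.0 -> 2.0 -> 4.0 V,
# which is why plugging one type of keyboard into the other plays out of tune.
```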
Of course with voltages it was easy to modify things with a few parts and a soldering iron.
At least one commercial interface has been created to solve the problem, the Korg MS-02 CV/trigger interface.
But these interfaces weren’t really intuitive for musicians and there were a lot of cables involved.
The Roland Micro Composer MC-8
The Roland MC-8, released in 1977, was a seminal product. It was a microprocessor-based sequencer that connected to an MC-8 Interface to send out analog voltages, so it was a hybrid digital and analog system.
It was initially conceived by Ralph Dyck, a Canadian composer and Roland System 100 user.
Ralph went to his local Roland dealer in Canada, who introduced him to Ikutaro Kakehashi, the President of Roland.
Kakehashi got Roland engineer Yukio Tamada involved, who made the Micro Composer capable of multi-tracking (allowing multiple parts to be played simultaneously). The MC-8 could play eight notes at the same time, so it was capable of complex chords.
Tamada-san also integrated the so-called ST/GT method: Step Time determines the length of each step (when the next note starts) and Gate Time determines how long the note actually sounds.
This ST/GT concept was later adopted by other Japanese manufacturers.
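Here is a hedged sketch of the general ST/GT idea in Python. The clock values and data layout are illustrative assumptions, not the MC-8’s actual data format.

```python
# Illustrative sketch of ST/GT-style sequencing: Step Time says when the next
# event starts, Gate Time says how long the current note is held. The clock
# values below are arbitrary and not the MC-8's real data format.

events = [
    # (note, step_time, gate_time) in clock pulses
    (60, 24, 18),   # quarter-note step, note held for 75% of the step
    (64, 12, 6),    # eighth-note step, short staccato gate
    (67, 12, 12),   # eighth-note step, legato (gate lasts the whole step)
]

clock = 0
for note, step_time, gate_time in events:
    print(f"t={clock:3d}  gate ON   note {note}")
    print(f"t={clock + gate_time:3d}  gate OFF  note {note}")
    clock += step_time
```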
The MicroComposer is also significant in the development of music production concepts because it was the first time that a time base referenced to the quarter note was introduced into the world of modular synths.
Many of the ideas that were developed for the Micro Composer would find their way into an industry standard musical digital interface that was on the horizon.
Ralph Dyck’s studio in 1977 with the MC-8 and System 100
Ralph Dyck and his son, Jeff, in Japan with Ikutaro Kakehashi, President of Roland
Roland Synth Offices in Japan in 1976
Interview from the Vintage Roland MC-8 Sequencer Archive
In the late 1970s, musicians certainly had virtually no background in computers. What sort of reaction did they have to the “music by numbers” method of programming the MC-8?
Ralph Dyck: They didn’t like it much except Tomita and Steve Porcaro and Suzanne Ciani. Steve Porcaro did a lot of musically interesting parts with it.
When was the last time you worked with an MC-8, and what sort of production was it?
I think that the last time I used an MC-8 was with Toto, maybe for their album ‘Turn Back’.
The Oberheim System- OB-8, DMX and DSX
Marcus Ryle (who would later design the Alesis ADAT and then go on to co-found Line 6) was just 19 years old when he got a job at Oberheim in 1980.
He had been studying at Cal State Dominguez Hills because they had just installed a recording studio with a synthesizer. Tom Oberheim came as a guest lecturer to the college and, after a long conversation with Marcus, hired him to work at Oberheim alongside Tom and JL Cooper.
Oberheim had just come out with the OB-Xa, which had a 37-pin D-Sub connector with a parallel bus. Soon Marcus was working on the design for the DSX Sequencer.
By connecting an Oberheim polyphonic synth to a DSX and the DMX Drum Machine, you had a complete music system.
The heart of this system is the DSX sequencer which can control the whole family of Oberheim polyphonics – the OB-X, the OB-Xa and the latest model, the OB-8 via a computer interface. It also has 8 separate CV and gate outputs to control up to 8 analogue synths of the 1 volt per octave variety. It has a capacity of 6,000 notes and is capable of 16-voice polyphony. It can store up to 10 sequences at any one time and there is cassette storage for building up a repertoire of sequences. Each individual sequence can be independently recorded over 10 tracks and there are two recording modes: Real Time (with a 1/192 note resolution) or Quantize which will auto-correct your playing to ½ note (minim) maximum or 1/32 note (demi-semiquaver) minimum. There is also a programmable metronome with an internal speaker.
by Paul Wiffen for Electronics & Music Maker, Future Publishing
You can even see by the back panel of a DSX Sequencer how intimidating musical instrument interfacing could be in the early 1980s.
Here is a recording from 1983 of what was possible with the Oberheim System.
DCB- Roland’s Digital Communications Bus
Roland’s DCB (Digital Communication Bus) was a proprietary data interchange interface by Roland Corporation, developed in 1981 and introduced in 1982 in their Roland Juno-60 and Roland Jupiter-8 products. DCB only provided note on/off, program change and VCF/VCA controls. The DCB interface was made in 2 variants: the earlier one used 20-pin sockets and cables, later switching to the 14-pin Amphenol DDK connector vaguely resembling a parallel port.
DCB was a serial interface that ran at 31.25 kbps, the same rate that MIDI would later use over its 5-pin DIN connection.
Roland Juno 60 with DCB
Jupiter 8 with DCB
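As a rough worked example of what that data rate means in practice, here is a short Python calculation. It assumes the 1 start + 8 data + 1 stop bit framing that MIDI uses per byte; we are assuming similar framing purely for comparison.

```python
# Worked arithmetic: how long a 3-byte message (e.g. a Note On) takes on an
# asynchronous serial link, assuming 10 bits per byte (start + 8 data + stop).

def message_time_ms(baud, num_bytes=3, bits_per_byte=10):
    return num_bytes * bits_per_byte / baud * 1000

print(f"31.25 kbaud: {message_time_ms(31250):.2f} ms per 3-byte message")   # ~0.96 ms
print(f"19.2 kbaud:  {message_time_ms(19200):.2f} ms per 3-byte message")   # ~1.56 ms
```

Those sub-millisecond figures give a sense of why data rate mattered: a dense stream of note messages gets through noticeably faster at 31.25 kbaud, the rate the Japanese companies pushed for after finding 19.2 kbaud too slow.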
Roland Din Sync
To synchronize their sequencers and rhythm machines, Roland developed a proprietary sync standard, DIN Sync, using a 5-pin DIN connector.
In 1981, the stage is set for the MIDI revolution
By 1981, multiple companies were working on proprietary digital interface standards including Oberheim, Sequential Circuits, Roland and Yamaha.
But several companies were already starting to worry about the problems these proprietary standards would create.
Customers would be forced to choose one company’s system and be locked into only one way of making music or they would have to spend lots of money on interface convertors to connect gear from different manufacturers together.
A couple of companies started to dream about a universal standard for synthesizers and music production.
In the next chapter of MIDI history, we will finally tell the full story of how MIDI got started, who helped to make it happen and the challenges that MIDI faced in its early days between 1981 and 1985.
Dave Rossum is another one of the founders of the modern music production ecosystem and had a unique relationship with several other key synth figures including Dave Smith and Tom Oberheim.
In fact, it was core technologies that Dave developed that allowed Oberheim and Sequential Circuits polyphonic synthesizers to be developed in the 1970s.
Dave was born in 1948 and grew up in the San Francisco Bay area. He “dropped out of high school” to attend California Institute of Technology and graduated in 1970 with a degree in biology.
After he graduated from college, he moved to Santa Cruz, Ca to study at the University of California Santa Cruz (UCSC) with the intention of getting a PhD.
His molecular biology professor, Dr. Harry Noller, was also an accomplished musician and had heard that UCSC had just received a new musical instrument, a Moog Model 12 synthesizer.
In his 2022 Synthplex presentation presented below, Dave described what happened next.
This was the moment when God took me by the nose and said “Over here, Dave”
by Dave Rossum on unpacking a Moog Model 12 synth at UCSC in 1970.
Dave Rossum’s talk about the history of EMU at Synthplex 2022
If there is one definitive source about the history of EMU, this is it.
Special thanks to Michael Lehman Boddiker and Synthplex for arranging and recording Dave’s amazing presentation about his impact on the history of modern synthesizers.
In 1971, Dave and two of his friends from Cal Tech, Steve Gabriel and Jim Ketcham, formed E-mu Systems with the mission to build their own modular synths.
They were soon joined by Scott Wedge, who would eventually become president of EMU.
Legend has it that Dave and Scott flipped a coin and Scott lost so Dave became CTO and Scott became President.
EMU started building their own modular synths like the one pictured below.
EMU Modular Synthesizer
EMU also developed several core technologies that were instrumental (pun intended) in getting some other notable synth companies off the ground.
EMU designed and patented a digitally scanned polyphonic keyboard that was licensed by both Oberheim for the Oberheim 4-Voice and 8-Voice synthesizers and by Sequential Circuits for the Prophet 5.
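The core idea of a digitally scanned keyboard is sketched generically below in Python. This is not E-mu’s patented circuit, just an illustration of why scanning makes polyphony practical: a processor repeatedly polls a switch matrix and can therefore see every key that is held down at once.

```python
# Generic illustration of digital keyboard scanning (not E-mu's patented design).
# The keys are wired as a rows x columns switch matrix; polling every row lets
# the processor detect all held keys at once, which is what enables polyphony.

ROWS, COLS = 8, 8                                  # 64 keys, purely illustrative

def scan_keyboard(read_row):
    """read_row(row) -> list of booleans, one per column (True = key pressed)."""
    held_keys = []
    for row in range(ROWS):
        for col, pressed in enumerate(read_row(row)):
            if pressed:
                held_keys.append(row * COLS + col)
    return held_keys                               # every held key, not just one

# Fake "hardware" for the example: keys 12 and 16 are being held down.
pressed_keys = {12, 16}
def fake_read_row(row):
    return [(row * COLS + col) in pressed_keys for col in range(COLS)]

print(scan_keyboard(fake_read_row))                # [12, 16]
```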
EMU also worked with Solid State Micro Technology for Music (SSM) in designing chips that were used by many synth companies at that time.
1978 Data Sheets for EMU/SSM chips
The Emulator Sampling Keyboard is born
In 1979, Dave and Scott saw the Fairlight CMI and LM-1 from Roger Linn at the NAMM show. They both realized the potential for sample based instruments and the advantages that EMU had in being able to design their own chips.
According to some sources, there was discussion about licensing the Emulator technology to Sequential Circuits. However, at around the same time, Sequential decided to stop paying royalties for the EMU keyboard scanning chip. This led to a dispute between the two companies and also forced EMU to come out with the Emulator on their own.
This proved to be a wise decision on EMU’s part because the Emulator line of sampling keyboards-
Emulator (1981)
Emulator II (1984)
Emulator III (1987)
Emulator IV (1994)
set the direction for the company for the next ten years.
The list of artists that used the Emulator samplers is too long to include here, but here is a summary from Wikipedia.
The Emulator II was popular with many musicians in the 1980s, such as early adopter Stevie Wonder, and was used extensively by Front 242, Depeche Mode, 808 State (on their 1989 album Ninety) New Order, ABC, Genesis, Paul McCartney, David Bowie, Herbie Hancock, Vangelis, Tangerine Dream, Jean-Michel Jarre, Yes, OMD, Stevie Nicks, Mr. Mister, and many more. The list is far from complete however as it became the staple sampler of just about every recording studio that could afford one in the 1980s, and thus was used on a multitude of albums at the time.
It even featured in the movie Ferris Bueller’s Day Off, where Ferris uses the Emulator II to play sounds of coughing and sneezing in order to feign illness on the phone.
by Wikipedia
The overall impact of the Emulator sampler on the history of synthesizers can’t be overestimated, as it soon led to sample-based instruments with samples stored in ROM, including Ensoniq products, the Roland D-50, the Korg M1 and many, many more, including sample-based software and hardware products that are still sold today.
EMU Drum Machines
EMU released the first affordable ($999) drum machine with ROM based samples in 1983 (right on the cusp of the MIDI revolution).
They followed up with the SP-12 in 1985, which allowed users to create their own samples. The SP-12 was based on the Emulator (and even had some interchangeable parts). The SP-12 also helped start the trend of integrating sampling and sequencing together. This combination of being able to create your own sounds and then sequence them helped fuel the hip hop revolution that was starting to happen right around the same time.
In fact, the SP-1200, the 1987 follow-up to the SP-12 with more sampling and sequencing memory (and MIDI), is considered one of the most influential products in the history of Hip Hop.
EMU and Creative Labs
In 1993, E-mu was acquired by Creative Technology, the Singaporean company that was focused on computer sound cards (with MIDI interfaces, of course). Products like the Creative Wave Blaster II and Sound Blaster AWE32 used the EMU8000 effect processor.
Also in the 1990s, E-mu made many different sound modules based on the Proteus series, which were rackmount MIDI tone generators. In 1998, Creative Technology merged Ensoniq, another American synthesizer company they had acquired, with EMU.
From the late 1990s to 2011, Creative Labs continued to build sound cards using EMU technology. However Creative was in the brutally competitive, low margin business of PC peripherals. There were lawsuits with other companies (notably Aureal and Apple) which drained resources. As well the market for hardware peripherals shrank as computers became more powerful and software synthesis became more dominant.
In 2011, Creative Technology shut down EMU.
Creative Technology still sells products under the Sound Blaster name.
The Sound Blaster product range has an audio upgrade solution – internal and external—for every setup. Don’t stop at stunning visuals. Redefine your audio experience with Sound Blaster.
Fortunately Dave Rossum’s contributions to the world of synthesizers didn’t end with Creative Technologies.
In 2015, Dave formed Rossum Electro Music and started creating new synthesizer products.
Some of these products go back to his early roots and take advantage of the renewed interest in modular synths created by the EuroRack format.
Other products are reissues of some of the most famous EMU products like the SP1200.
Like Alan Pearlman, Bob Moog, Don Buchla, Dave Smith, Ikutaro Kakehashi, Roger Linn, Tom Oberheim, and Tsutomu Katoh, Dave Rossum’s impact on the modern music production environment is not relegated to the past, but continues to evolve and shape the future of the way people make music.
For more information about Dave Rossum, EMU, and Rossum Electro, check out these excellent resources.
Rossum Electro-Music creates uniquely powerful tools for electronic music production. Driven by the creative and technological vision of electronic music pioneer Dave Rossum, Rossum Electro-Music is the culmination of Dave’s 50 years of designing industry-defining instruments and transformative technologies.
A number of hi-tech music manufacturers are celebrating important anniversaries this year and next year. In the first of several articles on these companies, we look at the milestone products made by Emu, who drove the sampling revolution in the ’80s.
E-Mu mastermind Dave Rossum belongs, along with Bob Moog, Alan Pearlman, Don Buchla and Tom Oberheim, among the most important pioneers of American synthesizer development.
Dave Rossum and Scott Wedge attended college together and soon developed a few clever ways of combining their engineering training with their passion for music. They began creating sound controllers for the boom of synthesizers hitting the market in the 1970s. While supplying a growing number of electronic instrument companies in the San Francisco Bay Area, they decided to form their own company, E-mu. E-mu has been described as one of the cornerstone organizations that pioneered a number of products critical to the growth of electronic musical instruments – a list much too long for this bio.
Alan Robert Pearlman was born in 1925 (9 years before Bob Moog and 12 years before Don Buchla although he would outlive them both) and grew up in New York City.
Like many electronics buffs in the mid 20th century, he grew up making radios out of kits and schematics from Popular Mechanics.
He attended Worcester Polytechnic Institute (WPI) in Worcester, MA. It’s perhaps an odd coincidence, but Tom White (former MIDI Association President) and Pete Brown (current MIDI Association Executive Board Chair, representing Microsoft) both attended WPI, and current MIDI Association president Athan Billias grew up in Worcester.
Like Don Buchla, Pearlman did engineering work for NASA where he designed amplifiers for the Gemini and Apollo space programs.
Then in 1969, he founded ARP (based on his initials and the nickname he had as a kid growing up) and soon started producing the ARP 2500 modular synthesizer.
The ARP 2500
ARP 2500 photo From Alan Pearlman Foundation website
The ARP 2500 had two important differences from Moog and Buchla modular synths.
Instead of patch cords, the 2500 used a set of matrix switches to make connections between modules.
Perhaps even more importantly, Pearlman had found a way to solve a fundamental problem with the other modular synths. Because of their design, Moog and Buchla synths were sensitive to temperature (and changes in line voltage). Pearlman had a lot of experience designing op amps from his time at NASA, and he explained his solution to the problem, which was to use dual transistors on a single integrated circuit.
“Bob Moog came up with a generator for logarithmic function and exponential function in different locations. They were not at the same temperature and would drift apart and get out of tune with each other. I saw papers by other engineers which showed means of stabilizing these functions by building constant temperature devices. It was much easier to simply put them on the same chip.”
by Alan Pearlman
Synthesizers take center stage in the 1970s
By the early 70s, there were a lot of famous bands that were using synthesizers. Rock bands like The Who were using the ARP 2500 on albums like Quadrophenia and Tommy. Keyboard players from a variety of genres were expanding their sonic palette.
A great example of what was possible with a monophonic modular synthesizer is Elton John’s 1973 release “Funeral For A Friend”. The extended intro to that song was created by David Hentschel, the sound engineer on the album.
The way I used to work was to write charts out and then play monophonic parts on the ARP so I could play with one hand and adjust the gain and so on at the same time, to give it more dynamics. Playing polyphonically on analogue synths can give rather flat results. You don’t get any sense of movement. But if you write the parts out and then play them monophonically, then you get a lot more control.
by David Hentschel
The ARP 2600
In 1971, ARP released the ARP 2600, which had some significant improvements over the 2500’s design. The 2600 combined pre-wired internal connections with the ability to use patch cords. The patch cords gave more possibilities, but the 2600 could be used without them, which made it easier to use on stage.
Another advance which would prove prophetic was to separate the keyboard and allow it to be connected remotely to the modular synth by cable (just as MIDI would allow for remote keyboards and sound modules a few years later).
Edgar Winter figured out that he could put a guitar strap on the remote keyboard for the 2600 and wear it like a guitar, inventing the “keytar”.
Edgar Winter and his ARP2600 used on “Frankenstein”
Arp Synths have a Close Encounter with Hollywood
The ARP 2500 in the 1977 film “Close Encounters of the Third Kind”
In 1977, not only were the sounds of the ARP 2500 featured in Steven Spielberg’s film “Close Encounters of the Third Kind”, but the 2500 itself and even ARP Instruments’ VP of engineering Phil Dodds (pictured here as the operator) were in the film.
ARP and Artists
ARP was one of the first companies to successfully use artists in magazine advertisements to promote synthesizers.
Artists including The Who, Stevie Wonder, Joe Zawinul, George Duke, Herbie Hancock and many others were featured in ARP ads.
It’s also important to point out that the ARP Soloist pictured above was one of the first synths to feature presets.
ARP Corporate History
ARP was founded in 1969, by Alan Pearlman with $100,000 of his own money and support from a small group of investors.
ARP went public in 1973 and the company’s annual sales reached $7 million in 1977.
However, the company invested heavily in the development of the Avatar guitar synthesizer. Increased competition and falling sales led to a financial crisis, and the company was liquidated in 1981.
Alan Pearlman passed away on January 5, 2019, at the age of 93.
There have been many softsynth recreations of ARP synthesizers, but the TimewARP 2600 is the only software re-creation of the ARP 2600 that Pearlman himself endorsed.
Korg has recently released reissues of several ARP products, so ARP instruments live on.
His daughter, Dina R. Alcalay Pearlman, established the Alan R. Pearlman Foundation with the mission to celebrate the legacy of inventor, musician, entrepreneur and engineer Alan R. Pearlman by making his innovative inventions publicly accessible and inspiring future generations to imagine and create.
For more information about ARP, please check out the following links.
The fascinating story of an analog synth company and the instruments that made music history. Music by The Who, Elton John, Genesis, Weather Report, Frank Zappa.
Alan R. Pearlman was nicknamed “ARP” as a kid growing up in New York City, so it seemed the perfect name for a company when he was later designing electronic musical instruments. The first instrument created by Alan was the modular synthesizer known as the ARP 2500. The monophonic product was released years after the first Moog and Buchla instruments, but gained attention for several new features including the ever-popular function of not drifting out of tune, which was a common problem in the earlier products. Next came the now classic ARP 2600, and soon the company became a great leader in the growth and development of the electronic musical market.
At almost the exact same time that Bob Moog was starting to make modular synths on the East Coast, Don Buchla was starting to make modular synths on the West Coast at the San Francisco Tape Music Center.
Buchla was born in Southern California in 1937 and studied physics and music at UC Berkeley, graduating as a physics major in 1959.
After graduation, in the early 1960s, he worked on engineering projects for the Lawrence Radiation Laboratory (“Rad Lab”), NASA, and the California School for the Blind. As mentioned in other articles, Dave Smith’s and Brian Vincik’s first jobs were also in the aerospace industry.
His first modular synth was commissioned by Ramon Sender and Morton Subotnick in 1963 with a $500 grant from the Rockefeller Foundation. Although the Moog and the Buchla 100 series Modular Electronic Music System were both modular synths, they were very different.
In fact, if you were going to use one word to describe Don, “different” would probably be the right one.
His approach to oscillators was vastly different from Moog’s. He focused on more complex waveforms, and on methods for generating them, rather than simple sawtooth, sine and triangle waves.
In fact, Don did not like to call his products “synthesizers” because to him that implied they were trying to synthetically recreate existing instruments. Instead, true to the experimental background of composers like Subotnick and Sender, Buchla focused on creating new, as-yet-unheard sounds and new and unique ways of controlling them.
Buchla 100 photo from 120years
Even the nomenclature used for Buchla’s modular components is unique, yet appropriately descriptive. Rather than an oscillator, filter, amplifier, and sequencer, Buchla’s instruments have a Complex Waveform Generator, a Multiple Arbitrary Function Generator, a Source of Uncertainty, a Quad Dynamics Manager, and so on.
by Buchla US Website
Buchla favored Touchplates over traditional keys
From the very beginning, Don Buchla had a design philosophy that made his products unique. He used capacitive Touchplates instead of standard keys. The advantage of Touchplates is that they provide a second dimension of expression. You can slide your finger up and down on the keys and generate control signals.
Don Buchla was a master at designing control interfaces. Every control based module was designed in a way that would challenge the musician to think outside the box. While he was not against standard keyboards, as most people theorize, he felt that the keyboard commanded the performer to think within a certain set of preconceived motions. The touchplate that we all know and love was originally based on an invention of Don’s from his days as a freelance engineer working for NASA; they were installed as fuel sensors in rocket fuel tanks.
by https://www.memsproject.info/
TouchPlate Image from https://www.memsproject.info/
If you had any doubt????
If you had any doubt that Don Buchla had a different approach to sound, synthesis and life in general, please read a bit of this advertisement (?), manual (?), prayer (?) or perhaps just refreshingly crisp word salad copied from the Buchla US website.
In fact, it’s well documented that Don Buchla was well connected with Ken Kesey and the Merry Pranksters. Below is a picture of the Buchla Box that was used as a PA and effects system at the Trips Festivals.
It’s no mystery that Don was a known participant at the Trips Festivals in San Francisco, often helping with the sound and music aspects of the festival. The San Francisco Tape Music Center system was hauled to the festivals and used as a central nervous system/public address station where the MC could manipulate sound and address the audience with modulated voice and psychedelic effects to enhance the audio-visual experience.
by https://www.memsproject.info/
The Buchla 700 - Don’s first MIDI synth
Buchla 700 with Touchplates from VintageSynth.com
The Buchla 700 was a digital synth made by Don Buchla in 1987 and it had tons of MIDI control, a display with graphics and the classic Touchplates that Don is well known for.
Don was always looking to innovate to provide original tools for live artistic expression and by the late 1980s Don recognized the lack of original interfaces in the market and focused on designing unique and different MIDI controllers.
Don’s controllers fully exploited the possibilities of MIDI: Thunder read the location and pressure of the fingers on an ergonomic and artistic layout, Lightning interpreted gestures from two independent wireless wands, and the Marimba Lumina tracked the strikes and location of four independent mallets.
All three could be programmed to turn their recognized gestures into almost any combination of MIDI notes and controllers. And those three are only a partial list of the controller ideas that were being worked on.
Don was always inventive and never stopped exploring.
Buchla Lightning Wand MIDI Controller
Buchla Thunder
Buchla Marimba Lumina - Copyright Joel Davel
For more information on Don Buchla, please check out the following websites
Alongside Bob Moog, Don Buchla is one of the founding fathers of synthesis, and yet much less is known of him and his instruments. With this two-part review of Buchla’s latest synth, and a history of some of his pioneering work, we hope to redress the balance…
Don Buchla grew up with a passion for music and a passion for engineering. When he combined the two loves, he created electronic musical instruments the world had never dreamed of before. His early
If you were forced to pick one single person who is responsible for the creation of the modern music production environment, Bob Moog would be a good choice.
He spans the era from the early days of synths to the post-MIDI world and is arguably the most influential figure in synth history.
In researching the early beginnings of MIDI, we kept being surprised at how many times Bob’s name came up. But that is for a future chapter in the History of MIDI.
For now let’s focus on what Bob Moog was designing in the 1950s,1960s and 1970s.
Bob was born in 1934 in Queens, New York. His father was an engineer at Consolidated Edison. As a young boy, he was fascinated by the Theremin.
At the age of 14, in 1949, he built his first Theremin from plans published in the electronics magazine Wireless World (now Electronics World).
By the age of 19, he had started his first company RA Moog and was selling Theremins and Theremin kits based on his own design which used transistors. One of his customers was Raymond Scott. You can learn more about Raymond Scott and other early synths before 1963 including the Theremin by following the link below.
The first electronic musical instruments As electricity became more widely available, the early 20th century saw the invention of electronic musical instruments including the Telharmonium, Trautonium, Ondes Martenot, the Theremin and the Hammond
In 1964, Robert Moog published what is one of the most influential AES papers ever released: Voltage-Controlled Music Modules.
In this incredibly short and concise 9 page paper, Bob lays out the foundation for millions of synthesizers that came after and many that are still popular today.
He describes seminal ideas and schematic diagrams for a Voltage Controlled Oscillator; Voltage Controlled Low Pass, Band Pass and High Pass Filters; and the simple control mechanisms for playing notes on a keyboard.
The motivation of the present work is the premise that the electronic music composer will benefit having at his disposal a sound apparatus which he can easily understand, quickly set up, and “play” spontaneously, more in the manner of a conventional musical instrument than of a code-controlled apparatus. (This premise has yet to be tested at length).
The system to be described consists of (a) voltage-controlled signal generating and processing modules and (b) a variety of transducers designed to produce voltages proportional to the position, velocity, and force of the musician’s hands. Particular stress has been placed on attaining a linear variation of the properties of the modules with respect to the control voltage’s magnitude, a feature which enables the modules to be programmed according to simple rules.
CONCLUSION
A group of basic audio signal generating, amplifying, and filtering modules has been described. The salient variable of each module is proportional to a control voltage over a range wide enough to insure utility in the production of electronic music. Specialized modules, such as noise generators and ring modulators, can obviously be used with the basic modules.
Several control transducers, patterned after the control mechanisms of conventional musical instruments, have been used for the sake of expediency.
The simple and predictable relation between the applied control voltage and the salient variable of each of these modules suggests their application in fields other than electronic music production. In particular, the setting up of prototype experimental electronic musical instruments, and the remote-control processing of conventional audio signals are ideal applications for the voltage-controlled modules.
by Bob Moog, AES Paper 1964
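The practical payoff of voltage control is easiest to see with the exponential pitch scaling that Moog’s modules became famous for: one volt per octave. The Python sketch below simply illustrates that convention; it is not a quotation from the paper.

```python
def volts_to_hz(cv_volts, base_hz=32.70):
    """1 V/octave scaling: every added volt doubles the frequency (base chosen here as C1)."""
    return base_hz * 2.0 ** cv_volts

for cv in (0.0, 1.0, 2.0, 3.5):
    print(f"{cv:4.1f} V -> {volts_to_hz(cv):7.2f} Hz")
```

Because pitch is simply proportional to voltage, any module that can produce a voltage (a keyboard, an envelope, a sequencer) can “play” any module that accepts one, which is the whole point of the paper.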
Soon, with the help of composer Herb Deutsch, Bob was selling Moog Modular synths to rock stars and studios.
Perhaps the one album that brought the synthesizer to mainstream worldwide attention was Wendy Carlos’s Switched-On Bach, released in 1968. Suddenly everybody knew two new words: Moog and synthesizer.
By the late 1960s, the Moog had been adopted by rock and pop acts including the Beatles, the Doors, the Grateful Dead, and the Rolling Stones. Progressive rock acts like Yes, Tangerine Dream, and Emerson, Lake & Palmer made it even more popular.
Courtesy of Moog Music Website
Moog soon added a new module to his Moog Modular synths: the Moog 960 Sequential Controller, a step sequencer that provided a bank of 8 control voltages which could be used for any purpose, including playing back a set of notes with predetermined pitches. This was one of the first portable music sequencers.
The Moog 960 Sequential Controller
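As a rough software analogue of what the 960 does, the sketch below steps through a bank of eight stored values at a fixed clock rate. The values, the tempo, and the use of MIDI note numbers instead of raw control voltages are illustrative choices, not a description of the actual module.

```python
import itertools
import time

STEPS = [48, 51, 55, 58, 60, 58, 55, 51]   # eight stored values, one per stage

def run_sequencer(tempo_bpm=120.0, cycles=2):
    step_seconds = 60.0 / tempo_bpm / 2.0   # clock the stages as eighth notes
    for value in itertools.islice(itertools.cycle(STEPS), len(STEPS) * cycles):
        print(f"step value -> {value}")      # a real version would emit a gate/CV or a MIDI note here
        time.sleep(step_seconds)

run_sequencer()
```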
The Minimoog - the first portable and affordable synth
Photo by Andrew Russeth
The Minimoog is an amazing story, partly because it highlights something well known about Robert Moog.
He was a brilliant inventor, but not a great businessman. He pursued the development of his synthesizer as a hobby and admitted that when he started the business he had no idea what a balance sheet was.
The Minimoog was developed in 1970 as a side project by Moog engineer Bill Hemsath and other Moog staff who were afraid they were going to lose their jobs. At first, Bob Moog even opposed its development.
But he relented and the Minimoog became a huge success and it was the very first synthesizer sold in retail music stores.
It became an instant hit because it allowed keyboard players to finally compete with guitarists in taking solos.
Corporate History of Moog Music
Here is a brief summary of the history of Moog products
1953-1971 RA Moog Company
Bob Moog founded RA Moog company at the age of 19 and started by selling Theremins and Theremin Kits.
The company expanded and started creating modular synths in 1964. However, these large modular synths were mainly used in recording studios and universities, so they had a limited market. Between 1953 and 1971, the company was only profitable in the year after the release of “Switched-On Bach” in 1968 and ran deeply into debt.
1971-1973 muSonics, then Moog Music Inc.
As a result of the debt, Bill Waytena, owner of a company called muSonics, bought the company from Bob Moog in 1971.
Waytena had created muSonics with the dream that he could market synthesizers as home entertainment products. It is a cautionary tale repeated over and over again in the musical instrument industry: companies try to move from the professional musician market to a broader consumer market and overreach their resources.
1973- Moog Music Inc is purchased by Norlin Music
Norlin Music started in 1920 as the Chicago Musical Instruments company, the musical instrument distributor that owned the majority of shares of Gibson guitars and Lowrey organs. In 1969, Chicago Musical Instruments was acquired by ECL, a South American beer and cement company, and the two companies were merged under the name Norlin.
1977- Bob Moog leaves Moog Music to found Big Briar
1978–1987 Mismanagement by Norlin and competition led to Moog Music’s bankruptcy
Between 1978 and 1987, Moog Music came out with many products, but none achieved the success of the Minimoog.
The company (Norlin) started to do contract engineering work including subway repair systems, air conditioning systems and even an air hockey table.
Leave it to two Harvard Business School graduates to take an iconic musical instrument brand and turn it into an air hockey table.
The new company was renamed Norlin Corp (a portmanteau of the names Norton Stevens of ECL and Arnold Berlin of CMI; Arnold Berlin, Maurice’s son, and Norton Stevens were friends and classmates at the Harvard Business School).
by Wikipedia
Big Briar 1977-2002
After leaving Norlin Music in 1977, Bob moved his family to just outside of Asheville, North Carolina. He then started experimenting and designing new products under the Big Briar name.
Pictured here are the 100 Series keyboard controller featuring XY touch-sensitive keys, the 300 Series XY Touch Pad and a Big Briar Theremin-like controller.
Bob Moog and Big Briar continued to develop products, including the Ethervox MIDI Theremin in 1998 and a series of pedals including the Moogerfooger low pass filter.
2002 – Bob wins a Technical Grammy and gets back the Moog Music name
2002 was a very special year for Bob because he won a Technical Grammy and because he was able to get back the rights to the Moog Music name and logo.
Robert Moog Technical Grammy 2002
Moog Music Logo
2005 Dr. Robert Moog passes away at age 71
2006 Bob Moog Foundation is founded
Bob Moog’s impact on the development of MIDI
In upcoming chapters of the history of MIDI, we’ll document how Robert Moog was involved with the development of MIDI in some unique and important ways.
But first, we are planning a series of articles on the founders of the modern music production environment with articles about Alan Pearlman, Don Buchla, Dave Smith, Dave Rossum, Ikutaro Kakehashi, Roger Linn, Tom Oberheim, and Tsutomu Katoh.
However, it seemed fitting to start our series of articles with Dr. Robert Moog for all he did for anyone who has ever enjoyed playing synths and creating sounds.
For more information on Bob Moog, please visit the following websites
The Bob Moog Foundation, a small 501(c)(3) non-profit organization, carries Bob’s legacy forward to future generations. INNOVATE, INSPIRE, IGNITE CREATIVITY
Bob Moog’s name is forever associated with the synthesizer — but why? We take a trip back in time to explain the story of the man and the modular systems that provided the basis for nearly all modern synths.
Dr. Robert Moog was the father of the synthesizer and perhaps the best-known promoter of the Theremin and electronic music. When he passed away in 2005 after a short illness, he was eulogized as an inventor and lover of music. When his Modular Moog was introduced in 1965, followed by the Minimoog in 1969, he forever changed the range of tone in modern music, and many would say its attitude as well. The synthesizer celebrated the two things Bob loved most, electronics and music. Before Bob, the idea of electronic music was toy like; today, it is a way of life.
The Register posted an article today about Firefox supporting Web MIDI.
MIDI was created by a small group of American and Japanese synthesiser makers. Before it, you could hook synths, drum machines and sequencers together, but only through analogue voltages and pulses. Making, recording and especially touring electronic music was messy, drifty and time-consuming. MIDI made all that plug-and-play, and in particular let $500 personal computers take on many of the roles of $500/day recording studios; you could play each line of a score into a sequencer program, edit it, copy it, loop it, and send it back out with other lines.
Home taping never killed music, but home MIDI democratised it. Big beat, rave, house, IDM, jungle, if you’ve shaken your booty to a big shiny beat any time in the last forty years, MIDI brought the funk.
It’s had a similar impact in every musical genre, including film and gaming music, and contemporary classical. Composers of all of the above depend on digital audio workstations, which marshall multiple tracks of synthesised and sampled music, virtual orchestras all defined by MIDI sequences. If you want humans to sing it or play it on instruments made of wood, brass, string and skins, send the MIDI file to a scoring program and print it out for the wetware API. Or send it out to e-ink displays, MIDI doesn’t care.
By now, it doesn’t much matter what genre you consider, MIDI is the ethernet of musical culture, its bridge into the digital.
The Register post was inspired by this tweet from the BBC Archives.
#OnThisDay 1984: Tomorrow’s World had instruments that sounded exactly like different instruments, thanks to the magic of microprocessors. pic.twitter.com/wbhm14WakD
Tapis Magique is a pressure-sensitive, knitted electronic textile carpet that generates three-dimensional sensor data based on body postures and gestures and drives an immersive sonic environment in real-time. Demonstrating an organic and expressive relationship between choreography and music has been a never-ending feat in the performance arts, as seen in previous work by Cage and Cunningham, Horst and Graham, or Stravinsky and Balanchine. Our work unveils dancers’ creative, unconventional possibilities of agency, intimacy, and improvisation over the music through a textile interface.
Motivated by the craftsmanship and connections of cultural textiles such as Javanese Batik or Balinese Ikat to their traditional performance arts, we began to apply an artistic approach into technological textile design and merge new materials, sensing technologies, and digital fabrication with contemporary dance and music into one united and harmonious piece of object and performance.
The tapis design is composed of multi-layer knitted textiles. The top and bottom layers are orthogonal conductive line matrices knitted within a single operation using multi-material twisted yarns. The middle layer is a knitted piezo-resistive textile, a pressure-sensitive layer that interfaces with the conductive matrices to create a sensing grid. The dense geometrical patterns of the stars scattered around the brushstroke details in the tapis represent 1800 pressure-sensing pixels (distributed in 15 MIDI Channels) and are inspired by the galactic space. Parametric design transformed these patterns into a 3-D spatial illusion to illustrate the multi-dimensionality of the sensor data.
The knitted conductive lines are connected to a system hardware consisting of multiplexers, shift-registers, operational amplifiers, and microcontroller that sequentially reads each pressure sensing pixel and sends it to a computer. These pixels collectively generate continuous 3-D spatiotemporal sensor data mapped into MIDI streams to trigger and control discrete notes, continuous effects, and immersive soundscapes through science-inspired musical tools.
Several musical pieces were designed to invoke various emotions to inspire conversations between the choreographer and the instrument. Behind the scenes, runs a digital modular synthesizer made of several patches, each for a different type of performance. The incoming stream of MIDI data is first fed into quantizer modules that align the notes to major, minor, and pentatonic scales, as well as mystic chords. The sounds are then generated by a collection of subtractive, additive, and granular synthesizers. “Venus Sunrise”, one of our performance pieces, as shown in the video above, presents a metaphorical celestial sound of the universe as the dancer is twirling around the stars, traveling through space and time.
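The quantizing step described above is easy to picture in code. The sketch below is only an illustration (the piece itself used quantizer modules in a digital modular synthesizer, not this script): it snaps any incoming MIDI note number to the nearest degree of a chosen scale.

```python
MINOR_PENTATONIC = (0, 3, 5, 7, 10)   # semitone offsets within one octave

def quantize(note, scale=MINOR_PENTATONIC, root=0):
    """Snap a MIDI note number to the nearest note of the given scale."""
    octave, degree = divmod(note - root, 12)
    nearest = min(scale, key=lambda step: abs(step - degree))
    return root + 12 * octave + nearest

print([quantize(n) for n in (60, 61, 62, 63, 64)])   # a chromatic run snapped to C minor pentatonic
```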
Tapis Magique demonstrates the interplay between art and technology, highlighting the deep emotional link between contemporary textiles, dance, and music through the physical-digital connection. It provides a canvas for dancers and sound artists to modulate sound, perform and compose a musical piece based on choreography and vice versa; it also creates an auditory-gestural synesthetic environment that invites and encourages audiences to interact and express themselves with the tapis, experiencing a magical connection that stimulates the body and mind.
Some of you may be aware that Don Lewis passed away the week of Nov 7, 2022. Don was a synthesis pioneer who created a hardware platform for controlling banks of synthesizers (Oberheim, ARP 2600s) prior to MIDI, which he called LEO (Live Electronic Orchestra). Mr. Kakehashi of Roland worked with Don starting in 1969 and credited him as an inspiration for what became MIDI 1.0. I asked his wife, Julie Lewis, if she had some thoughts about what Don would want MIDI Association members to know about him. Here are her thoughts.
Don was often thinking about things that others couldn’t or didn’t want to think about because he was always pushing the envelope especially when he was beta testing new products. It seemed like he always was asking “what if” and reaching for more. Mr. Kakehashi loved his free thinking but was always trying to rein him in when it came to market products understanding the average musician didn’t need so much.
Regarding MIDI, Mr. Kakehashi came to the dedication when LEO first went to the NAMM Museum of Making Music in 2001. In an interview with the museum he said that “Don Lewis was the inspiration for MIDI.” He and Don had been collaborating since 1969. Don conceived of LEO and drew plans in 1974 and in 1977 the components of LEO were physically interfaced by Richard Bates over the course of 3 months in Denver. Mr. K met LEO probably in early 1979 and afterwards made frequent trips to San Francisco with his engineers to take in the magnitude of what Don had accomplished with creating “central controller” keyboards for the ARP 2600’s, the Oberheim SEMs, the Hammond Concorde, effects and even more powerful the ability to mix everything on the controller panel. The Promars on the bass pedals and the huge role of the JP-4 were the toppers to the setup. When David Smith and Ikutaro Kakehashi received the Grammy for MIDI, Mr. K called Don and invited him to be at his side for the events as he credited Don with having given him the vision. At the last minute, due to health concerns, Mr. K couldn’t make the trip and sent his son Ikuo to receive the Grammy on his behalf. Ikuo told Don that in Japanese part of their family name “hashi” meant “bridge” and that their family considered Don to be a “hashi.” Mr. Kakehashi talks about this in the documentary, “Don Lewis and the Live Electronic Orchestra.” donlewisleo.com
When FM hit the scene and Don was brought into Yamaha for sound design and presentations, one of his big accomplishments was to create an excerpt from the Saint-Saens Symphony #3 final movement (minus the organ which he played live during the performance) for the 1985 presentation to the dealer network. He did it on the QX-1 and the TX-816 and it took several months to enter the score using both real and step time. Although Don seldom performed live with sequencing, this was a huge accomplishment that showcased a new power of MIDI for sequencing and recording. Gary Leuenberger also created sequences for the show and they played live on top. The Yamaha engineers were blown away with tears in their eyes and said they had never dreamed their products could do all this. It was an incredibly exciting time when the power of MIDI was coming to light. Don did a lot of recording and exploration with Jim Miller’s program, Personal Composer, being used by Yamaha in the early days.
Hans Zimmer is one of the most famous and prolific film composers in the world.
He has composed music for over 150 films including blockbusters like The Lion King, Gladiator, The Last Samurai, the Pirates of the Caribbean, The Dark Knight, Inception, Interstellar and Dunkirk.
In a recent interview with Ben Rogerson from MusicRadar, this is what he said about MIDI.
MIDI is one of the most stable computer protocols ever written.
MIDI saved my life, I come from the days of the Roland MicroComposer, typing numbers, and dealing with Control Voltages. I was really happy when I managed to have eight tracks of sequencer going. From the word go, I thought MIDI was fabulous.
by Hans Zimmer for MusicRadar
To read the whole article, click on the link below
How MIDI helped create one of the most innovative albums in Venezuelan musical history by transcending epochs, distances, styles and production strategies
By Bartolomé Díaz & Félix Carmona
Caracas, Venezuela
Venezuela was, and still is, an extremely musical land, a fact documented by such illustrious travelers as Alexander von Humboldt upon his arrival in what was still a Spanish colony in 1799. The country gained its independence in 1821 and, as is always the case, began the process of developing its own artistic expression, an endeavor that would occupy much of the remainder of the century. Musicologists have come to the conclusion that the years 1860 to 1880 were essential to Venezuela´s autochthonous musical language, a style that successfully combined influences from Europe, Africa and native folk expressions. One superb document of those times is the Cuaderno de Piezas de Baile de Varios Autores, known also as the Quíbor Manuscript and more recently as the Quíbor Real Book (Quíbor being the city where this monumental collection was compiled and copied by Pablo Hilario Giménez, a local musician and intellectual, over the space of several decades). Although Switched-On Q owes enormously to this extraordinary source (the entirety of the music presented in the album comes from the Quíbor Manuscript), MIDI was the determining force behind the project, stimulating our production team to think outside the box and come up with bold solutions to convincingly back up what appeared to be a pioneering musical production within the Venezuelan market.
Switched-On Q has turned Vintage into Vanguard.
Félix Allueva Rock Historian, Concert Promoter, Educator
Had there been no COVID-19 pandemic there is a good chance that Switched-On Q would still be a somewhat vague idea in the mind of its authors. Necessity breeds creativity and there is no doubt that the planet´s forced confinement sparked our need to circumvent traditional production notions and preconceptions. Creating arrangements of XIX century Venezuelan salon dances that remained faithful to the spirit of the music, that worked well during a live MIDI recording and that produced the precise data that would stimulate cutting-edge analog synthesizers to give a sparkling rendition of the selected repertoire proved to be a delicate task, as we shall see.
Experience by trial and error was an essential part of the project´s initial phase. Any signal travelling between a guitar and an analog synthesizer, whichever way they are connected, requires the string instrument´s resonance (open strings, movements from string to string) to be precisely tamed and controlled, something that is quite the opposite to the accepted general interpretation on plucked string instruments that are themselves generating their own sounds. Although the MIDI recording software was giving us a clear image of the problems ahead, it took a complete revision and makeover of the album´s arrangements to completely avoid sound overlaps and thus obtain crystal-clear performances that could really set the synthesizers in motion.
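In MIDI terms, the problem is overlapping notes on a part that a monophonic analog synthesizer can only play one voice at a time. The authors solved it by revising the arrangements themselves, so the sketch below is just an illustration of what "removing overlaps" means: each new note simply cuts off the tail of the note before it.

```python
def force_monophonic(notes):
    """notes: list of (start, duration, pitch). Trim each note so it ends when the next one starts."""
    notes = sorted(notes)
    cleaned = []
    for (start, duration, pitch), nxt in zip(notes, notes[1:] + [None]):
        if nxt is not None and start + duration > nxt[0]:
            duration = nxt[0] - start          # cut the ringing tail that would overlap the next note
        cleaned.append((start, duration, pitch))
    return cleaned

print(force_monophonic([(0.0, 1.5, 60), (1.0, 1.5, 64), (2.0, 1.0, 67)]))
```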
A totally unexpected point of departure and marvelous performances have produced an arresting and fascinating result.
Juan Francisco Sans Pianist, Composer, Musicologist
MIDI was, unquestionably, the Esperanto that connected this project. As soon as the MIDI recorded guitar parts were fed to the five analog synthesizers that were to give life to these old Venezuelan dances, magic happened. The music stopped being Late Romantic Nationalism and became timeless, the effervescence of the synthesizers blessed the performance with an almost palpable sense of playfulness and joy. Brows in two different parts of Caracas, Venezuela, turned up while smiles followed suit, images of Wendy Carlos and her prototype Modular Moog haunted us, reassuringly, for days.
The MIDI recording sessions for Switched-On Q began in October 2020 and were conducted by Miguel Miki Perea Antonetti at MIKOTT Studios. As a general rule, two dances (each consisting of five individual guitar parts) were recorded in each session and given a general revision for overlaps and other editorial details. The MIDI data would then be emailed to Félix Carmona, who would carry out a second revision and proceed with final adjustments, still within the MIDI vocabulary. Once this stage was concluded it was time for the pieces to return to the realm of audio. The dances, by now exquisitely dressed by the synthesizers, were able to display their considerable grace, elegance, sensuality and flair, not to mention the unmistakable nationalistic character that some of them so proudly exhibit. The final phase of the project was, of course, tackled in a more traditional way: the selected performances of the synthesizers were lovingly mixed and passed on to Latin Grammy Award winner Leonel Carmona (Mexico City) for mastering, while the image for Switched-On Q was created by BID Award winner Zilah Rojas (Caracas). By the time June 2021 arrived the COVID-19 pandemic was hitting Venezuela even harder than before (not to mention it being merely a part of our country´s very complex social and political reality), and yet an album as optimistic, illuminating and profoundly Venezuelan as Switched-On Q was ready to take the stage.
Díaz and Carmona have made past, present and future coexist in the most harmonious fashion.
Arturo Gutiérrez Plaza Author, Poet, Educator
The beauty of a project such as Switched-On Q is that it multiplies the pleasure and responsibility of musical performance by two. Each artist has not only to do his very best but has to implicitly trust the other to also do his best and to maintain a firmly convinced and cohesive creative unit. MIDI enables this independence / dependence in a very effective way, allowing musicians to transcend epochs, distances, styles, production strategies and, last but by no means least, pandemic states. We sincerely hope you enjoy this album.
Bartolomé Díaz and Félix Carmona teach at the well known Universidad Metropolitana UNIMET in Caracas, Venezuela. Their MIDI-fueled Switched-On Q was conceived and produced between 2020 and 2021. The album debuted to unanimous national acclaim in July 2021.
bdiaz@unimet.edu.ve / fcarmona@unimet.edu.ve
Switched-On Q vindicates the fact that traditional music can be thoroughly, intelligently and sensitively re-dimensioned: it is an effort for which we should all be thankful.
Juan Carlos Ballesta & Leonardo Bigott Ladosis Music Magazine
Switched-On Q will, in time, become a bona fide Venezuelan cultural reference.
Carlos Poletto Singer-Songwriter
One can immediately sense the enormous work, effort, imagination and inspiration behind this lovely album.
Judith Akoschky Author, Educator, Didactics of Music Specialist
A totally convincing Back to the Future with an Analog Orange Clock set to the Venezuelan salons of 1880. A true first in our country´s recording history.
Easily connect your MIDI with Unity, Max/MSP and openFrameworks – and control them all at the same time
EasyController is a standalone virtual tool for live performance. We use it in our audio-visual performances to map our MIDI controllers and presets while we are juggling between Unity, Max/MSP, and openFrameworks.
For example, it enables us to control our visuals in Unity and the audio in Max/MSP, allowing back and forth communication between the software and the MIDI controller using a virtual representation.
This gives us a standard and easy way of programming MIDI gestures and, importantly, lets us focus on creative development and enjoy the live performance.
EasyController is available for free on MacOS and can be downloaded from our website – 42noir.com/es
Ólafur Arnalds didn’t start out playing keyboards. He started out as a drummer in hard rock bands. He is not alone. Yoshiki from the legendary Japanese hard rock band X Japan comes to mind. Many people forget that the piano is classified as a percussion instrument along with marimbas and vibraphones.
He has a unique approach to music that combines technology with a traditional, almost classical approach to composition. He is also one of the few people still using the Moog Piano Bar, a product developed by Bob Moog and Don Buchla (now discontinued) that turns any piano into a MIDI device.
Photo: Richard Ecclestone
What’s behind bleep bloop pianos
In many interviews, Ólafur says that his acoustic pianos bleep and bloop.
In these two YouTube videos, he explains how MIDI technology is a core part of his creative process. What is interesting is how organic and emotional the resulting music is. The technology never gets in the way of the art and only complements it.
This video explains how the three acoustic pianos are connected by MIDI.
I am in constant search of new ways to approach art with technology, interaction and creativity.
by Halldór Eldjárn
Halldór Eldjárn is another Icelandic artist who worked on the All Strings Attached project and developed some robotic MIDI instruments for the project.
Ólafur Arnalds on NPR’s Tiny Desk Concerts
To see a complete performance of this unique use of MIDI processing, listen to this performance on NPR Music Tiny Desk Concerts.
How to Bleep (and Bloop) yourself
Arnalds has released a library of sounds for Spitfire Audio recorded at his studio on his ‘felted’ grand piano along with added content in the Composers Toolkit.
Recently, MIDI Manufacturers Association member Blokas released the Midihub, a MIDI router and processor. In our article on the Midihub, Loopop explains how to use it to create some Ólafur Arnalds inspired MIDI effects of your own.
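If you want to experiment with the general idea in software first, a note echo that delays and transposes whatever you play is a reasonable starting point. The sketch below is only loosely inspired by the effects discussed above; the intervals, delays and velocity handling are arbitrary choices, not Arnalds' actual setup.

```python
import random

def respond(note, velocity, now_s):
    """For one incoming note, generate two delayed, transposed responses for the other pianos."""
    responses = []
    for transpose, base_delay in ((7, 0.25), (12, 0.50)):    # up a fifth, then up an octave
        jitter = random.uniform(-0.03, 0.03)                 # slight timing drift keeps it organic
        responses.append({
            "note": note + transpose,
            "velocity": max(1, velocity - 20),               # echoes land a little softer
            "send_at": now_s + base_delay + jitter,
        })
    return responses

print(respond(60, 90, 0.0))
```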
At SXSW 2019, Moritz Simon Geist performed and presented several workshops on using robots and MIDI. His new EP was created entirely with MIDI controllers driving robots he built himself.
A latency control concept for midi driven mechanic robotic instruments
Geist is deeply into MIDI. His blog details a proposal for how to overcome the latency caused by the physical movements of robots using MIDI and Cycling '74's Max.
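The core idea is simple: measure how long each robot's mechanism takes to strike, then send its MIDI data that much earlier so the physical hit lands on the beat. The sketch below is a plain-Python illustration of that look-ahead, not Geist's Max patch, and the delay figures are made up.

```python
# Hypothetical measured actuation delays per robotic instrument, in milliseconds.
MECH_DELAY_MS = {"kick_robot": 35, "snare_robot": 22, "futuretom": 48}

def compensate(events, delays=MECH_DELAY_MS):
    """events: list of (time_ms, instrument, note). Returns the same events with earlier send times."""
    return sorted((t - delays.get(instrument, 0), instrument, note) for t, instrument, note in events)

# Both hits are meant to sound together at 1000 ms, so the slower robot is triggered earlier.
print(compensate([(1000, "kick_robot", 36), (1000, "futuretom", 47)]))
```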
After seeing the energetic guitar playing of Muse’s Matt Bellamy, I wanted to give electronic musicians a tool to achieve similar playing style with high visual impact. Easily, without years of training. And make them move.
What’s in a name? Apparently, a lot.
Chara is a Greek noun which describes a feeling of inner gladness, delight or rejoicing. Kara rhymes with the Finnish word for guitar, “kitara”. And of course, there’s karaoke.
Reinventing The Wheel
In early concepts, the player would start a sound by rotating a wheel. The rotational speed would define the velocity for the sound. The rotation could be stopped by palm muting. The direction would define the MIDI channel.
As the note layout of a guitar fretboard is quite complicated for many, the conventional keyboard layout was copied and mirrored.
Scratch That
After numerous design iterations, coding and testing sessions, breadboard connections, deformed 3D printed parts, PMMA fumes from the laser cutter, layers of paint, wasted adhesives, PCB rat’s nest corrections, capacitive touch calibrations and CNC machining hours… the Kara prototype was finally ready in May 2018.
Trigger Happy
The notes are selected from fingertip-sized pits. The prototype has a four-octave Pitboard.
With Note Triggers, the selected notes are played by strumming or tapping.
Strummed notes are sustained indefinitely. There’s no need to keep touching them; the player can freely select new notes without affecting the strummed ones. If nothing has been selected from the Pitboard, the strum action repeats the previously selected notes.
When tapping, the player touches one or more Note Triggers and the selected notes are played via the touched channel(s). If there are strummed notes playing on the channel, tapping stops them.
When Motion Trigger is touched, data from motion sensor is read.
Touching a Note Trigger selects the associated MIDI channel.
The note layout was designed for easy memorization and for effortless selection of basic chords.
Command and Control
The usage of Note Triggers made a dual role for the Pitboard possible. The controller can recognise whether player has selected notes for playing, or values for MIDI Control Change messages.
Hence, double-tapping a pit sends various MIDI CC messages as described in the image below. After the initial double-tap, only one tap is needed.
Octaves 1 and 2 are reserved for sending values from 0 to 127 for MIDI CC number 60. Octaves 3 and 4 are “switches” for MIDI CC numbers between 70 and 81.
To access a specific channel, MIDI CC messages from Octaves 1 to 3 are sent via the selected MIDI channel.
As DAWs have some global functions such as starting a recording, MIDI CC messages selected from Octave 4 are always sent through channel 5.
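As a rough sketch of that mapping in code (the exact assignment of CCs 70-81 across the pits of Octaves 3 and 4 isn't spelled out above, so the even split assumed here is hypothetical):

```python
def double_tap_to_cc(octave, pit_index, selected_channel):
    """octave: 1-4; pit_index: position of the pit within its octave (0-11)."""
    if octave in (1, 2):
        # Octaves 1 and 2 together sweep CC 60 from 0 to 127 across their 24 pits.
        value = round(((octave - 1) * 12 + pit_index) * 127 / 23)
        return {"channel": selected_channel, "cc": 60, "value": value}
    if octave in (3, 4):
        # Octaves 3 and 4 act as on/off switches for CCs 70-81 (six pits per octave assumed);
        # Octave 4 always goes out on channel 5 for DAW-global functions.
        cc_number = 70 + (octave - 3) * 6 + min(pit_index, 5)
        channel = 5 if octave == 4 else selected_channel
        return {"channel": channel, "cc": cc_number, "value": 127}

print(double_tap_to_cc(2, 11, 3))   # CC 60 at value 127, sent on the selected channel
print(double_tap_to_cc(4, 0, 3))    # CC 76 as a switch, always on channel 5
```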
Show – Don’t Tell
To see the novel features in action, there’s a video below. For some reason, that performance gave me a 60s live concert vibe.
To see videos with less noise and a less frantic drummer, please visit: http://www.deomo.com
Here are some points of interest.
In the beginning, percussive sounds are played on MIDI channels 1 and 2 by tapping. Then, Raging Bass from Waldorf Nave on channel 3 is added to the mix.
At 0:33, Breakbeat loop clip is launched by double-tapping a pit. The double-tap sends a MIDI CC message that has been mapped to the Breakbeat loop slot. Later, at 3:39, the clip is toggled off in similar manner.
Starting at 1:22, Pitch Bend and MIDI CC 76 messages are sent based on data received from motion sensor.
Around 3:00, moving Kara does not affect the sound. Only after touching Motion Trigger at 3:04, the values from motion sensor are used.
At 3:32, a note is strummed so that it stays on. There’s no need to reach for that Panic Button; this is by design. 🙂
In the end around 4:21, no, Kara is not altering the sound although it looks like it.
In 1986 Frank Zappa released the final studio album of his lifetime; for the remaining seven years of his life, he would only release live concert albums. Jazz from Hell is an instrumental album whose selections were all composed and recorded by Frank Zappa. It was released in 1986 by Barking Pumpkin Records on vinyl and by Rykodisc on CD. Zappa won a 1988 Grammy Award for Best Rock Instrumental Performance for this album.
What is a “Synclavier”?
The Synclavier was an early digital synthesizer, polyphonic digital sampling system, and music workstation manufactured by New England Digital Corporation of Norwich, Vermont, USA. It was produced in various forms from the late 1970s into the early 1990s. The instrument has been used by prominent musicians.
The original design and development of the Synclavier prototype occurred at Dartmouth College with the collaboration of Jon Appleton, Professor of Digital Electronics, Sydney A. Alonso, and Cameron Jones, a software programmer and student at Dartmouth’s Thayer School of Engineering.
The system evolved in its next generation of product, the Synclavier II, which was released in early 1980 with the strong influence of master synthesist and music producer Denny Jaeger of Oakland, California. It was originally Jaeger’s suggestion that the FM synthesis concept be extended to allow four simultaneous channels or voices of synthesis to be triggered with one key depression to allow the final synthesized sound to have much more harmonic series activity. This change greatly improved the overall sound design of the system and was very noticeable. 16-bit user sampling (originally in mono only) was added as an option in 1982. This model was succeeded by the ABLE Model C computer based PSMT in 1984 and then the Mac-based 3200, 6400 and 9600 models, all of which used the VPK keyboard.
Synclavier II (1980): 8-bit FM/additive synthesis, 32-track memory recorder, and ORK keyboard. Earlier models were entirely controlled via ORK keyboard with buttons and wheel; a VT100 terminal was subsequently introduced for editing performances. Later models had a VT640 graphic terminal for graphical audio analysis (described below)
Original Keyboard (ORK, c.1979): original musical keyboard controller in a wooden chassis, with buttons and a silver control wheel on the panel.
Sample-to-Disk (STD, c.1982): a first commercial hard disk streaming sampler, with 16-bit sampling at up to 50 kHz.
Sample-to-Memory (STM): later option to sample sounds and edit them in computer memory.
Direct-to-Disk (DTD, c.1984): a first commercial hard disk recording system.
Signal File Manager: a software program operated via the VT640 graphic terminal, enabling ‘Additive Resynthesis’ and complex audio analysis.
Digital Guitar Interface, SMPTE timecode tracking, MIDI interface.
by Wikipedia
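For context, FM synthesis at its simplest uses one oscillator (the modulator) to sweep the phase of another (the carrier). The two-operator Python sketch below shows that basic recipe; it is generic FM, not the Synclavier's four-voice partial-timbre engine described above.

```python
import math

SAMPLE_RATE = 44_100

def fm_sample(n, carrier_hz=440.0, mod_hz=220.0, mod_index=3.0):
    """One sample of two-operator FM: the modulator wobbles the carrier's instantaneous phase."""
    t = n / SAMPLE_RATE
    return math.sin(2 * math.pi * carrier_hz * t + mod_index * math.sin(2 * math.pi * mod_hz * t))

# Render 10 ms of the tone; feed these samples to any audio writer to hear the classic FM sidebands.
samples = [fm_sample(n) for n in range(SAMPLE_RATE // 100)]
print(min(samples), max(samples))
```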
What is interesting for us is the fact that the Synclavier was a very advanced and elaborate MIDI instrument which revolutionized the music industry.
After two decades of depending on the skills, virtuosity, and temperament of other musicians, Zappa all but abandoned the human element in favor of the flexibility of what he could produce with his Synclavier Digital Music System.
The selections on “Jazz from Hell” were composed, created, and executed by Zappa with help from his concurrent computer assistant Bob Rice and recording engineer Bob Stone. Far from being simply a synthesizer, the Synclavier combined the ability to sample and manipulate sounds before assigning them to the various notes on a piano-type midi keyboard.
At the time of its release, many enthusiasts considered it a slick, emotionless effort. In retrospect, their conclusions seem to have been a gut reaction to the methodology, rather than the music itself.
by AllMusic
As I have been an avid amateur maker of MIDI-based music for a few years now, I took on the challenge of reviving some tracks from this groundbreaking album and putting them on my YouTube channel.
I will present one track here, which was made with commercially available DAWs and MIDI files that are available on the web.
“G-Spot Tornado” is a musical composition created by Frank Zappa for his album Jazz from Hell in 1986. He thought that the composition was so difficult to play that it could not possibly be performed by a human, therefore he initially recorded the song using a Synclavier DMS. Zappa was later proven wrong when the song was performed live on The Yellow Shark. The piece is considered one of “Zappa’s most successful Synclavier releases in the tonal idiom.”
Frank Zappa’s music has kept inspiring me since I bought my first Zappa record in 1968, and I was lucky to see him perform live on stage on several occasions.
It’s hard to find an actual Synclavier these days, but you can find information on the Synclavier at Vintagesynth.com, and Arturia released a softsynth reproduction, the Synclavier V, in 2016.
The Synclavier V faithfully recreates the elite digital synthesizer/workstation that started it all, powering some of the biggest hits and film soundt…
The Glitch Mob has started touring with what may be one of the most complicated MIDI controllers ever built.
The Blade 2.0 was designed by Martin Phillips, who has also done work for Deadmau5, Kanye West and Daft Punk. Dell partnered with the Glitch Mob to provide Dell Alienware 15 computers that run Intel Core i7-7820HK quad-core processors. The Blade 2.0 has three Dell Canvas 27-inch touchscreens that are programmed as MIDI controllers for melodies and kick and snare drum patterns. An Alienware 17 plugged in via a MIDI HDMI cable connects the whole show together.
“We have all these crazy turbine-looking drum things and inside of those are Roland PD-125X V-Drum Mesh Snares. We hook all those up to a Roland Octapad, which all three of us have sitting about 10 feet behind us.”
by Edward Ma from an article on Music Radar
Nerdmatics is the LA-based tech team that wrote the Max patches to drive the show. They have also done some other projects we’ve covered, like the Intel CES keynote.
Here’s some more of the Glitch Mob and the Blade in action.
Here are links to some other articles about the Blade 2.0.
Tristan Shone, A.K.A. Author and Punisher, is a musician and mechanical engineer who makes heavy metal MIDI controllers, really heavy metal!
He has created a range of unique MIDI controllers he calls “dub machines” using his electronic and mechanical engineering expertise. Tristan studied for a Master of Fine Arts at the University of San Diego and originally had a career as a mechanical engineer. He programs his mechanical devices using Arduino.
Here is his Big Knob controller and it’s really, really big!
Big Knob expression port knob controller.
This simple device is a heavy-duty CNC machined black anodized knob for use with your expression port on any midi/usb keyboard controller. Simply plug into your expression port and immediately have a 0-128 mappable control knob. Currently there are 10 spring loaded detente positions and a hard stop for quantized physical snapping, however by removing the 1″ chromed steel ball bearing and spring, you can create a smooth position knob controller.
by Author and Punisher
Another unique MIDI device is the Rack and Pinion
This device is a 2 level, 6 key sound controller with continuous pitch control for each key. Each key is velocity sensitive and contains its own linear encoder for extremely high resolution control. The keys are made from ebony and the slide from delrin providing very smooth control against the teflon coated linear rails. The brain of the Rack & Pinion is the Arduino Duemilanove. Currently the device is programmed to output USB/MIDI to Ableton Live, but can easily be configured to output OSC commands to communicate with Pd, MAX, Reaktor, etc.
by Author and Punisher
But maybe our favorite is Rails. And yes, it’s a MIDI controller!
Overall, the Rails device is intended to be the “sequencer” or metronome of the performance but lacking a machine-like precision or click. Instead the user can continuously move between the limits in his/her own rhythmic or arrhythmic manner, changing sounds at each point and fluctuating in a more human and emotional manner. Performances with the Dub/Drone Machines are intended to follow the player’s mood in a somewhat unintentional improvisation in a way that a MIDI timed sequence cannot. 2 ports on the back (3 total with the USB) provide sustain and channel switching but are completely reprogrammable to allow them to be any type of switch or expression control. The brain of the Rails is the Arduino Duemilanove. Currently the device is programmed to output USB/MIDI to Ableton Live, but can easily be configured to output OSC commands to communicate with Pd, MAX, Reaktor, etc.
by Author and Punisher
Check out this video by Noisey that details Tristan Shone’s journey to explore the heavier side of MIDI controllers.
Author and Punisher is performing at MoogFest in May.
We got several entries suggesting content from the December 2017 MIDI Association newsletter. One was for an article on CTRLCap from Edwin Joassart. We did a little more research and decided to do an article not just on CTRLCap, but on the developers behind it, Herrmutt Lobby.
Herrmutt Lobby has created not just music, but hardware and apps to allow people to interact with music. After all, their mission is Empowering Real-Time Electronic Music.
They have four major projects:
Playground-Music At Your Fingertips
BEATSURFING-The Organic MIDI Controller Builder
Le U (20syl)-The Interactive Skateboarding Ramp
CTRLCap, the cap you squeeze to control FX
Founded in 2003, Herrmutt Lobby is a collective of musicians, handymen, and programmers. Since 1997, the individual members of the group have released music on various labels – DUB, Studio !K7, Vlek, Eat Concrete, Thin Consolation, Catune – and across genres.
Alongside music, they’ve also devised and built various software, controllers, and apps that help musicians perform live with the freedom to express the moment’s inspiration and instinct.
Their ever-changing musical universe grows through encounters with musicians from diverse horizons, most recently the Belgian jazz player Stéphane Mercier and UK rapper Lord Rao.
by Herrmutt Lobby Website
CTRLCap - The cap you squeeze to control FX
Ever tried to add FX while scratching without breaking your flow? With both hands busy controlling the fader and the record, it’s pretty much mission impossible… Unless you own a CTRLCap.
With its cutting edge technology, CTRLCAP adds an expressive touch to your fingertips!
This application lets you draw a 3-dimensional controller which you can use, like any other, by tapping. But Beatsurfing allows more: you design your own paths until they suit you, follow routes, take turns and cuts with your fingers, and collide with objects along the way, triggering melodies, beats and effects. Movement is what it’s all about.
It can control any MIDI-enabled device (software, hardware, or even selected iPad apps), features a very intuitive in-app editing system and integrates seamlessly into any existing studio or live setup. Object Behaviours can be set to link objects together and multiply the available commands on the surface of the iPad.
LE U – Interactive Skateboarding Ramp by 20syl
Skateboard culture and electronic music: those are the two main components of 20syl’s artistic construction. He has practiced these disciplines for twenty years or so and has never stopped building bridges between them. Today, along with La Région des Pays de la Loire and les salles de musiques actuelles (venues dedicated to modern music: VIP, Stereolux, Chabada, Fuzz’Yon, 6par4, Oasis), he presents a project which could be the culmination of this crossbreeding: a Sound Ramp, a “U” turned into a sensitive and visual surface allowing the skater to perform music that the artist has composed.
We want to thank Edwin Joassart from Herrmutt Lobby for reaching out to us and hopefully you enjoyed this quick tour of the many Herrmutt Lobby MIDI projects. MIDI is often at the heart of these kinds of innovative projects.
What if you could control MIDI with your brain? Does that sound like science fiction? Actually, there are lots of people who have been exploring how to connect brain waves, measured by electroencephalography (EEG), to MIDI.
Let’s take a look at how that works.
Electroencephalography (EEG) is a method to record electrical activity of the brain. Typically electrodes are placed on the scalp to measure voltage fluctuations caused by ionic current within the neurons of the brain.
Delta is the frequency range up to 4 Hz. It is usually the highest in amplitude and the slowest. It is seen normally in adults in deep sleep and in babies. EEGs are different for different ages, but here we are focused on adults.
Delta wave
Theta is the frequency range from 4 Hz to 7 Hz. Theta is associated with relaxed, meditative, and creative states in adults.
Theta wave
Alpha is the frequency range from 7 Hz to 14 Hz. Alpha waves are directly related to relaxation, and attenuate with mental exertion.
Alpha wave
Beta is the frequency range from 15 Hz to about 30 Hz. Beta activity is related to movement, and beta waves at multiple frequencies are associated with busy thinking and active concentration.
Beta wave
So what does all this have to do with MIDI? Years ago EEG machines were really expensive, but now there are inexpensive wireless EEG headsets like the Mindset and the Muse.
Brain2MIDI converts brainwaves into MIDI signals. Apply filters and algorithms to the frequencies and generate MIDI to control your favourite music production software, synthesizer or visual effects software. Brain2Midi is an Android software that produces MIDI notes and control change signals using brainwaves. MIDI is transmitted from an Android 4.4 device using either a USB to MIDI cable to any compatible physical input, or using WiFi or Bluetooth to a computer on the Windows 7 platform. The Muse headband from InteraXon is used as an input source for brainwaves, then the information is analyzed and converted into melodies or CC parameters. Brain2Midi can be used to create music that is influenced by the state of mind of the person wearing the headband, or it can be used to create visual animations in any MIDI compatible VJ software.
by Brain2MIDI
The OpenEEG project is a website with resources for making plans and software for do-it-yourself EEG devices available for free (as in GPL). It is aimed toward amateurs who would like to experiment with EEG.
MindMIDI is a revolutionary way of making music, with your brainwaves, in real-time. Brainwaves are like radio stations, with each station working on a different layer, and all the stations are always playing. MindMIDI works like a radio, allowing you to hear your brain’s amazing electrical symphony. The music can be influenced with intention, and you can hear the immediate musical feedback. The MIDI can be routed to any DAW so you can have realistic sounding sampled musical instruments, or synthesizers. You can have multiple instruments, and each instrument can be controlled by a different band of the brainwave spectrum. For example, your Delta and Theta waves could be controlling a cello, Alpha waves could be controlling a piano, and your Beta and Gamma waves could be playing a violin. Best thing is MindMIDI is free!
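Whatever headset you use, the software side usually boils down to the same move: take a power estimate for each frequency band and scale it into a MIDI value. The sketch below is a generic illustration; the band powers and the channel/CC assignments are invented, not taken from any of the products above.

```python
# Hypothetical band-power readings from an EEG headset (arbitrary units in the range 0-1).
bands = {"delta": 0.42, "theta": 0.31, "alpha": 0.77, "beta": 0.18}

# Route each band to its own MIDI channel and express its power as a CC 1 (mod wheel) value.
BAND_CHANNEL = {"delta": 1, "theta": 2, "alpha": 3, "beta": 4}

def band_to_cc(band, power, full_scale=1.0):
    value = max(0, min(127, round(127 * power / full_scale)))
    return {"channel": BAND_CHANNEL[band], "cc": 1, "value": value}

for band, power in bands.items():
    print(band_to_cc(band, power))
```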
MIDI gets used for so many different things and one interesting MIDI application is plant music.
The story of MIDI and plants starts quite naturally with the CIA. Yes, that’s right, “that” CIA, the Central Intelligence Agency. Grover Cleveland “Cleve” Backster (which you must admit is a great name for an interrogation specialist for the CIA) founded the CIA’s polygraph unit shortly after World War II. In the ’50s he reportedly was involved with the CIA’s experimentation with LSD. In 1960, he left the CIA and founded his own polygraph school, teaching police how to use and administer polygraph tests. He was a lifelong member of the American Polygraph Association (APA), and in 2006 the APA Board of Directors established the Cleve Backster Award. The award is presented annually to honor an individual or group that advances the polygraph profession through tireless dedication to the standardization of polygraph principles and practices.
Considered by many at the time to be the top polygraph expert in the world, Cleve connected his lie detection instrumentation to the leaf of a dracaena cane plant on February 2, 1966.
Thirteen minutes into the experiment Cleve threatened to harm the plant, and an electrochemical reaction occurred on the lie detector instrument. The reaction was similar to when a human responds to stress or a threat. The field of bio-communication was born, and Cleve moved to San Diego to further his research and locate his polygraph school in a warm climate. Authors Peter Tompkins and Christopher Bird wrote about Cleve’s extensive test results in the book titled The Secret Life of Plants. Tompkins and Bird earned thousands if not millions of dollars, while Cleve earned no financial benefit from the book.
The world of plants and its relation to mankind as revealed by the latest scientific discoveries. “Plenty of hard facts and astounding scientific and practical lore.”–Newsweek
Do plants like music? It’s a controversial topic: Studies have supported the claim that music can result in better growth, but many disagree with those findings. Hear both sides & decide for yourself.
Stevie Wonder’s Journey Through “The Secret Life of Plants” was released in 1979 as the soundtrack to the documentary The Secret Life of Plants, a film based on the book by Peter Tompkins and Christopher Bird. It featured Syreeta Wright (Stevie’s wife at the time) and Michael Sembello who wrote the song “Maniac” a huge hit from the blockbuster film Flashdance. So already plants were having an effect on the music scene.
But our interest is in MUSIC BY PLANTS! Here are some quotes and videos from people who believe that plants can make music with MIDI! Using the same principles of biofeedback discovered by Cleve Backster and the detection of minuscule changes in capacitance, people have developed devices to turn these electrical signals into MIDI.
Since the 1970s, Damanhur—a Federation of Communities with its own constitution, culture, art, music, currency, school and uses of science and technology (www.damanhur.org)—has researched communication with the plant world. As part of this research, they created an instrument able to perceive the electromagnetic variations from the surface of plant leaves to the root system and translate them into sound. Science increasingly supports the concept that plants operate with an innate intelligence and logic diverse from our own. Music of the Plants has taken this research into plant intelligence and plant perception to another level. By deciphering and registering the impulses and interactions of plants, they have developed a device that uses a MIDI interface to transform the impedance from a leaf to the root system of a plant into music. Extensive research continues today as we become conscious of the innate ability of nature to communicate with us when we have the instrument to listen.
by Music of the Plants
In the following video, Simone Vitale explains the tech behind the MIDI interface for plants.
In this video I reply to all those who asked me how the music of the plants works, and more specifically whether the sound in the recordings comes directly from the plants. The U1 device allows plants to produce sounds and to make music. It does so by measuring the electrical resistance of vegetable tissues and transducing it into a MIDI signal (Musical Instrument Digital Interface). The MIDI signal then controls a synthesizer that produces the actual sound. At first, it might be difficult to assimilate the idea that in the end the music produced by the plant is not only an automatic outcome of this electrical connection, rather a sort of “awareness” of the plant was also involved. This is what the researchers in Damanhur (the developers of the U1 device) have found out in their forty years of research. They say that after some time of being exposed to their own sounds, plants seem to become aware that the sound is coming from them and they start modulating it intentionally. I witnessed this myself years ago, while rehearsing for a live performance. I found myself spending hours playing piano together with a plant and I was witnessing the slow development of the process. The subtle changes in the plant’s music in response to the sound of the piano and its own sound became more and more evident to me.
by Simone Vitale
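Whatever one makes of the plant-awareness claims, the signal path Simone describes is straightforward to sketch. Below is a minimal Python example using the mido library that follows the same general idea: watch a slowly changing sensor value and turn its fluctuations into notes. The read_plant_sensor() function and the threshold are hypothetical placeholders for whatever electrode/ADC hardware you have; none of this is taken from the actual U1 or MIDI Sprout firmware.

```python
# Minimal sketch of the plant-to-MIDI idea: track small changes in a sensor
# reading (resistance/capacitance from a leaf) and emit a MIDI note when the
# change exceeds a threshold. read_plant_sensor() is a hypothetical stand-in.
import time
import random
import mido

SCALE = [60, 62, 64, 67, 69, 72]  # a pentatonic-flavoured scale, common choice

def read_plant_sensor():
    # Placeholder: replace with a real ADC/serial read from your sensor.
    return 0.5 + random.uniform(-0.05, 0.05)

def change_to_note(delta):
    """Map the size of the fluctuation onto a note from the scale."""
    index = min(len(SCALE) - 1, int(abs(delta) * 100))
    return SCALE[index]

with mido.open_output() as port:                 # default MIDI output
    previous = read_plant_sensor()
    for _ in range(32):                          # a short demo run
        current = read_plant_sensor()
        delta = current - previous
        if abs(delta) > 0.01:                    # ignore tiny fluctuations
            note = change_to_note(delta)
            port.send(mido.Message('note_on', note=note, velocity=80))
            time.sleep(0.2)
            port.send(mido.Message('note_off', note=note))
        previous = current
        time.sleep(0.1)
```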
In the next video, Simone improvises with the U1 MIDI interface and a plant.
But there are others who are exploring this intersection of mysticism, science, music and MIDI.
Mileece is another sonic artist who uses plants to generate music.
This is a close-up of the U1 MIDI interface and synthesizer.
MIDI Sprout is an instrument that translates biodata from plants into music. We lead workshops on connecting to intuition through deeply listening to plant music and stream live plant music from around the globe on Plants FM.
To be honest, we are not sure if any of this biofeedback “science” is real, but any story that starts with the CIA and ends up in a “Federation of Communities with its own constitution, culture, art, music, currency, and school that uses science and MIDI technology to research communication with the plant world” seemed like something we should cover here at The MIDI Association, the community of people (and perhaps plants!) who work, play and create music with MIDI.
LET’S is a team of Seattle multimedia artists (Andy Arkley, Courtney Barnebey and Peter Lynch) who create interactive sculptures that combine art and sound using MIDI.
In their latest installation, We, now on view at MadArt gallery in Seattle, the LET’S team uses DIY controllers and an in-depth knowledge of MIDI to allow up to 12 people to easily collaborate (even without any musical background) to create music and art.
A similar installation, FINGER POWER!, was created for Bumbershoot 2014 in Seattle.
Lynch sets up Ableton so that it sends MIDI notes to the VJ software Resolume, which controls the projected video elements. Ableton also sends MIDI notes to DMX, a piece of software that triggers the sculptures’ light bulbs.
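The routing LET’S uses, one stream of notes fanned out to both a video app and a lighting app, can be approximated with a few lines of Python and the mido library. The port names below are placeholders for whatever virtual or hardware MIDI ports your own setup exposes, not the ports in the actual installation.

```python
# Minimal sketch of fanning one stream of MIDI notes out to two destinations
# (e.g. a VJ app and a lighting app). The port names are assumptions; list
# your real ports with mido.get_output_names().
import mido

VIDEO_PORT = 'Resolume Port'   # hypothetical name, adjust to your setup
LIGHTS_PORT = 'DMX Port'       # hypothetical name, adjust to your setup

with mido.open_output(VIDEO_PORT) as video, mido.open_output(LIGHTS_PORT) as lights:
    for note in (36, 38, 42, 46):                 # a simple test pattern
        msg = mido.Message('note_on', note=note, velocity=100)
        video.send(msg)                           # triggers a video clip
        lights.send(msg)                          # triggers a light cue
```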
SWEEP, another immersive interactive installation developed by LET’S for Gallery4Culture in 2015, combined sound with over 250 small-scale light sculptures synchronized with MIDI.
Another interesting project by LET’S is Library Science, which recorded three albums:
High Life Honey (2004), The Chancellor (2007), Dolphin (2009)
They even included a video on the process of creating Library Science music.
For more information on LET’S, check out their website below.
Paul Prudence is an audio-visual performer who uses MIDI to create live-cinematic visual-music experiences.
His weblog Dataisnature explores the relationships between natural processes, computational systems and procedural-based art practices. He also writes for Neural and HOLO magazines.
The screen-shot below shows the complete Ableton Live arrangement view for Chromophore. The top half of the tracks represents sound design material, while some of the bottom half are responsible for sending MIDI data to VVVV for precise synchronisation. MIDI data also flows in the opposite direction, from VVVV to Ableton Live. Triggered by generative objects in a 3D visual system, MIDI is sent to the top track in Live, which is a Sampler. In this way multiple instruments/sounds/samples and filters can be accessed by tweaking the visuals during a performance.
Controllerism definition from 2007 Remix Magazine Article
Moldover is one of those rare artists who not only creates music, but also creates the instruments to make that music. He has built his own MIDI controllers and interactive installations called Jam Boxes.
Moldover’s Controllers
Moldover has designed a lot of different MIDI controllers, but some are core parts of his musical life. The MOJO, pictured below, features prominently in Moldover’s live performances.
Photo by Laura Lea Nalle
A guitar player and songwriter since childhood, Moldover’s early influences included Tool, Nirvana, and Pink Floyd – influences that would ultimately touch his work as a DJ. To pursue music professionally, Moldover decided to hone his chops at Boston’s Berklee College of Music.
by Moldover Website
Moldover combined his love for guitar and unique MIDI Controllers in designing the Robocaster.
JamBoxes and Moldover’s Octamasher
Moldover has always pushed the limit on musical interactivity.
We like jamboxes and we think that:
Music is a universal language.
Music is a social experience.
Music technology provides exciting new ways to collaborate.
by JamBoxes.net
Moldover explains the concept behind JamBoxes in this video.
Moldover’s Music
So what do you get when you put this all together into a musical performance? Here are some examples of Controllerism at its finest from the man who invented the word.
Always pushing the limits of technology, Moldover provided a 360-degree music video preview of “Not Your Mirror” and a physical version of his new song that is not just a USB drive that looks like a cassette tape, but also a musical instrument called the Voice Crusher.
“A musician at heart, inventor born of curiosity, and innovator by necessity, I believe the world calls him the ‘Godfather of Controllerism’ for damned good reasons.” – .
Brockett Parsons is a keyboard player from Summit, New Jersey. Brockett studied at Bucknell University and played both piano and trumpet. He went on to study further at Berklee College of Music in Boston.
In 2009 Brockett became a winner of the MTV reality show “Making His Band” featuring P. Diddy. The show was an intense 12-week competition for musicians to become part of Diddy’s touring band. Out of thousands who auditioned, Brockett was one of seven winners personally chosen by Diddy. A few months later, in January of 2010, Brockett auditioned for and accepted the position of keyboardist for Lady Gaga, a position he still holds.
by Reverb Nation
But Brockett is more than just an incredibly talented keyboard player; he also designed his own MIDI controller. He contacted an old friend from college and they put together a team to build the Piano Arc, a completely circular keyboard. Here’s an overview of the Piano Arc from Cosmos Music in Canada. Chuck Johnson, Dave Starkey of MIDI 9 and Steinway technician Rich Fell worked to build the unique MIDI controller, which features multiple zones and custom LED displays.
Brockett featured the Piano Arc in his solo release – Three Point One Four
Check out Brockett and the Piano Arc at the Super Bowl Lady Gaga halftime show on February 5, 2017.
Wizdom Music and moForte announced the release of GeoShred 2 this week with tons of new MIDI features including Multidimensional Polyphonic Expression (MPE).
GeoShred, Winner of a 2017 Electronic Musician Editor’s Choice Award as “one of the most innovative, groundbreaking products to emerge in the past twelve months”, has been enhanced with unprecedented MIDI/MPE I/O control, new effects, and additional model control parameters.
by Wizdom Music
Wizdom Music was founded by Jordan Rudess, keyboardist for Dream Theater, whom we have covered in an exclusive interview for MIDI.org.
WHEN DID YOU FIRST GET INVOLVED WITH MIDI? That was so long ago. One of my first exposures to MIDI was an Atari computer; I have very fond memories of those days…
There were several demonstrations at the Annual General Meeting of the MIDI Manufacturers Association. Here are a few of the highlights. Jordan Rudess of Dream Theater showed how expressive MPE could be in the right hands at the MIDI Manufacturers Association afternoon sessions on Sunday.
To really understand the origins of MIDI, you need to go all the way back to before there were digitally controlled synthesizers and computers. In fact, you need to go back before there was even electricity, to the very first mechanical music machines.
Now Kogumi’s Anatole Buttin and Yan Godat have developed new mechanical boxes that combine Arduinos, marbles, mechanical devices and MIDI to appeal to kids in educational electronic music workshops.
If you have never run across Claude Woodward, The Sonic Manipulator, while searching the web, then you really have to start with his own description of his origins.
Greetings Earthlings,
I have had many interesting adventures on my way to becoming a spaceman, such is the nature of the convolutions of life. After taking a spin out from Mars one day, many years ago, a blown ion drive forced me down to a little orchard outside Perth where, under the alias of Claude Woodward, I was raised by a pair of horticulturalists. I grew up in amongst a million different species of fruit, nuts and flowers; an idyllic little haven, but my brown thumb hastened me into a career as a keyboard player/sonic manipulator.
by Claude Woodward
So what exactly does this Martian stuck on Earth do? Well, he has been busking and creating weird, eccentric dance music for years. Here is an older video of him busking on the street and warning people that the Martians are coming (and may drink all your beer!)
But for all his quirkiness, he actually develops some really cool DIY MIDI devices. Check out his description of his homemade keyboard setup.
This is Claude’s latest video on how to really play a synthesizer. He may not give Roli a run for their money in the marketplace, but you have to love his passion for using MIDI to the max.
So if you hear that the Martians are coming, don’t worry, they are MIDI Martians and very friendly!
There have been fan-created Daft Punk helmets before Love Props’ GM01. You could buy a Halloween mask/prop like the one pictured below for around $200.
In 2010, Volpin Props did a YouTube video about “How to make a Daft Punk helmet in 17 months” that got over 4 million views.
But Love Props has taken the Guy Manuel helmet recreation to the next level, and what really sets it apart are two things: the level of detail of the design and, what else, MIDI. In the past week (Sept 4-11, 2016), this helmet has gotten a tremendous amount of attention on the Web, but we decided to focus on the MIDI implementation, which usually gets just passing coverage in the wider press.
First, let’s take a look at a gallery of Love Prop’s photos of the helmet.
Here’s a video of the Love Props GM01 in action.
But this isn’t just a prop, this is a fully functional MIDI-driven device. Let’s take a look at some of the details of the design.
The helmet can receive MIDI from a number of different sources: MIDI files off an SD card reader, wireless MIDI over the WiFi connection, and wired MIDI.
Communications: The system has an IN/OUT USB MIDI connection that allows the user to execute real-time MIDI sequences. This wired connection also carries an IP-based Telnet communication alongside the MIDI, which makes possible the interaction between the code system and the applications executed on a PC or tablet. There is an RF WiFi module included that allows wireless communication between the system and a smartphone, tablet or PC for wireless real-time MIDI transmission, to access the system via the user IDE, or to communicate with a PC/tablet application. The WiFi module (ESP8266) has its own dedicated local code and processor for the WiFi TCP/IP communications, which sets the Teensy free for processing the main code of the unit.
The SigmaFW is a creative and artistic tool that makes it easier to custom-animate LED setups without programming knowledge. Using our custom MIDI library, the user can plug the unit via USB into the computer and live-compose new LED animations in any music/MIDI production software or workstation. This MIDI library gives the user absolute control of the color, brightness, saturation and timing without coding, making it possible to create LED animations in a more expressive and artistic way. It is also intended to auto-generate real-time animations based on customizable parameters, responding to multiple hardware/user inputs.
The 1.0 version features a built-from-scratch beat detection algorithm that determines the tempo, or BPM, of incoming music from the mic/line input and modifies the BPM of the MIDI animation being played to match the tempo and rhythm of the external music. The BPM of the MIDI animation can also be set by motion, following the rhythm with the head just like a tap tempo. The user can control the status of the system with the menu displayed on the inner 2″ LCD of the unit and navigate through the options with the control knob, or can remotely control the system menu with a smartphone via WiFi or RC.
by Love Props
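To give a feel for how note-driven LED animation works in general, here is a minimal Python sketch using the mido library: incoming notes pick a colour and velocity sets brightness. The palette and the set_leds() function are hypothetical stand-ins for a real LED driver; this is not taken from the SigmaFW code described above.

```python
# Minimal sketch of the note-to-LED idea: each incoming MIDI note selects a
# colour, and its velocity sets the brightness. set_leds() is a placeholder.
import mido

PALETTE = {
    36: (255, 0, 0),     # kick note -> red
    38: (0, 0, 255),     # snare note -> blue
    42: (255, 255, 0),   # hi-hat note -> yellow
}

def set_leds(rgb, brightness):
    # Placeholder: a real implementation would push this to the LED strip.
    print(f"LEDs -> colour {rgb} at brightness {brightness:.2f}")

with mido.open_input() as port:                  # default MIDI input
    for msg in port:                             # blocks, reading live messages
        if msg.type == 'note_on' and msg.velocity > 0:
            rgb = PALETTE.get(msg.note, (255, 255, 255))
            set_leds(rgb, msg.velocity / 127)
```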
We always have a hard time deciding which is the coolest MIDI maker project of the year, but the Love Props GM01 has to be in the running.
That was so long ago. One of my first exposures to MIDI was the Hybrid Arts sequencer with an Atari computer. I have very fond memories of those days, and somehow that older software seemed more stable than some software these days, maybe because today every program is trying to do so much.
Hybrid Arts MIDI Track ST
My first keyboard was a MiniMoog synth, but of course that didn’t have MIDI.
The first time I used MIDI was when I bought a Siel keyboard and rack and took a 5-pin DIN cable and connected them together. It was like magic back then.
Siel Opera 6
Where did you learn about MIDI?
A Music Store in Maryland. I was there all the time playing on the latest gear.
My first experience doing sequencing was with 4 Casio CZ101s MIDIed together. I was recreating Debussy and Bach on the Atari.
Casio CZ-1
Then I moved on to MOTU Performer and a Roland D50. I wrote this big 20 minute prog rock piece and my goal was to use every feature in the DAW. I remember doing a lot of copying and pasting of MIDI tracks, assigning different sounds, adding delay, and detuning tracks or changing octaves.
How does MIDI allow you to do what you do?
MIDI has been a huge part of everything I do in the studio. When I do any work in the studio, even today, I do it in the MIDI space.
I still find that recording in MIDI is better because of the control: being able to change velocities and tweak sounds after the fact. It’s much more flexible than audio.
Not to mention my work writing and orchestrating: I’ll get inspired and do 4 measures off the cuff and can then go back and look at it in notation. This helps me do the orchestration based on the MIDI notes.
One interesting thing about my work with MIDI: years back, like all of us, I was sending MIDI all around the room. I remember getting up and going over to every synth, checking global pages and loading data. Steve Horelick from Non Linear Educating and Ask.Audio walks into my studio and says, “Why are you doing that? How can you babysit all these keyboards? I just work in the virtual world.” So Steve introduced me to the world of virtual synths, and now a lot of physical keyboards are not getting as much attention in the studio. A lot of what I do in the studio is virtual. But I’m still using MIDI. I use MIDI more than ever going to virtual destinations. MIDI seems to adapt to the new ways people work.
But going into my Dream Theater world, a lot is audio. My live life is different than my studio life. I have always tried to basically use one main keyboard, which is a pretty unique approach in prog rock. Mostly live I am using MIDI for when I play the Seaboard, Continuum or the Zen Riffer keytar connected wirelessly by MIDI to my rack.
Jordan with the Zen Riffer in Argentina
But that reminds me someone built a MIDI compass for changing the tilt of my mechanized keyboard stand. Let me see if I can find some documentation on that for you. (Later, Jordan sent us prototype designs of the MIDI compass product concept to share on the site)
Jordan has always been into experimenting with unique and different controllers like the NuMotion curved keyboard pictured below.
In this next clip, Jordan uses the AXIS 64, which has a unique Harmonic Table note arrangement. There are 192 keys, so it’s like having three 64-note keyboards all in one.
Jordan uses the Continuum with Dream Theater to show off its expressive capabilities.
Jordan doesn’t stick to just keyboards either. It’s hard to say exactly what the Eigenharp is, but it definitely is not a keyboard!
All these controllers have something in common which is very important: independent control over every note. Trying to bring it all together, I have been using that for years now, even early on with prototypes of MorphWiz MIDI. MIDI carries on to my work with my app, GeoShred.
by Jordan Rudess
MPE (Multidimensional Polyphonic Expression), which allows for per-note control, is something I am really excited about, and I’m glad the MIDI Association is working on standardizing it because I have been using that a lot.
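For readers who have not tried MPE, the core idea Jordan describes is simple: give each sounding note its own MIDI channel so that per-note pitch bend and pressure become possible. Here is a minimal Python sketch using the mido library; it follows the common lower-zone layout (member channels above the master channel) but is a simplified illustration, not a complete MPE implementation.

```python
# Minimal MPE-style sketch: each note on its own channel, so bending one
# channel bends only that note. Channel layout here is an assumption based on
# the common lower-zone convention (mido channels are 0-based).
import time
import mido

with mido.open_output() as port:                 # default output port
    notes = [60, 64, 67]                         # a C major chord
    # Start each note on its own member channel.
    for i, note in enumerate(notes):
        port.send(mido.Message('note_on', channel=1 + i, note=note, velocity=90))
    # Bend only the top note upward while the other two stay in tune.
    for bend in range(0, 4096, 256):
        port.send(mido.Message('pitchwheel', channel=1 + 2, pitch=bend))
        time.sleep(0.02)
    for i, note in enumerate(notes):
        port.send(mido.Message('note_off', channel=1 + i, note=note))
```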
Hi! Like most of us, my setup is always evolving, but this is its current state. I work mostly OOTB (Out of the Box), but once I’ve got something going I’ll record into the computer for rearrangement and editing (esp. for length!). I have two main MIDI workstations, with a BeatStep Pro and a Squarp Pyramid as the respective controlling sequencers. I also have multiple MIDI-synced stereo loopers for guitar and whatever else calls for it. Here’s a breakdown of the two MIDI stations (left-right):
Station 1:
Korg MS2000
Roland HPD-15
KAOSS pad quad
Beatstep Pro
Samson SM10 mixer (under the BSP)
iPad 2/Alesis IO dock
Electro-Harmonix Epitome
Moog Minitaur
TC Ditto x4 (on floor, obscured by table).
Other than the MS2K and HPD-15, this is all mounted in a Furman road case for portability.
Bottom – Matrix-6R, Yamaha MJC8, Roland MKS-7. The rest of the gear belongs
The MIDI Association Artist Interview
•Tell us about yourself briefly.
I’ve been playing music most of my life; I got started in the 70s as a guitar player via a healthy infatuation with progressive rock and fusion. Today I spend most of my time earning a living outside of music, but playing, writing, and recording music are still my obsessions.
•What was your first encounter with MIDI?
My first exposure to MIDI was in the mid-80s when the bass player I played with, a guy named David Garza, got a DX7. So I would say he turned me on to MIDI. A couple more keyboards came through our hands including a Korg DSS-1 and a PolySix. It was incredible that they could be stacked and synced via MIDI; we incorporated sequenced keys into our live performances starting in 1986, which was great for doing Peter Gabriel and David Bowie covers as well as our original tunes. Around 1987 I got an Alesis HR16 and a couple of midiverbs to support my 4-track recordings. By ’89 I had a PC with Cakewalk 2 (DOS) and a MIDI interface, which I synced to the 4-track via SMPTE. Next came an MKS-7, Matrix 6R, Peavey SP/SXII (yep, that’s right), SY77, ADAT, Yamaha O1V and a MIDI-based guitar rig centered around a Marshall JMP-1 MIDI-controlled preamp plus Quadraverb GT. This is way too much detail I’m sure but writing it makes me realize how central MIDI has been to my musical life.
•How do you use MIDI today?
These days, due to the limitations I have in terms of time, I mostly do improvised music and jamming, and MIDI is as important as ever. I use an iConnect MIDI 4+ in my live rig along with thru boxes to plumb MIDI wherever it needs to go. For my guitar rig, MIDI is used to keep things in sync, but I also have a MIDI pickup that I use to double up guitar lines w/synths. I use MIDI-synced loopers (Pigtronix) and effects on guitar, and keep them in sync with my table-top synths and effects. With this setup I can create solo soundscapes all day long, or fill out a duo or trio to make it sound much bigger and deeper. I have various keyboards, sequencers and controllers as well. I’ve upgraded to an O1V96i for my main desk. I’d incorporated a laptop into my rig for the last 5 or 6 years, but lately I’m drawn to staying OTB.
•How has MIDI allowed you to do what you do?
Very little of what I do in a live setting would be possible without MIDI. I do like to “just play” whether it’s acoustic or electric guitar, but I quickly hear additional parts that I want to add, and it would be frustrating not to be able to do that if I didn’t have MIDI.
•Anything else you’d like to add?
I’ve probably said enough! I don’t have a lot of published output but I occasionally put stuff up on SoundCloud as Mothhaven (https://soundcloud.com/mothhaven). I’m glad the MIDI association exists and wish you guys the best.
I started recording with MIDI back in 1985. Over the years I have amassed a great selection of gear. I have owned most samplers and synths and still have a huge E-mu collection including the Emulator 2, 3 and 4 and the Emax, plus others, mostly Korg and Roland synths, and a Chroma Polaris (my 3rd) – around about 100 MIDI synths, samplers and expanders, all piped into 3 MX9000 Euro desks chained together to produce 144 inputs! All fed into a Mac Pro and Logic, plus a Tascam 2488 for backup.
The MIDI Association Artist Interview
•Tell us about yourself briefly.
Studio owner
From 1987 to 1989 I was part of the electronic band Athena with Andrew Hughes and Keith Larkworthy.
We recorded the album DREAM ODDISSY using 2 x Korg Poly-800s, an Elka Soloist 505, Casio PT-20, Casio CT-102, Casio VL, and a Yamaha DD-10 drum machine.
All down to a Tascam Porta One.
Second solo album THIS LAND recorded between March 1992 and March 1994 and the first to be released on the Sincity Records label.
The album was recorded in a new purpose-built studio, Worlds End.
Equipment used for the album was:
Roland D-20, Roland TR-505, Casio CZ-5000, Yamaha DX21, Korg Wavestation, E-mu Emulator 2 and E-mu Emax 2, mixed with a Tascam MM1 and sequenced with Dr. T running on an Atari Mega ST. This was the first recording we did going straight to digital tape (Philips DCC).
The vocals were recorded through a Shure SM58 direct to a DigiTech compressor and effects. 1998 saw the release of the album MASQUERADE 88-98.
This album was a 10 year compilation of the last 3 albums plus previously unreleased material.
It included the track Tiger Lily, which used over 500 samples of sounds captured out and about on location with a Sony MiniDisc.
The next album to be released on the Sincity Records label was Dangerous LOVE.
This album introduced other vocalists and we went on to form the pop group The Final Demand.
The Final Demand were Kelly Wiggins, Keith Larkworthy and Peter James-Stephen.
In late 1998 the double A-side single SOMEONE/OVERDRIVE was released; it sold very well and had a considerable amount of airplay.
In 1999 the album FRESH FROM THE BOTTLE was released. This album was mainly written using the Emulators and a Korg Wavestation and Trinity.
By August 1999 the band had split and Worlds End Studios relocated.
2000 -2005 Director of Athena Music Systems
Studio rack mount computer systems
ATHENA/EMU SYSTEMS VAMPIRE RACK
SEE MUSIC PRESS OF THE TIME
The Vampire Rack PC was the world’s first dedicated 2U 19-inch sampler PC. It could import and export any sample in any format and had a 6-tray RAID hard disk system. It was the first dedicated rack-mount sampler to use the E-mu Systems Emulator X software and 1212 hardware.
In the October 2004 edition of MusicTech magazine there was a centre-page review of the Vampire Rack, where it scored 7 stars.
Athena Systems also manufactured liquid-cooled systems for the construction industry.
The lighting system was built using Shuttle barebone shells.
2005-Present
Owner of the label and Polaris Studios (previously known as Worlds End Studios).
POLARIS STUDIOS is the current base for Peter James-Stephen’s music output.
Currently working on the 7th and 8th studio albums, titled OLYMPUS and BREAKING THE SOUND BARRIER.
Also working on film and music videos to accompany the new albums, and will be touring towards the beginning of 2017.
•What was your first encounter with MIDI?
1985 was the first time I used MIDI, with a Casio CZ-5000 joined to a Sequential Prophet-600 via an Atari ST.
Never looked back since!
•How do you use MIDI today?
The studio is split into 4 MIDI zones.
Each MIDI zone has 2 Philip Rees V10 MIDI thru boxes, allowing 20 instruments to be connected in each zone without chaining them.
Each zone is then fed into my master keyboard, a CME VX8, which has 4 MIDI outs.
The CME VX8 is then connected to an Apple Mac Pro running Logic via USB.
The beauty of this system is that I only have to switch on the synth or sampler I want to use at the time.
This reduces unwanted noise in the system.
All of the instruments are always connected to my 3 Behringer MX9000s, which are joined together using the expander ports to give 144 inputs at mixdown.
•How has MIDI allowed you to do what you do?
Without MIDI I wouldn’t have been able to record or tour with my rig!
My rig is focused on Modularity. In addition to the Eurorack Modular synthesizers, I employ a modular MIDI setup which consists of two Alyseum AL-88C MIDI>Ethernet interfaces. These MIDI Interfaces connect to a high-speed switch which is part of my larger home network. This means that any computer on my network can access the MIDI ports. The MIDI Interfaces can operate stand-alone (without a computer) or with the Copperlan software, to route MIDI in and out of my computers and even between the two MIDI interfaces. This allows me to route MIDI data from any keyboard or sequencer to any MIDI compatible sound source. Here is a list of MIDI equipment I use:
Roland V-Synth GT
Elektron Analog Four
Elektron Octatrack
Mutable Instruments Ambika
Nord Drum
Nektar Panorama P6
Nektar Panorama P1
Arturia MicroBrute
Arturia BeatStep Pro
Novation Launchpad
Intellijel uMIDI (MIDI>CV Interface) 2x
Alyseum AL-88c
E-RM MultiClock
The Multiclock is a very important part of my rig.
The MIDI Association Artist Interview
Wow, I’m totally surprised and honored!
Well, my name is Justin Sullivan, I’m also known as justin3am in various online communities. I work with sound and attempt to make music. 😉 I don’t know, I always have trouble talking about myself… on the other hand I can talk about gear all day!
My first experience using MIDI was with a Yamaha PSR keyboard, a parallel port>MIDI adapter and a MIDI sequencer that I got from the local computer store (remember those?). I think it was Magix Music Maker; whatever it was, it came on a bunch of floppy disks. I didn’t have any experience playing the keyboard… but that didn’t matter, as I was fascinated by the way it communicated with my computer.
Today, MIDI is an integral part of my approach to music. I’m just a single person, so the ability to control multiple instruments from a single controller opens a lot of possibilities. Sequencers allow me to create compositions that I wouldn’t be able to achieve otherwise. Combining the two via a computer gives me freedom to make a dynamic performance out of static sequences and phrases.
Many modern music applications have features to get very complex results from simple control messages (CCs, notes, Program Changes). For example, using a single physical control to adjust several parameters at once or scripting routines of events which can change depending on variables and logical arguments to make generative compositions, which are still controllable. Most of that is not made up, I promise!
Since I’ve started using MIDI over my studio’s network, I’m able to pass messages between multiple computers. I use one mainly as a tape machine and the other as the primary MIDI sequencer. MIDI over LAN allows me to keep both machines in sync. Of course you can do the same with a USB MIDI interface, but I find it much easier to manage complex MIDI routing using the CopperLAN software with my 2 Alyseum AL-88Cs. Both interfaces are connected to a high-speed switch, which enables me to send MIDI between either computer and the instruments/effects in my studio.
I’m very interested in using MIDI with easily programmable micro controllers (i.e. Arduino) to drive motors, solenoids and other stuff. I’ve found that there are many tools out there which allow me to use MIDI to control instruments which primarily speak the language of voltage (modular synths) or physical force (acoustic instruments). It’s a fantastic time to be a synth nerd/sound junkie! Ha!
Boy, that’s a bunch of words!
I’m totally stoked to be a part of this contest and to share my enthusiasm for music technology!
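The network MIDI routing Justin describes, passing messages between two computers so one can act as the sequencer and the other as the recorder, can be sketched very simply once your operating system or routing software exposes the network link as an ordinary MIDI port. The Python example below uses the mido library; the port names are placeholders, and nothing here is specific to CopperLAN or the AL-88C.

```python
# Minimal sketch of keeping two machines in sync over the network: forward
# everything arriving on a local input port to a network MIDI port. Assumes
# the network link already appears as a regular MIDI port on this machine.
import mido

LOCAL_IN = 'Sequencer Out'       # hypothetical local port name
NETWORK_OUT = 'Network Session'  # hypothetical network MIDI port name

with mido.open_input(LOCAL_IN) as source, mido.open_output(NETWORK_OUT) as dest:
    for msg in source:
        # Clock, start/stop and notes all pass through unchanged, so the
        # second computer follows the first one's transport and tempo.
        dest.send(msg)
```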
Dave Smith P6 and Pro2, Elektron Rytm and Analog 4, Kurzweil PC3x, Maschine, Push, Logic, Live…
The MIDI Association Artist Interview
•Tell us about yourself briefly.
My name is Hugo and I run Goldbaby, a studio in Auckland, New Zealand that focuses mainly on sound design. I sell my sample packs from my website: www.goldbaby.co.nz.
I have been using music technology since the mid 80s. It’s gone from a childhood hobby to a full time dream job!
•What was your first encounter with MIDI?
That would probably be in the late 80’s when I bought a Roland MC300 to sequence an X7000 and then a Kawai K4. I loved that set-up!
•How do you use MIDI today?
It has multiple uses. I have an extensive hardware synth and drum machine collection, so it’s essential. Not only for sequencing but for saving banks of patches. It also helps with programming synths like my Kawai K3m. Instead of using buttons to program it I have it hooked up via midi to a hardware patch editor with sliders and knobs… much more enjoyable!
•How has MIDI allowed you to do what you do?
I like to layer synths and drum machines when I’m doing sound design and I do that using Midi. A three synth stack sounds huge! I know it would probably be easier to do layering in a DAW with samples of the individual units… however my sound design techniques require lots of gain staging experiments with real hardware pre-amps and a valve mixer.
•Anything else you’d like to add?
I’m still amazed at how long we have had MIDI and it’s still totally relevant today after over 30 years of service.
Here is a video of my studio as it was 3 years ago. The music in the video was made with a combination of hardware and software and wouldn’t have been possible without midi.
HUGO HAS HAD AN AMAZING COLLECTION OF DRUM MACHINES THROUGH HIS STUDIO. CHECK OUT THIS GALLERY FROM HIS FLICKR ACCOUNT!
The Monolith
The Moog Sub 37, Moog Minitaur and Minimoog Voyager XL all sync to the MIDI clock generated by Ableton Live. The two customized Arturia BeatStep Pro sequencers also sync to the clock, while sending their sequences back to Ableton, which distributes the MIDI to the synths as well as the Roland TR-8 drum machine. This method allows for capturing the MIDI data and the printed audio separately in Ableton, rather than having the sequencers route directly to the synths. One of the BeatStep Pros sends CV and drum gates to the modular synthesizer to sync the analog sequencers as well as trigger envelopes and sync LFOs to the MIDI clock. There are two MIDI keyboard controllers: a CME Xkey 25, which is dedicated to the Moog Minitaur as well as the global transpose functions of the two BeatStep Pro sequencers, and a Nektar iX49, which is dedicated to the Waldorf Streichfett string synthesizer. There are three hardware delays, which all sync to MIDI.
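The clock distribution described above relies on MIDI's real-time messages: a master sends 24 clock ticks per quarter note plus start and stop, and everything downstream follows. As a rough illustration (not the actual Ableton setup), here is a minimal Python sketch using the mido library that generates a clock at a fixed BPM.

```python
# Minimal sketch of a MIDI clock master: 24 'clock' messages per quarter note,
# framed by 'start' and 'stop'. In the rig described the clock comes from
# Ableton Live; this Python generator is only for illustration.
import time
import mido

BPM = 120
SECONDS_PER_CLOCK = 60.0 / BPM / 24   # 24 MIDI clocks per quarter note

with mido.open_output() as port:       # default output; pick your real port
    port.send(mido.Message('start'))
    for _ in range(24 * 8):            # eight quarter notes of clock
        port.send(mido.Message('clock'))
        time.sleep(SECONDS_PER_CLOCK)
    port.send(mido.Message('stop'))
```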
The MIDI Association Artist Interview
•Tell us about yourself briefly.
I am a commercial photographer and cinematographer and I use my synthesizers to record scores and backing music for the films I make for my clients, and as a way to relax at the end of the day.
•What was your first encounter with MIDI?
I started with MIDI in 1986 when my family purchased a Casio CZ-1 phase distortion synthesizer. Using MIDI to notate into and from our Apple II just opened up a whole new world to me. Flash forward to 2002, when I started using Ableton Delta 1.0 and discovered soft synths. I didn’t even use control voltage until 2015, so I have worked my way backwards.
•How do you use MIDI today?
I use midi to sequence and sync three hardware synthesizers, analog and digital delays, hardware sequencers and a six foot tall, Moog format modular synthesizer. I use Beatstep pro sequencers (two of them) to sequence and clock my gear via midi and CV/gate. I know people associate MIDI with “music made by computers, not people” but for me, it has allowed me to step away from the computer and get hands on with hardware.
•How has MIDI allowed you to do what you do?
MIDI allows me to keep everything in sync, automate parameters and get equipment made in different decades, thousands of miles apart, to speak the same language in real time, so I can focus on what’s important: making music.
•Anything else you’d like to add?
Thanks for the opportunity to share my love for midi and synthesizers.
From left table around to right: Roland VT3, Scarlett 2i4 (1), MacBook Pro (1), Novation Launchkey 25, Akai MPK49, Ableton Push 2, MacBook Pro (2), Scarlett 2i4 (2), KRK Rokit 5 (x2), Pioneer receiver w/ 12″ sub, Native Instruments Maschine, Pioneer CDJ800 (x2), Technics 1200 (x2), Behringer 12″ PA monitor (x2).
The MIDI Association Artist Interview
I grew up listening to all different forms of music. Everything from folk, blues, and jazz, up to hip-hop, R&B, and electronic dance. I played some alto saxophone in middle school, then gradually inched towards a passion for electronic music. I picked up my first set of Technics 1200’s when I was 15 years old and began playing parties all over the west coast (and continue this today). In 2011 I returned from being injured in Iraq and began playing around with different DAW software programs (mainly Ableton).
In 2012 I started to gather some different MIDI gear, and started learning the basics; quantization, mapping, etc. This is around the time I began getting some work signed by different labels around the globe (USA, Canada, Switzerland, France, and Germany). In the autumn of 2015, I began classes at Full Sail University, to receive my bachelor’s in music production, and one of the opening classes was on MIDI and the basics of working with MIDI, and different DAW’s.
These days when I’m making music for personal betterment, and fun, I’ve been using the new, Ableton “Push 2”, and absolutely loving its features. I also use Native Instruments’ Maschine along with the Traktor Scratch A10 setup. This allows me to use both Technics 1200’s, both Pioneer CDJ’s as well as add some personal flavor to each mix with the Maschine. When I’m doing school work, I’m mostly using my AKAI MPK 49, and the Novation 25 key, keyboards. Much of what we’ve been working on thus far is scales, chords, progressions, and other basic “musicianship” techniques.
Using MIDI has definitely broadened my technique when using different DAWs. In school we use it daily in both Logic Pro X and Pro Tools. It has made time management when working on tracks a breeze; piecing together a rhythm or a melody/harmony would otherwise take hours. MIDI has made sitting down in the studio each day and coming up with something “fresh and new” each time much easier.
Bio:
Doc Manny (AKA Steve Fields) hails from Oregon and has represented the Northwest house music scene since 1995. Doc has spent much of his DJ career living and playing in such cities as Portland, Vancouver, Seattle, Boise and San Francisco. Over the last couple of years Doc has headlined with such artists as DJ Mes (OAK), Johnny Fiasco (SEA), CZBoogie (CHI), Choco (1200 Warriors) (NYC), Mikey Valesquez (LA), Lurob (SF) and JT Donaldson (NYC), just to name a few. Doc resided in Honolulu, Hawaii from August of 2009 until April 2013, where he played such clubs as “Asylum Afterhours”, which is regularly noted as one of the top 10 nightclubs in the world in DJ Magazine. These days Doc Manny has built up his label affiliation by producing tracks and working with such chart-topping labels as Jack Locker, Midwest Hustle, Wetsuit Records, Home Again Recordings Digital (HARD), Caboose Records, O.X.O Recordings, Spins & Needles as well as Sutra Sounds. In the summer of 2012 Doc started the record label that has blown up on Beatport, Traxsource, Juno, iTunes and many other digital download sites all over the globe. This label is known as “House Call Records” and has already built up a list of affiliations by signing such names as Corduroy Mavericks, Tim Rella, Trevor Vichas, Forrest Avery, Louie Gomez, HapKido, Nic-E, the 1200 Warriors, CEV’s, 4 Peace, J. Caprice, Nick Jagger, Lipp Trixx, Marc Fairfield, Jason Wolfe, UC Beatz, and the list keeps growing and growing. Pushing the envelope even further with his passion for “tech-house” and “downbeats”, Doc started a sister label to House Call in the winter of 2014 known as “Double Wide Recordings”, which is already lining up the heat from some of the heaviest hitters in the electronic music industry. Wanting to further his career in the audio industry, Doc began his journey to receive his bachelor’s degree in Full Sail University’s online Music Production Program, where the curriculum covers everything from A&R to mastering/engineering a production. After completion of the program in 2018 you should start looking for “Steve Fields” compositions and “Doc Manny” production works to be showing up everywhere from TV commercials to your favorite music streaming service… Stay tuned!
The goal of my rig has always been that it must be compact and convenient for playing electronic music live, supporting flexible multichannel MIDI routing and multitimbrality. My first MIDI system was centered around my own DIY MidiBox hardware sequencer project, kindly shared by developer Thorsten Klose.
An 8-port MidiTemp MIDI router/processor and a Doepfer MMR4 MIDI merger/switcher are what allowed me to flexibly connect a number of MIDI controllers and route them independently to any of the connected synthesizers, through the fantastic MidiBox arpeggiators or directly, as appropriate for a particular situation. Unfortunately, even in a 4U rack this setup was not that easy to carry around due to the synthesizers and controllers involved. So luckily, after the iConnectMIDI4+ MIDI matrix router/processor and the Squarp Pyramid became available, I was able to squeeze my rig into something more portable – a single 4U space plus the brilliant LinnStrument.
The MIDI Association Artist Interview
* Tell us about yourself briefly.
Working as a designer, I’m also passionate about electronic live music… as much live as possible 🙂
* What was your first encounter with MIDI?
The early 90s, with a Yamaha QY-20 sequencer connected to a MIDI controller keyboard (RMIF TI-5 synthesizer) – that was my first MIDI setup.
* How do you use MIDI today?
Mostly to control sound engines from controllers while playing/jamming live, to program drum and rhythm tracks, and occasionally to capture performances and musical ideas.
* How has MIDI allowed you to do what you do?
MIDI means everything to me when creating and playing music live; without MIDI I would probably still be stuck at the prepared piano level and occasional synth jamming.
* Anything else you’d like to add?
My full story with embedded links follows below –
My connection to music is based more on texture-rich, multi-timbral orchestral classical arrangements than on guitar rock, so aside from occasional prepared piano experiments, my musical journey started in the late 80s, when I studied at Riga Technical University and got my hands on my first synthesizer – the RMIF OPUS (http://www.ruskeys.net/eng/base/opus.php) – a bi-timbral polyphonic analogue monster with functionality hardly available these days. It was a lot of fun layering sounds on analogue tape, but unfortunately the result after numerous overdubs through analogue delays and reverb was very noisy and, of course, there was no MIDI, no synchronization, no presets, which made it quite hard for me to reproduce anything.
In the early 90s I discovered MIDI with the Yamaha QY-20 sequencer. MIDI sequencing and multi-timbrality finally allowed me to create clear-sounding music and soundscapes, but for a long time I was not happy with the sound generated by digital sound engines. Later, switching to a computer with a Yamaha SW1000XG PCI card made sound design and production more pleasant (like the production below, called elektrogans-virtual-love), but the limited sequencing options upset me again, because at that time only linear sequencing was available on computers, whereas I wanted to play/jam live, combining MIDI with dynamic loops and live arpeggiators.
The next MIDI milestone for me was the Korg M3 workstation with the KARMA engine, which allowed dynamic live MIDI sequences and arpeggios but was quite hard to program and required computer software to do so effectively. At the same time I also found the amazing MidiBox project (http://www.ucapps.de) developed by Thorsten Klose and kindly offered to everyone as a DIY project. I learned that the MidiBox Sequencer fits perfectly into my MIDI usage concept, so I took on the challenge and successfully built one myself, also utilizing my engineering and design background, and again had a lot of fun playing live MIDI at my home studio. This soundtrack is an example of such a live performance without overdubbing.
However, the involvement of the MidiBox sequencer, MIDI controller keyboards, and hardware sound modules raised another challenge – how do you connect and synchronize them all together? So I had to build my first 4U rack-mount MIDI rig, which I built around a MidiTemp PMM88 matrix and a Doepfer MMR4/4 MIDI processor, allowing quite flexible MIDI routing and processing so that, with the simple touch of a button, I could easily route MIDI signals from any of my controllers to any of the sound engines, through or bypassing the MidiBox sequencer at any time, while keeping everything in sync – yet it was pretty hard to set everything up.
Great for the studio, but not so great for live rehearsals once I joined Charisma (https://charismalv.bandcamp.com), helping them with computers and synth sounds. Carrying all that equipment around on a regular basis (music is my passion, not my job) still seemed rather impossible. The introduction of the iConnectMIDI4plus and the Squarp Pyramid sequencer allowed me to build a new, more mobile, self-contained 4U rack which is compact enough, weighs only around 15 kg, and doesn’t require external connections other than simple audio and USB (MIDI via USB). But I had to sacrifice MIDI routing flexibility, because the iConnectMIDI4plus unfortunately doesn’t allow flexible enough reconfiguration on the fly from a remote controller, and the beautifully programmable arpeggios from the MidiBox sequencer are not there any more. Thus I still find the essential, full-featured connection between MIDI controllers and sound generators missing, and sadly I can’t code, so this remains in my vision and layout drafts.
The desire for multi-timbrality combined with a lack of computing/storage resources determined my choice in favour of a MIDI-based workflow as opposed to the common audio layering/overdubbing approach. So MIDI means everything to me for creating live layered multi-timbral electronic music, while at the same time I’m really impressed with the emerging MPE. When playing the LinnStrument, for example, I frequently feel that there is much less need for arpeggios and preprogrammed sequences, because MPE brings such a sparkling inspiration and breathtaking experience to me. I would definitely say that with MPE, MIDI has come to a whole new level, making electronic instruments more alive than ever before. I’m looking forward. The future is optimistic!
Moonwerk Labs Space Station was created 5 years ago by two aspiring musicians with a passion for Electronic music. We implement the use of MIDI with the help of MOTU’s MIDI Express 8×8 units. Dane Blaesing (the other 1/2 of moonwerk) is the mastermind behind the visual design of the studio, using LEDs to showcase EVERYTHING.
The MIDI Association Artist Interview
Moonwerk consists of two people; Dane Blaesing and Zac Pescetto. We started our studio around six years ago, slowly accumulating gear as time went by. We were introduced to MIDI right from the start when Dane purchased a Dave Smith Mopho. Fast forward; we are now using two MOTU MIDI Express’ to control everything in our studio. There are sixteen synths we have full control over through MIDI. From triggering notes, to controlling all parameters with cc messages and automation in Ableton Live. We also have a road setup to implement MIDI on the go, which includes a Motu Micro Lite and numerous portable synth modules. We hand-solder all of our audio cables routed to our MIDI and outboard gear. In addition, we’ve built our Black Lion summing mixers and make all of our own repairs/upgrades/modifications to our gear. Moonwerk does everything in house, from sound design to mixing & mastering, photography & graphics, lighting, and even hand building our gear stands and modular synth cases.
This is my “battery-powered studio”, currently unplugged since I’m missing a multichannel audio interface that could handle most of my synths at once for live play. I’m also hoping to record a new CD with all of them and play some live gigs in local clubs. I’m trying to limit myself to battery-powered hardware only, so as not to spend a fortune on audio and to get as many instruments as I can. My biggest dream is to buy the Suiko ST-50, TB-303 and OP-1. I’m also hoping to get the cash for the Roland Boutique series, and I’m 100% sure that I will get the upcoming Volca FM. My main DAW of choice is Propellerhead Reason, which helps me run my sound design and soundware testing company (www.naviretlav.com).
Here is the full list of the gear included in the photos: Suzuki Electric Taisho Koto, Korg Electribe, Gakken NSX-39 (Pocket Miku), Stylophone, Mixtape Alpha, Korg Volca Beats, Korg Volca Bass, Korg Volca Sample, Korg Volca Keys, iPad 4th gen (mostly for Animoog), Korg SQ-1, Teenage Engineering Pocket Operators, Novation Circuit, Yamaha QY70, Yamaha SU200, Yamaha QR10, iConnectMIDI2+, Korg Monotron and Monotron Delay, Zoom MS-50G, Behringer BCR2000, Line 6 KB37, Zoom H2n, Korg Kaoss Pad and Kaossilator, Sennheiser HD380 headphones and two notebooks running Propellerhead Reason.
The MIDI Association Artist Interview
•Tell us about yourself briefly.
So, here I am, a young sound designer and owner of Navi Retlav Studio, a small sound design studio that focuses on direct collaboration with Propellerhead Software, Rob Papen, Blamsoft, Synapse Audio, FXpansion and many more lesser-known VST developers. In general, my team and I work as freelancers with top-of-the-line VST developers, and our sounds can be found in over 30 products. My personal goal is to be as well known as Richard Devine and to set new standards in the music industry, especially if I can switch from VST to hardware sound design at some point in my career. Recently I have also been experimenting a lot with convolution reverb engines, and developing our own custom impulse responses that can be used like brand new effect devices rather than just average reverbs. Hopefully in the upcoming months I will finish coding our first VST/RE effect for commercial release. Meanwhile I also plan to record a new CD and a set of live video performances with all my gear.
•What was your first encounter with MIDI?
If I remember correctly, the first time I ever saw a MIDI 5-pin connector was on an old Yamaha workstation/arranger-style keyboard in my aunt’s house. She was learning the basics and I always liked to play a bit with that keyboard. Where it comes to software, I have a vague memory of playing with a tracker-style DAW on an Amiga 500. Then the first proper MIDI interface that I bought was the legendary BCR2000, which I still have and use to this day.
•How do you use MIDI today?
Today, I use MIDI with the iPad thanks to the iConnectivity iConnectMIDI2+, sync all my gear to the Propellerhead Reason DAW by linking it from my old BCR2000 and the new Korg Electribe, then I also use the pulse signal to sync my Korg Volcas and Pocket Operators, and finally, using the Korg SQ-1 sequencer, I play live sequences with the Yamaha QY70. On top of that, my BCR2000 is mapped in Reason as a controller for all the FX devices, and my Line 6 KB37 is the master keyboard for all synths. It’s really surprising how many synths you can chain together with MIDI and audio sync.
•How has MIDI allowed you to do what you do?
I use MIDI every day in my job. Without it I wouldn’t be able to control the RE and VST instruments that I work with and adjust parameters on the fly without reaching for the mouse. I plan to do many more live gigs in the near future, and if possible turn it all into a successful entertainment platform by streaming them live on Twitch and YouTube. For example, you can watch some of them here.
It’s funny that some people think MIDI is a relic of the past, but to be honest it’s an essential tool for all of us and I can’t see the future of music without MIDI.
•Anything else you’d like to add?
Recently I found that in some of my gear the classic MIDI 5-pin connector has been replaced with an audio jack and adapter. It’s a nice move, until you realise that there is no standardisation for it yet and the adapter from the Korg Electribe doesn’t work with the Novation Circuit. In general I wish that hardware developers could agree and keep it all the same across all devices. Even more, I think the idea of audio-jack MIDI could be expanded, and new synths could also have the option to send an audio sync pulse into that port, so we could easily sync our Volcas and other gear to that clock, or use a MIDI adapter if desired. I’m also waiting for the announcement of the first ever synthesizer with the new MIDI 2.0 protocol, which is still under development and might finally bring some new light and options for musicians.
To finish it all with the good vibe, here is the quick song that I made to bring back the retro style to our hearts.
Output REV
Output Signal
Waves Diamond
Waves Codex
Waves Element
UAD Plugins (numerous)
Xfer Serum
Propellerhead Reason 6
U-he Diva
U-he Braille
U-he Zebra 2
Motu Ethno Instrument
The MIDI Association Artist Interview
Tell us about yourself briefly.
I’ve been making music and dealing with Music Technology since the late 80s. I was a sound developer for Eye&I Voice Crystal (Yamaha, Ensoniq, Korg, Roland soundcards), LA RIOT (AKAI, Ensoniq Sample CDs) and Alesis (Quadrosynth) back in the early 90s. I taught Synthesis/Sound Design at UCLA for a semester in 96 (very cool gig I must say) and had various other studio gigs throughout LA and SF in the 90s. I did the score for Libertarian Presidential candidate Andre Marrou back in 1992. I worked at Opcode for a bit in the mid 90s (great people and products) and had two record deals, three records released and some movie scores to my credit. I am now known as Swirve and besides producing Dance records I am a partner in RemixxMe Productions here in Kansas City.
What was your first encounter with MIDI?
Wow, I’d have to say it was in 1987-88. (I’m getting old) I had a Juno 106, Ensoniq Mirage, Alesis HR16 and had just got the Roland Mks-50. I needed to trigger the Mks-50 from the Mirage. I remember working on an album back in the 90’s where we had a few Opcode Studio 5s connected to over 40 synths at the same time. MIDI sure was fun back in the day, still is.
How do you use MIDI today?
I use MIDI a lot still, not so much with MIDI cables, it’s all mostly through USB, save for the Voyager. The hard part about having so much hardware versus the software is keeping it all ready to go the moment the project loads. I use SoundTower (I Highly recommend it) to control most of my hardware synths either as a librarian or even plugin in LOGIC. MIDI allows for the gear to send various MIDI data to the units in addition to patch information, making project recall and patch perusing much better and even on par with some soft synths I have.
How has MIDI allowed you to do what you do?
I know I covered a lot of this in the previous answer. In electronic music, I’m constantly evolving the sounds, either through plugins or through MIDI data. Being able to connect to all this various gear, with a standard such as MIDI, allows me to be in a project, change a sound source, and use the exact same MIDI data on the new sound source without a hiccup. I am sure I am one of the few who keeps the event list open in LOGIC the entire time I am recording.
Anything else you’d like to add?
It is a privilege and an honor to be chosen in the top ten for this contest, I’ve seen some amazing rigs out there. As for MIDI, It’s a testament to the standard that it has been widely used this many years later even after the rest of the music industry has changed so dramatically. MIDI is still MIDI and that’s a good thing. As for my wall, I got the idea from several music stores, and figured “why not”. So I had my contractor reinforce the wall to support the weight of 8-10 synths. Thanks to the folks at Gear Sluts for the great idea on where to find the adjustable keyboard arms.
It was during high school when I was about 16 years old, so maybe 1991. I had started buying a bunch of analog and digital keyboards at second-hand shops. I had a Jupiter-6 (one of the first two instruments to have MIDI) and a Roland TR-909, and I noticed these jacks on the back: MIDI In and Out.
I started researching ways to connect things together. I initially started out with CV/gate connections on my ARP 2600, experimenting with CV, controlling the ARP 2600 with the ARP sequencer and learning the basics of sequencing. Shortly after this I decided to integrate MIDI into my workflow and began researching how to build my setup around a MIDI sequencer. I got a computer, and also bought two Alesis MMT-8 MIDI sequencers. I spent several months learning this setup, and discovered that MIDI was going to play a big role in my musical productions.
At that time there wasn't a big Internet presence, so I would read magazines like Future Music and Electronic Musician, which used to cover a lot of the MIDI setups of artists I was heavily following at the time. I learned a lot about how to work with bigger MIDI setups, and soon after that bought my first MIDI interface from MIDIman.
TMA:
You mentioned CV and gate. It's amazing that there was never a standard for such a simple interface protocol; there were two ways to do it, and they both still exist. How important is it to you that MIDI products from 20 years ago still work with your newer gear?
Richard Devine:
For my work in particular it's extremely important; I need access to both old and new technology at the drop of a hat. I never know what the client will ask for. I do a lot of editing with the piano roll editor and MIDI CC automation. Almost every job that I have worked on has used MIDI extensively, whether controlling virtual instruments or hardware units in the studio. It's become the central communication system for all my instruments.
I don’t know what I’d do if it went away. Without MIDI I’d be totally screwed.
by Richard Devine
Sometimes I have to turn in stuff within an hour. I'm working with Google on a pretty hefty virtual reality project and the turnaround times and deadlines are incredibly tight and hectic. So in designing my studio I have two setups, the mixing area and the editing station. Both areas are completely MIDI'ed together and ready to go. Everything is always armed and ready to record.
I'm really excited because after this Google project is finished I am going to rewire the entire studio and use the new iConnectivity MIO10 systems. I have Cat5 wiring on both the rear and back walls of my studio. I use them now for connectivity to my digital console and to my personal cloud storage work drives. I am going to make the leap to network MIDI and use the Ethernet connections on my Mac Pro and MacBook Pro. The goal is to have either computer be the controller of all my MIDI devices. With the built-in software I can map out different port configurations and filtering very quickly and also store and recall presets. I'm thankful a company like iConnectivity has kept up with the latest trends in technology. I haven't found any other company that is currently addressing MIDI interface design for more modern computers and hardware.
It's a great time to be making music with MIDI now.
by Richard Devine
Article Update
Our friends at Synthtopia let us know that Richard created a free sample library based on samples of the mysterious Ringing Rocks of Montana.
The Ringing Rocks are a unique geological formation featuring rocks that ring musically when tapped with a hammer. They are natural lithophones – instruments that produce sound when struck.
A little over 7 years ago, when we started, we were going to be a gas and oil services company. The original idea was to allow people to use an iPhone to take data from oil fields. We pivoted and thought about products that would connect to an iPhone. One of the people in the company was a professional musician, so we decided MIDI would be a good place to start because, from an engineering point of view, it's pretty simple. We ended up having a distribution channel and manufacturing in the music industry, so we decided to stay. It's been a constant evolution of building our brand. Last year at NAMM was the first time that we really started to have brand recognition, where people would come to our booth and say "hey, we know you and you can solve our problems."
We think the new MIO series is really the culmination of our goal of developing the most sophisticated MIDI devices on the planet.
We sent Richard two MIO10s because you can connect two MIO10s together and control all the mapping and filtering for both.
All our products have our core MIDI data management system. One of the advantages of that system is that we can send System Exclusive messages to change all the channel and port mappings and filtering. It's amazing because we use MIDI to control the MIDI management in our devices, so our products are completely MIDI-centric.
Another advantage is how efficient MIDI is. In our MIDI management system we process everything within 200 nanoseconds.
Check out this video about using the iOS app iMIDIPatchbay with iConnectivity devices. It’s got lots of great tips about MIDI routing.
Michael Loh from iConnectivity gives a brief overview of the mio series of USB/MIDI interfaces. Not only can you connect multiple computers to mio interfaces…
Animusic produces innovative music animation by leveraging MIDI data in creating “virtual concerts”. The animation of graphical instrument elements is generated using proprietary software called MIDImotionTM.
Manipulate the state of your mind using Light Scape sequences. Enjoy the psychedelic color experience that the eLSD induces. It gently pulses light and color in front of your eyes, influencing your brain waves and the state of your mind.
There has always been a connection between astronomy and music. In fact, if you go all the way back to the ancient Greeks, Pythagoras first identified the relationship between musical pitch and numerical ratios. He discovered the 2:1 ratio of the octave: a string and a string of exactly half its length sound in perfect consonance. In fact, the Greek word for ratio is logos. A perfect fifth has a ratio (logos) of 3:2, and a perfect fourth sounds in the ratio of 4:3. Many popular songs are still based on the same logic, or ratios.
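If you want to play with the arithmetic yourself, here is a minimal TypeScript sketch. The 440 Hz reference pitch and the cents conversion are just illustrative choices, not anything from the sources above:

```typescript
// Pythagorean interval ratios applied to an arbitrary reference pitch.
// The 440 Hz reference (concert A) is an assumption for illustration only.
const REFERENCE_HZ = 440;

const ratios: Record<string, [number, number]> = {
  octave: [2, 1],        // 2:1
  perfectFifth: [3, 2],  // 3:2
  perfectFourth: [4, 3], // 4:3
};

for (const [name, [num, den]] of Object.entries(ratios)) {
  const freq = REFERENCE_HZ * (num / den);   // pitch that interval above the reference
  const cents = 1200 * Math.log2(num / den); // size of the interval in cents
  console.log(`${name}: ${freq.toFixed(1)} Hz (${cents.toFixed(1)} cents above ${REFERENCE_HZ} Hz)`);
}
```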
In a philosophic theory known as the Harmony of the Spheres, Pythagoras suggested that celestial bodies emit their own unique vibrations based on their orbits.
"As the eyes, said I, seem formed for studying astronomy, so do the ears seem formed for harmonious motions: and these seem to be twin sciences to one another."
by Plato – Republic VII.XII
ALMA Sounds combines radio astronomy captured by the Atacama Large Millimeter/submillimeter Array (ALMA) and music. Astronomer Antonio Hales and Universidad de Chile engineer Ricardo Finger determined a way to interpret the electromagnetic spectra of the Orion Nebula and transform them into digital sounds.
“The ALMA sound bank is made up of a series of drum shots created from the sound pattern of electromagnetic spectra from the Orion Nebula. The emission lines of molecules present in the Orion Nebula were transformed into musical notes, transposing the ‘chords’ from the skies into sound chords. The sound bank is, as its name implies, a digital library of sounds available to the musical community.”
by Astronomer Antonio Hales
At the Sónar+D festival in Barcelona, Spain, between 6 and 18 June 2016, the ALMA Sounds project will have a stand with digital instruments with MIDI interfaces, enabling the audience to interact directly with the ALMA Sounds cosmic sound bank. The Chilean collective Trimex will also present an artistic representation of the planets in the Solar System.
Here is a selection of songs that were done using the ALMA Sounds sound bank.
The sound bank is available for free download here.
Dana Dolfi has created what is probably the loudest MIDI controlled instrument ever made out of recycled ship, truck and train air horns and steam whistles.
Dolfi, a pipe-fitter and project manager for Chapman Corp. in Washington, Pennsylvania, sets his 3-ton, red, white and blue machine atop a car trailer and performs at Maker Faires, July Fourth events and graduation parties.
But his contraption (it's great when you can use an old-timey word like contraption in its proper context) creates some limitations on where he can perform. The GAHM, or Great American Horn Machine, is as loud as a jet engine, so Dana marks off a 100-yard perimeter around the instrument and even then recommends ear plugs. It's no wonder it is so loud: a gasoline-powered air compressor and a 620-gallon air tank blow the horns and whistles.
Among the horns and whistles Dolfi has collected are a large horn from the USS Mississinewa, a Naval replenishment oiler; horns off a Coast Guard cutter and an ocean-going dredge; a horn that was used on a California drawbridge; a set of horns from a Great Lakes ore freighter; whistles from the Donora American Steel and Wire Works; an 1890s whistle from a fire hall in Gloucester, Mass.; a whistle from an antique popcorn machine, and several train and towboat horns and whistles.
by Karen Mansfield, Staff Writer for the award-winning Observer-Reporter, a daily newspaper headquartered in Washington, Pa., that has been part of Southwestern Pennsylvania since the early 1800s.
The Great American Horn Machine has performed at lots of Maker Faires and here are a couple of video examples.
Not to be outdone by McDonald's or Bud Light, Pizza Hut in the UK has worked with TMA member Novalia and released a limited edition DJ pizza box.
DJ Vectra
TMA member Novalia has recently had three very successful partnerships to bring interactive conductive ink installations featuring BTLE MIDI to SXSW, McDonald’s locations in the Netherlands and Moogfest 2016.
Novalia creates paper thin self-adhesive touch sensors from printed conductive ink and attached silicon microcontroller modules. Their control modules use Bluetooth MIDI connectivity.
Novalia’s technology adds touch, connectivity and data to surfaces around us. We play in the space between the physical and digital using beautiful, tactile printed touch sensors to connect people, places and objects. Touching our print either triggers sounds from its surface or sends information to the internet. From postcard to bus shelter size, our interactive print is often as thin as a piece of paper. Let’s blend science with design to create experiences indistinguishable from magic.
by Novalia
DJ QBERT: INTERACTIVE DJ DECKS
One of Novalia’s first big projects came about when they worked with DJ Qbert to create the world’s first interactive DJ Decks in an album cover.
QBert's Kickstarter-funded Extraterrestria album featured a set of working Bluetooth MIDI decks and controls that connect to iOS and OS X. Touching the paper connects to the Algoriddim djay app, allowing the user to scratch, mix and fade any songs they load into the software. Complete with two decks, a crossfader and an array of SFX buttons beautifully printed onto paper using printed electronics, with artwork designed by Morning Breath for QBert, the decks demonstrate the possibilities for interconnectivity between the physical and digital in a way that can enhance the user experience.
Bud Light Interactive Wall at SXSW
At SXSW, Novalia partnered with SXSW sponsor Bud Light and Mediacom, to create interactive music walls that connected people to music through touch, connectivity, and data. Up to 15 people could interact with the music simultaneously.
Wasn't it fun when we were kids to doodle on restaurants' paper placemats with crayons? Well, McDonald's has introduced a high-tech, musical version of that sort of play with McTrax, a snazzy placemat that acts like a little music production station. TBWA\Neboko in the Netherlands created McTrax.
However, there was very little explanation of how MIDI enabled all of this cool technology. The placemat, developed with This Page Amsterdam and Novalia, uses conductive ink, a small battery and a thin circuit board with 26 digital touchpoints. The Novalia board sends MIDI signals via BTLE MIDI to a smartphone running the McTrax music app. The app contains audio loops, synths and musical effects, and you can even sample your own voice.
Recently Kate Stone, CEO of Novalia, was at Moogfest, where she presented some Novalia controllers alongside DJ Lance Rock (Yo Gabba Gabba!), Mark Mothersbaugh (Devo), Bootsy Collins (Parliament/Funkadelic), Malcolm Mooney (Can), Peter Conheim (Negativland), Van Partible (Johnny Bravo), and DJ Nanny.
Bud Light, McDonald's, Yo Gabba Gabba!, Devo, Parliament, Novalia conductive ink and BTLE MIDI.
Okay, I admit it. I'm a sucker for this sort of thing. First, I have been into MIDI my whole life. Second, I lived in Japan for 7 years and have climbed Mt. Fuji (twice). Third, I love the combination of MIDI and art, especially visual art.
So when I saw this I was hooked even before I watched it.
The beautiful backdrop of the World Heritage Site Mt. Fuji was used to stage the first live performance using MIDI-controlled LED flying machines, accompanied by shamisens, the traditional Japanese guitars. This was done by utilising more than 20 of these flying machines, flight swarming formations, music, and 16,500 LED lights, combined into a single audio-visual extravaganza. Furthermore, we are able to control the flying machines and the visual and audio aspects concurrently, using DMX512.
Creative Director: TSUYOSHI TAKASHIRO. Film Director: SHU SHINKAWA. Director of Photography: KANEKO SATOSHI. Flying Videography: DISUKE OHKI. Music: Tsugaru-Shamisen, OYAMA-KAI. Sound: MANABU NAGAYAMA & MASAKAZU UEHATA. Production Company: FPI, Inc. Produced by MicroAd, Inc.
"The sky is one of the few remaining frontiers in the new cyberspace; it is the objective of this project. I would like to reach out to as many people in various places with this 21st-century version of fireworks."
This post was contributed by the well-known film composer Jeff Rona. Jeff was the first president of the MIDI Manufacturers Association and ran the MMA from 1983 until 1992. Jeff was instrumental (pun intended) in getting MIDI started and gives us an inside look at the beginnings of MIDI.
We recently did an exclusive interview with Jeff where he talks about how he uses MIDI today in his film composition and why “the studio is his instrument”.
In 1982 I was a young composer writing music for theater, dance, and programming synthesizers for a few recording artists to earn money. But an unexpected and odd opportunity came to me that seemed right to try at the time. I was really one of the first people in Los Angeles experimenting with linking desktop computers (a very new thing at the time) with synthesizers. I had a computer mentor of sorts, a scientist from Jet Propulsion Laboratories whose hobby was developing hardware and software to make music. All very experimental – but amazing things were possible with some effort. I learned just enough about writing computer code to be dangerous. It was all purely musical. I was by no means a software expert. But I had a good aptitude for it. I was eventually invited to speak about computers and music at the first TED conference.
There I am at the TED Conference (circled) with the group, courtesy of PANTONE.
I was at a local music store in Hollywood and struck up a casual conversation with a couple of guys from Roland who happened to be there at the time. When I told them what I was doing with synths and desktop computers, they got very excited. Within a couple of days I found myself in the office of Tom Beckman, the president of Roland US, explaining my work and background. When he asked me if I wanted a job and whether I could write code for music software, I lied, basically, and said yes. I became a programmer and instrument designer for Roland that day. Within a few weeks of starting (I quickly got a programming coach to help me get up to speed fast) I had my first official meeting with some of Roland's top engineers and designers, who were in LA from Roland headquarters in Japan. We hit it off very well right from the start. I had learned a few words of Japanese and did my best to express my deep admiration for their work (one of my guests had designed the TR-808 drum machine!). They brought me two prototype keyboards. They showed me a 5-pin jack on the back of each and said "we think this is very useful…we want you to devote all your time to writing software for this." These were likely the first 2 MIDI instruments in the country. The plan was to develop software to show what could be done with combining keyboards and sequencing. I was blown away. I had already written some software to sequence analog synthesizers with a pre-MIDI computer interface. This was a whole new world.
A few months later I was asked to represent Roland at a small private meeting at the NAMM show to discuss how American musical instrument companies might be able to coordinate their efforts in making MIDI a true standard that was useful, functional, and consistent. I can’t remember everyone at that meeting, but I do remember Bob Moog, Tom Oberheim, Dave Smith, Roger Linn, as well as engineers and designers from Yamaha, Roland, Akai, Casio, Korg, and a few other companies that had gotten a start on MIDI. (This was also the 1983 NAMM show where MIDI was shown to the public for the first time by my Roland US cohort Jim Mothersbaugh).
The agreement around the table was that a strong need existed to create a coalition of all interested parties to help get MIDI off the ground and into wider use by as many musical instrument manufacturers as possible. The issue then was to find someone willing to get this technical cooperative started. Silence. No one had any interest in taking on the rather monumental task of figuring out how to form an organization for musical instrument companies – competitors – to disseminate, share, develop and test this brand new technology on a grand scale. As the newest member of this group I had the least amount of work responsibility. And the idea of bringing all this together was absolutely compelling to me. I spoke up and said I would take on the responsibility to try and get an official MIDI governing body together. I remember Tom Oberheim, who I’d never met before, saying “Fantastic! And who are you?”
Over the next several weeks, the enormity of the task became clearer. While Dave Smith, head of Sequential Circuits, was the man who initially conceived of a universal protocol for all musical instruments, a lot of the nuts and bolts of MIDI happened in Japan – primarily with one engineer at Roland working with one engineer at Yamaha. I had already become close to both of them, and had started helping on the design of Roland’s first MIDI/computer interface, called the MPU-401. So I began an ongoing dialogue with them both to discuss the challenges of making MIDI a universally accepted technology by every interested instrument company in the world. Several companies had already vowed to never touch MIDI for a variety of reasons both technical and political. Some were bigger players in the industry. And a lot of companies simply hadn’t heard about our work yet.
I went to a lawyer in Los Angeles to set up an official not-for-profit corporation to be the official entity for MIDI’s development. It would collect dues (tax free), generate the official technical documents for engineers to follow, and oversee further development to the hardware and software layers of MIDI. I had to think of a name for the group for the incorporation papers, and came up with the MIDI Manufacturers Association – The MMA. By mid 1983 we were off to a good start with about 10 or 12 members. We made a pact to work in tandem with our Japanese counterpart, the Japanese MIDI Standards Committee (the JMSC or just “MIDI Committee” for short).
Most technical standards are overseen by sanctioned governmental committees and highly rigorous legal procedures. The various digital audio specifications, broadcast standards, time code formats, video formats, and the Compact Disc were all technologies started with the cooperation of private companies working with government standards groups and protocols, and these all took years to complete before they made their way to the public. Many technologies are half obsolete before they even make it into stores. We didn't want that, and so we decided to do what we could to steer clear of any governmental oversight. It did cause problems. For example, if MIDI were to have an official logo (like compact discs did), who would decide that a company had implemented MIDI fully and correctly and could display the logo? And what if they didn't? Could we stop them? Who would make the call, and would it stand up in court? How would future added MIDI protocols be ratified as 'official'? Would we grant licenses to companies for a fee? Who owned MIDI? While this made some people a bit nervous, we set all those potential worries aside to focus on the best ways to just get MIDI out into the world. The companies there at the beginning had a sense that MIDI would help sell a lot more keyboards – a good incentive to move quickly. Little did anyone know at the time how explosive the success of this technology would be. It was seen then as little more than a technique to help higher end musicians work with multiple keyboards on stage or in the studio. Nothing radical – just easier.
The NAMM show takes place twice per year. Winter NAMM is in Anaheim, California, across the street from Disneyland an hour south of Los Angeles. The summer NAMM was usually in Chicago. But this particular year the event had been moved to New Orleans. As the newly appointed head of the MMA (more a coin toss than an election) I gave myself the task of organizing a private meeting there and inviting instrument companies from around the country and throughout Europe to attend a meeting to show what MIDI was, and to try and get the MMA moving forward. Members of the JMSC offered to attend to officially recognize the MMA for all companies using MIDI outside of Japan. There was also at the time a new users group for interested musicians to learn about this cool new MIDI thing. It was run by an LA-based musician named Lachlan Westfall, and we had become good friends. He was also an adept print layout artist, and I was in the midst of translating and editing the 1st edition of the official "MIDI Specification 1.0" from Japanese to English for MMA members to use as a reference. Lachlan helped me put that together and we agreed to continue helping each other out in different ways. We both spent days poring through music magazines looking for any company we thought might be interested in using MIDI, and I sent invitations to come to NAMM to be a part of this new MIDI and MMA movement. Getting rivals and competitors to sit down together was unheard of. Before MIDI there was never a need to discuss anything of mutual benefit. I was hoping to double the size of the organization and maybe get up to 20 or so members that summer.
Uncertain anyone would even attend, I booked a small private meeting room at the New Orleans Hilton, got refreshments, printed up copies of the new MIDI Spec, and put together an itinerary for the meeting. I was incredibly nervous, as this was the first time I had used MMA money for anything. Not only was a lot riding on this, but there were still a number of detractors who didn't see the MMA getting off the ground. I walked into the room to begin the meeting, and instead of the 20 or 30 people I expected, there were over a hundred – engineers and executives from every instrument company, audio company, and music magazine I'd ever heard of. This was far beyond anything I could have hoped for.
I’d invited Karl Hirano, Yamaha’s chief engineer at the time (and developer of the DX7), who was also the president of the JMSC, to say a few words. He graciously spoke to acknowledge the MMA as the only technical group with the power to develop and ratify new MIDI protocols outside of Japan. By the end of the meeting, all the major instrument companies, as well as young startups were on board. MMA, and MIDI’s development, was in full swing. Some of those little startups there went on to be some of the most successful music and audio companies in the business.
Karl Hirano was a synthesizer engineer for Yamaha in Japan during the great MIDI boom of the early 1980s. In fact, Karl was a member of the team that gathered at the 1983 NAMM Show to discuss the MIDI standard.
There were plenty of kinks along the way, but we developed a working method for rapidly proposing, amending, and approving new elements to MIDI. And while many new and improved implementations for MIDI came from Japan, the one person in my opinion who pushed MIDI forward more than anyone was a young engineer (also from Sequential Circuits) named Chris Meyer. Chris is a full-tilt genius with an incredibly low tolerance for egos, errors, wasted energy, or bullshit of any kind. Serious on the outside, delightful on the inside, he was absolutely incredible to work with, and he kept the rest of the MMA, myself especially, on its toes at all times.
Obviously, MIDI has been a runaway hit far beyond anyone’s wildest expectations at the start. It is ubiquitous. Eventually we did get called up by one of the US governmental technical bodies to tell us that if we didn’t slow down and do things by the book, MIDI was heading for nothing but lawsuits and eventual destruction. We agreed to meet and discuss the option of changing to a different method. It would involve dissolving the MMA and allowing an organization such as AES or SMPTE to take over and run things “properly”. It was an odd meeting – again in a back room at another NAMM show. It was a rather stodgy, unnamed member of that governmental body (wearing two pairs of coke bottle thick glasses – legally blind I imagine, and utterly geekish), Bob Moog, Chris Meyer, one other engineer, and myself. And it was actually a rather brutal meeting. We were lectured like we were children about to crash our bicycles over a cliff, with all the potentially dire consequences listed out for us.
But afterward it was clear to all of us at MMA that we simply had to stay “rogue” or we would have to stop all the amazing change going on right then for the entire music industry. MIDI instrument development had still only been in full swing a few years, but already we were introducing protocols for synchronizing video machines, multi-track systems, lighting boards, automation of all kinds, samplers, patch editors and librarians, and especially computer interfacing and sequencing – and it was really going well. In all of that early rapid development and deployment only a tiny handful of products ever made it to market with real flaws in their MIDI support, which was a major coup for the MMA.
Regardless of how things “should” have been done, we were doing things right, and the music industry was going crazy for it. MIDI brought synthesizers so much further into the mainstream of music production and live performance. In my estimation, no other digital technology, maybe no other technology of any kind, has ever succeeded at the pace and with the success of MIDI on a global scale.
I ran the MMA for 7 years. In the middle of my time there I took a break for a couple years to focus more on my music, but returned to keep things moving as smoothly as possible. But as my work as a musician in recording studios and eventually my composing for film and TV took off, I had to give up my role in the MMA. It was incredibly sad for me to leave, but I was no longer an active developer, having left my job at Roland a few years earlier. Those wonderful geeky people that started the whole thing, virtually all superb musicians in one way or another, had become some of my close friends and favorite people.
These days I attend NAMM shows to find the best new hardware and software for my studio, and I am fortunate enough to still run into a lot of the people that were there from the start. Some of the smartest people I’ve ever met. And we share a smile for something that we can all be very, very proud of.
I know I am.
Updated with a YouTube Interview by Orchestral Tools
We found this excellent YouTube interview by Orchestral Tools and thought it would be a great addition to this article about the first President of the MIDI Manufacturers Association, Jeff Rona.
“Fingers” – guitarist for Compressorhead.
It is equipped with two hands, with a total of 78 fingers.
People who tinker with robots, art installations and circuit bending are right in our wheelhouse. They seem to share a passion for pushing the limits of what MIDI can do. Here's a quick selection of some of our favorite MIDI robots curated from the web.
Eric Singer and the League of Electronic Musical Urban Robots (LEMUR)
When we first started The MIDI Association, we reached out to Eric Singer, and he has been a member of our educational advisory panel since the very beginning. Eric founded the League of Electronic Musical Urban Robots, or LEMUR, in 2000. LEMUR is a group of artists and technologists developing robotic musical instruments that play themselves. Here's an interview with Eric from Motherboard.
Recently Eric did an installation for the LIDO nightclub in Paris.
The GuitarBot is a robotic guitar that can be played by MIDI files.
Eric Singer worked with another member of our educational advisory panel on this project.
Paul Lehrman is the Director of the program in Music Engineering at Tufts University and an adjunct professor in Computer Science and Mechanical Engineering. Paul has had a long relationship with MIDI and actually released the first all-MIDI album, "The Celtic Macintosh," in 1986. Paul has also been on our educational advisory panel since its inception.
Paul Lehrman showing Herbie Hancock the joy of MIDI.
Eric and Paul worked together on Antheil's Ballet Mécanique, which was performed at Carnegie Hall and at the National Gallery of Art, among many other performances around the world.
Perhaps Eric's most well-known project (although he has worked with They Might Be Giants and many other musicians) was his work with Pat Metheny on the Orchestrion Project.
Chico MacMurtrie and the Robotic Church
Somehow Brooklyn has become a haven for musical robots, and there are a number of robotic MIDI artists working there. One of our favorites is Chico MacMurtrie. His group, Amorphic Robot Works (ARW), created a robotic, MIDI-driven band of 50 pieces that toured Europe for many years. Now he has "revived" these mechanical "saints," and 35 computer-controlled pneumatic sculptures ranging in size from 12 inches to 15 feet are installed in a former Norwegian Seamen's Church in Red Hook known as The Robotic Church.
While responding to computer language (MIDI), they are anthropopathic in nature and channel air to activate their inner biology.
Matt Steinke’s dense, funny, haunting installations and performances feature everything from animatronic puppetry and meticulous animation to interactive homemade robotic sound apparatuses. Each piece offers an incomplete glimpse into an evocative, elegant, claustrophobic cosmos.
Steinke holds an MFA in Art and Technology Studies from The School of the Art Institute of Chicago. Upon graduation, he received the Illinois Arts Council Fellowship for Interdisciplinary/Computer Art. He received the 2015 New Music USA Project Grant for Composers. His "Tine Organ" instrument was a finalist in the 2015 Margaret Guthman Musical Instrument Competition. His work has been featured in Wired, Artweek LA, The Village Voice, The San Francisco Bay Guardian, Spin, Rolling Stone, Keyboard Magazine, Drum Magazine and on the cover of Tape Op. As a founding member of the Northwest noise-punk bands Mocket and Satisfact, he has made over a dozen recordings for Kill Rock Stars, K Records, and Up Records, and has performed with his homemade robotic musical instrument ensemble, Octant, across the US.
His Tine Organ is MIDI-controlled.
Tesla Coils, Robotic Drummers and MIDI, what’s not to like!
Designers of the original Singing Tesla Coils, ArcAttack specializes in providing innovative entertainment, Tesla coil fabrication and creating unique things.
Compressorhead has performed at festivals around the world doing covers of classic rock songs, but have their own studio album planned for this year.
Georgia Tech Center for Music Technology
Gil Weinberg is the founding director of the Georgia Tech Center for Music Technology, where he established the M.S. and Ph.D. programs in Music Technology. He has developed robots that interact with humans in uncannily human ways. The Georgia Tech Center is really doing some interesting stuff with music technology!
Shimon, Gil's marimba-playing robot, has "eyes" that can respond to the conductor's baton.
Gil also worked to develop a prosthetic robotic hand for Jason Barnes, a drummer who lost an arm in a freak accident. Though technically not MIDI, it is a truly inspiring story.
Guthman Musical Instrument Competition
If you don’t follow it, you should. The Guthman musical instrument competition is held every year and there are always really cool and unique instruments that show up like the one below.
5 different sites and 5 different views of the same song, but they are all driven by MIDI.
Multimedia artist Andy Fillebrown creates visualizations of classical, public domain musical compositions. His YouTube channel, audiosculptures, is filled with 100+ spellbinding journeys through a tunnel of glowing, pulsing notes.
by The Kid Should See This
Our own Eric Singer (Eric is on our educational advisory panel) posted this next video, which uses three MIDI-driven robots to play the Rimsky-Korsakov piece.
Not to be outdone by a machine (pun intended) here is yehezkelraz performing the piece on an Ableton Push.
Here’s another look at this piece from the Black MIDI point of view. If you don’t know about Black MIDI check out this post.
Here is Hollywood Virtual Audio's orchestral version, highlighting a different way to use MIDI.
If you want just plain old MIDI, here are links to one of the most popular sites on the web for classical MIDI files.
Nikolay Rimsky-Korsakov (composer 1844-1908) – Play or download MIDI files from Classical Archives (classicalarchives.com), the largest and best organized classical music site on the web.
In an earlier MIDI history blog, we talked about the strange symbiotic relationship between player pianos and MIDI. But one of the things we didn’t talk about was Conlon Nancarrow, the 20th century American composer who lived most of his life in Mexico and wrote most of his pieces for player pianos because the music was impossible for humans to play. Here is a really nice overview from Adam Neely.
Black MIDI – Is it music, art, both or just kids trying to break their computer?
In 2011, a Japanese YouTuber named kakakakaito1998 uploaded the first Black MIDI video.
Black MIDI is a strange combination of music and visuals, created with notation software or often with a music learning game called Synthesia. The idea is to put so many notes in the pieces that the notes themselves become a type of synthesis and also create stunning visual effects.
This original video has over 220,000 views, but that's small compared to something like Bad Apple from the Blacker-SuperMariobros2 channel, which has over a million views and over 8 million notes!
Black MIDI continues to rise in popularity. Look at this Google Trends chart of Black MIDI searches.
Below are links to several articles about Black MIDI that provided information for this article, links to some of the more famous Black MIDI YouTube channels, and a link to the Synthesia game. Maybe you'd like to try Black MIDI for yourself!
I like to make ridiculously impossible piano MIDIs and then kill my computer with those midis. Me and TheTrustedComputer make the best quality impossible mid…
Black MIDI, computer stuff, etc. Senior high school student. Currently in the BMT. I edit my videos. Nobody wants to watch an hour of lag, lmao. I do not rea…
Featuring: Dave Smith, George Duke, Tom Oberheim, Alan Parsons, Jordan Rudess, Craig Anderton. For 30 years, MIDI has been at the forefront of music technology even as musical trends change. Watch a star-studded panel of MIDI instrument designers and musicians talk about the past, present, and future of MIDI with MMA CEO/President Tom White.
MIDI 30th Anniversary Exhibit
Here’s how we celebrated the incredible history of MIDI at 2013 NAMM:
We recreated the 1983 MIDI launch with the first two MIDI keyboards, the Roland Jupiter-6 and Sequential Prophet 600, connected via MIDI.
We displayed historical documents about the development of MIDI and the MMA, and some well-known MIDI products from the past.
We displayed some of the latest products using MIDI technology, and gave attendees the opportunity to jam with each other.
We used a 30-year-old Commodore 64 as a MIDI sequencer for an iPad (running Animoog).
C64 meets iPad
2013 Booth Photo (“Present”)
2013 Booth Photo (“Past”)
1983 NAMM Photo
We handed out nearly 5000 "MIDI 30 Years" commemorative pins, and five lucky people seen wearing the pins received prizes, including one Gibson Les Paul Standard Grand Prize.
MMA Annual General Meeting 2013
The Annual General Meeting of MMA Members began with a General Session at 9:00 am for members and invited guests, followed by a closed (members-only) Technical Session at 10:00 am.
MMA Status Report
MMA President Tom White explained how MMA’s mission is providing for interoperability of MIDI products, and explained the various technical proposals and business development projects being managed by MMA on behalf of the industry.
HD Protocol
MMA AVBTP Payload Types – The new IEEE AVB Transport Protocol specification (IEEE-1722) includes a reference to MMA for specifications now under development that define the transport of MIDI and HD Protocol.
Updated MIDI Electrical spec – A proposal for MMA to publish a new circuit diagram using current components and practices and allowing for 3.3V power supplies.
IEC MIDI Specification – A proposal is under consideration to submit some portion of the MIDI Specification to IEC for standardization, to improve recognition as a standard by emerging countries.
Universal SysEX ID for iOS OMAC – A proposal to improve communication among iOS music apps.
MIDI Home Control Specification – A proposal to provide interoperability among home A/V devices.
Web MIDI API – The web standards organization (W3C) has developed technology to enable browser-based audio apps, including support for MIDI input/output. Florian Bomers (Bome Software) presented an overview of the Web MIDI API and encouraged MMA members to advocate for browser support (see the sketch after this list).
Bluetooth LE – Tom mentioned a few companies interested in using this technology for MIDI.
Logo and Trademark Protection – MMA is working with AMEI in Japan and CMIA in China to prevent dilution of the meaning of “MIDI” in China caused by registered marks being used on non-MIDI products.
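For anyone curious what the Web MIDI API mentioned above looks like from a web page, here is a minimal sketch. It assumes a browser that has shipped Web MIDI support and at least one connected MIDI output; it is an illustration only, not part of any MMA proposal:

```typescript
// Minimal Web MIDI sketch: list the available outputs and send a test note to each.
async function playTestNote(): Promise<void> {
  // requestMIDIAccess() asks the user for permission in supporting browsers.
  const access = await navigator.requestMIDIAccess();

  for (const output of access.outputs.values()) {
    console.log(`Found MIDI output: ${output.name}`);
    output.send([0x90, 60, 100]);                        // Note On, middle C, velocity 100
    output.send([0x80, 60, 0], performance.now() + 500); // Note Off half a second later
  }
}

playTestNote().catch((err) => console.error("Web MIDI not available:", err));
```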
Tom also reported on MMA’s “MIDI Makes Music” promotional campaign for 2013, and the Technical Grammy to be awarded February 9th to Dave Smith and Ikutaro Kakehashi for the invention of MIDI.
Following the General Session the MMA members discussed each active technical proposal in more detail.
For years now, Brent Ross, Creator & Designer of the DC Cemetery, has been creating some of the best Halloween haunts using all kinds of technology, including MIDI. Here is an article from 2008, updated with the 2014 video and information on where you can see the 2017 version of the DC Cemetery.
It's the week before Halloween, and a line stretches down around a corner. Muffled sounds of screams drown out the nervous laughter of the people in line as it creeps forward. As you round the corner and head towards the house, the light in the trees casts an eerie shadow on the ground. You find it hard to contain your own inner child's excitement as you near the house. You can feel your pulse quicken and your spine tingle as you see a mysterious fog floating low on the ground. You peer around the corner of the display only to witness a ghastly sight: a putrefied skeleton rises from his coffin as a ghost floats nearby.
As you take in the scene, suddenly a face rises from a vat of bubbling green goo… You walk on a bit and stifle a scream as you see a man struggling to escape from a crypt, being held back by skeletal occupants. Suddenly, the music swells and a 14-foot tall grim reaper stands up, waving his arms, and starts speaking… to you. You subconsciously take a step backwards, as it all seems terrifyingly real.
Are we at the latest haunted house at Disneyland? Or perhaps one of those terrifying dark rides at Universal Studios? No, we're in the Silicon Valley suburban enclave of Mountain View, California, home of Brent Ross, who works as a financial advisor for most of the year. But come Halloween, he is the architect of an astonishingly realistic, fully animated display that creates massive lines that snake around the block and delights the thousands who attend. An Industrial Design major by training, Brent brings his considerable creativity to bear on his amazing Halloween displays, designing and building these spectacular haunts for the terror and enjoyment of the community.
Brent grew up like many kids, excited about the notion of a good scare. He got his start making little scenes using a black light and a bowl of dry ice fog on his porch, but soon it evolved into a black plastic maze, then a wood-framed cavern, and finally the steel and wood monstrosity that covers the entire front yard of his parents' house.
Although he delights in the reaction of the people who come to see his display, the real motivation for him is the challenge of creating an original display that is better and better each year. It’s a ton of work, and a huge strain on his family life, but the challenge of adding new aspects to the attraction – typically 1 or 2 new props and as many as 5 or 6 changes to the display – is simply too exciting to resist.
How impressive are his displays? Amazing enough to pocket the $50,000 first prize in a nation-wide Halloween display competition, that’s how. But even more amazing is the incredible way he goes about controlling this amazingly ambitious show… entirely from a single computer… using MIDI.
Most people think of MIDI as a music technology. And rightly so, since the "M" in MIDI does stand for Musical, as in Musical Instrument Digital Interface. However, many people don't understand that there are several other "flavors" and uses of MIDI that allow you to do all sorts of things, including controlling an entire show like this one. In the case of Brent's Halloween extravaganza, MIDI was something he stumbled upon one year after trying to synchronize a number of pneumatic actuators to control the motion of one of his Halloween props. A neighbor, Dave Fredrichs, stopped by while he was setting up and said "You should use MIDI for that". His initial response was "What's MIDI?" but once he experimented a bit with it, he realized that there was simply no better way to accomplish his ghoulish goals: "Dave opened my eyes to the possibility of using MIDI, using real-time recording of musical notes and converting them into electrical signals that could then be used to turn pneumatic valves on and off. Before that day, I was hard-coding microcontrollers, something that was extremely difficult to do with multiple elements (props) running simultaneously, and then trying to sync audio to the mouth movements."
The alternatives for Brent are far scarier than MIDI: to accomplish much the same functionality, he would have had to use dedicated industrial programmable controllers, which control the pneumatic solenoids using a somewhat cryptic (no pun intended) step-by-step programming protocol, entering the durations and delays of every event manually, using unsynchronized timers to control the sequences, and then being left with the daunting task of locking all of this motion to the audio soundtrack that went along with the show.
The notion of using MIDI to control physical devices is actually quite well established in some markets. The MIDI Show Control protocol supports the automation of lighting, stage effects like fog machines, and other elements of stagecraft. In Japan, the MIDI protocol has even been adapted to control servo motors in robots. And it all makes sense, since MIDI is really just a descriptive protocol that describes a performance event.
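As a rough illustration, MIDI Show Control commands travel as Universal Real Time System Exclusive messages. The sketch below assembles a lighting "GO" command as raw bytes; the device ID and cue number are made-up values for the example, not anything from Brent's show:

```typescript
// Build a MIDI Show Control "GO" message for a lighting controller as raw SysEx bytes.
function mscGo(deviceId: number, cue: string): number[] {
  const cueBytes = Array.from(cue).map((c) => c.charCodeAt(0)); // cue numbers travel as ASCII text
  return [
    0xf0, 0x7f,      // Universal Real Time System Exclusive header
    deviceId & 0x7f, // which device on the show network should respond
    0x02,            // sub-ID: MIDI Show Control
    0x01,            // command format: Lighting (General)
    0x01,            // command: GO
    ...cueBytes,     // e.g. cue "5"
    0xf7,            // end of SysEx
  ];
}

// Example: tell hypothetical device 1 to fire lighting cue 5.
console.log(mscGo(1, "5").map((b) => b.toString(16).padStart(2, "0")).join(" "));
```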
MIDI describes these performance events by breaking an action down into separate parts: when a key is pressed on a keyboard, a MIDI message is generated that says which key was pressed (the MIDI note number in a Note On message) and how hard or fast it was pressed (the MIDI velocity); in some cases, additional messages report how hard the key is being pressed while it is held down (MIDI aftertouch). Some time later, when you release the key, a MIDI Note Off message is generated, and any clever piece of software can easily determine the interval between the time the key was pressed and the time it was released. Link a bunch of events together and you have a song… or a scare!
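In raw bytes, that key press boils down to a three-byte message. Here is a minimal sketch; the channel, note number and velocity are arbitrary example values:

```typescript
// Build the three bytes of a channel voice Note On / Note Off message.
function noteOn(channel: number, note: number, velocity: number): number[] {
  return [0x90 | (channel & 0x0f), note & 0x7f, velocity & 0x7f]; // status, key, how hard
}

function noteOff(channel: number, note: number): number[] {
  return [0x80 | (channel & 0x0f), note & 0x7f, 0]; // status, key, release velocity
}

// Pressing middle C (note 60) on channel 1, then releasing it:
console.log(noteOn(0, 60, 100)); // [ 144, 60, 100 ]
console.log(noteOff(0, 60));     // [ 128, 60, 0 ]
```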
This “performance description language” is what makes MIDI so incredibly popular amongst musicians – you’ve recorded all the “mechanical” aspects of the performance, allowing you to go back later and manipulate those aspects of the performance, fixing notes, durations, velocity, or modifying the performance by adding real-time controllers, much like going through a rough draft document and making edits with your word processor.
In the context of Halloween, MIDI is the perfect tool for controlling a performance of a different kind, letting you individually control pneumatic solenoids using MIDI messages and edit the "performance" of those movements until you have the most natural and realistic motion possible. In Brent's case, he uses MIDI Note On messages to close a pneumatic solenoid valve, which then moves a cylinder by filling it with air from a compressor. When a Note Off message is received, the solenoid opens and the air is exhausted, causing the cylinder to retract. By the same token, MIDI messages can be used to turn smoke machines or strobe lights on or off… the possibilities really are endless, and entirely practical.
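To make the idea concrete, here is a hypothetical sketch of that note-to-solenoid mapping in software. The setRelay function and the note assignments are stand-ins invented for this example; Brent's actual SD boards do the equivalent mapping in their own hardware:

```typescript
// Stand-in for whatever driver a MIDI-to-switch relay board would expose.
function setRelay(output: number, energized: boolean): void {
  console.log(`relay ${output} -> ${energized ? "on" : "off"}`);
}

// Each MIDI note number is assigned to one prop movement (assignments are hypothetical).
const noteToRelay: Record<number, number> = {
  60: 1, // e.g. coffin lid cylinder
  61: 2, // e.g. grim reaper arm
  62: 3, // e.g. fog machine trigger
};

function handleMidiMessage([status, note, velocity]: number[]): void {
  const relay = noteToRelay[note];
  if (relay === undefined) return;

  const isNoteOn = (status & 0xf0) === 0x90 && velocity > 0;
  const isNoteOff = (status & 0xf0) === 0x80 || ((status & 0xf0) === 0x90 && velocity === 0);

  if (isNoteOn) setRelay(relay, true);        // valve actuated, cylinder moves
  else if (isNoteOff) setRelay(relay, false); // valve released, cylinder retracts
}

handleMidiMessage([0x90, 60, 100]); // Note On  -> relay 1 energizes
handleMidiMessage([0x80, 60, 0]);   // Note Off -> relay 1 releases
```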
There is another aspect that makes MIDI attractive beyond the paradigm of music performance: MIDI technology and tools are well understood, easy to use, and remarkably affordable. In fact, it was the affordability of the solution that intrigued Brent once he understood what it could do for him: "The inspiration to convert to a MIDI-based control system came when I had a large Grim Reaper prop sitting in the garage for 2 years. I was saving up to buy a 50-output sequential microcontroller that costs over $2000 – and it didn't have real-time programming! Dave was inspired by the mechanical aspects of the prop and asked if he could "tinker" with it to get it running. He came back a week later with a cart loaded with a PC, a MIDI keyboard, and a MIDI-to-parallel converter board mounted to a piece of plywood. Within an hour or so of wiring and playing with note assignments, the Reaper came to life and I was sold."
The benefit of MIDI for Brent is that it enables him to use affordable, off-the-shelf tools to orchestrate all this magic in his award-winning shows. For example, the "central command" of his show is a garden-variety PC running Steinberg's Cubase MIDI sequencing software.
Cubase, or any MIDI sequencer for that matter, is designed to record, play, and edit a MIDI performance with extensive editing capabilities. Modern sequencers, often called workstations, also include recorded audio, turning the software into a comprehensive system for all aspects of a musical performance. Since MIDI messages can also be used to control the movements of characters instead of describing a musical performance, Brent is able to use the audio side of Cubase to manage the sound effects and soundtrack of the display, while using the MIDI side to control all the movements.
To control the hundreds of pneumatic cylinders throughout his display that make his creations move, Brent uses readily available "MIDI to Switch" technology that lets you send MIDI messages to a relay board, which closes switches in response to particular Note On or Note Off events. Although Brent uses control boards from a company called SD, other solutions are available from companies like Doepfer in Germany and MIDI Solutions in Canada.
"I recommend the SD MIDI converter boards to the point where I now distribute their product for them on www.dcprops.com. I find the SD products are the most cost-efficient and expandable of the solutions out there, and their tech support for these types of applications is far superior to others. Add to that the fact that they are designed and manufactured here in California, so shipping is a lot faster than the overseas alternatives. SD has also come up with a dimmer pack that expands the output so you can use it for dimming 110V lighting!"
Besides enabling him to use off-the-shelf technology to control his shows, the simplicity of MIDI technology means his entire show is controlled from one rack of gear, allowing him to easily move his "control central" from place to place and store it in the off-season.
Brent's rig consists of an enclosed, wheeled server rack with a series of 1U rack-mount audio amplifiers (the blue boxes in the picture), multiple surge suppressors, and a rack-mount PC running Windows with two M-Audio Delta 1010 cards and one Delta 410 audio card, providing 24 mono audio outputs. The cards also provide the MIDI I/O to connect to the MIDI to Switch converters that control the pneumatics via 24V solenoids.
Inside the slide-mount drawer are the MIDI to Switch converter boards that provide a total of 256 24V DC outputs to drive the pneumatic solenoids, all powered by a 20-amp 24V DC power supply and a battery backup power source. That was particularly important in the old days, when breakers were blowing every 20 minutes due to overloading the electrical circuits with too much lighting and too many air compressors!
Even though Brent doesn't use MIDI for making music, it certainly empowers his creativity while providing a reliable, cost-effective solution that delights the community year after year: "In my opinion there are many advantages to using MIDI, starting with giving you a centralized control hub and letting you easily perform a looped show. MIDI provides cost-effective expansion capabilities, and the excellent real-time control provided by sequencers like Cubase, combined with an easy-to-use interface, makes programming the show very easy. Add to that the fact that I have control over all aspects of automation and the soundtrack, with support for multiple channels of audio, and you can see why MIDI is a great solution for me."
There are challenges, though. Comments Brent: "Since MIDI is designed to make use of a musical keyboard, using regular switches to trigger MIDI events is difficult," although he is working with SD Designs to create a "Switch to MIDI" converter to use on the show. Also, since the workstation sequencer runs on a computer, there are the standard issues that come with computers compared with single-purpose industrial controllers; no computer is ever as stable or reliable as a dedicated hardware device, but in this case the versatility of MIDI plus audio makes up for it.
MIDI may be a little scary to some people, but with a little effort you can easily learn to exploit all the power and convenience this technology has to offer. However, for some people, MIDI will always be scary, especially when it is enjoyed in the form of a remarkable Halloween display in Northern California. Cue the lightning sounds… and be aware of what lurks in the vat of bubbling ooze….