30 years of MIDI: a brief history

MIDI: making a DIN for 30 years.

It's debatable whether we've hit the 30th anniversary of MIDI just yet. The BBC has published an article to mark the occasion on the basis that the Prophet-600, the first synth to feature the standard, was released in December 1982, while the all-knowing MIDI Manufacturers Association is keeping its celebratory powder dry for the 2013 Winter NAMM show, which marks 30 years since MIDI's proper public launch. Either way, what's for certain is that MIDI has had a huge impact on hi-tech music making.

In fact, the MIDI (Musical Instrument Digital Interface) protocol has become the dominant method of connecting pieces of electronic musical equipment. And when you consider the previous standard you have to say that MIDI arrived at just the right time.

The control voltage (CV) and gate trigger system used on early analogue synths was severely limited in its scope and flexibility. Analogue synths tended to have very few features that could be controlled remotely, relying as they did on physical knobs and sliders, patch cables and manual programming.

Furthermore, there was no universal standard for the way CV control should work, complicating the process of interfacing between products from different manufacturers. The majority of vintage CV-controlled synths can now be adapted with a MIDI-to-CV converter, so you can use MIDI to control them.

Dave Smith, founder of Californian synth legend Sequential Circuits and now head of Dave Smith Instruments, anticipated the demand for a more powerful universal protocol and developed the first version of the MIDI standard, which was released in 1983. With the increasing complexity of synths, and as the music industry shifted towards digital technology and computer-based studios, the MIDI setup took off and became the standard for connecting equipment.

How it works

Absolutely no sound is sent via MIDI, just digital signals known as event messages, which instruct pieces of equipment. The most basic example of this can be illustrated by considering a controller keyboard and a sound module. When you push a key on the keyboard, the controller sends an event message which corresponds to that pitch and tells the sound module to start playing the note. When you let go of the key, the controller sends a message to stop playing the note.
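
To make that concrete, here's a rough sketch (written in Python purely for illustration) of the three bytes sent when a key goes down and comes back up. The status byte values come from the MIDI 1.0 spec; the note, velocity and channel chosen here are arbitrary examples.

```python
NOTE_ON  = 0x90   # note-on status byte, channel 1
NOTE_OFF = 0x80   # note-off status byte, channel 1
MIDDLE_C = 60     # MIDI note number for middle C

# Key pressed: the controller sends three bytes to the sound module
key_down = bytes([NOTE_ON, MIDDLE_C, 100])   # 100 = velocity (how hard the key was hit)

# Key released: a matching note-off tells the module to stop the note
key_up = bytes([NOTE_OFF, MIDDLE_C, 0])

print(key_down.hex(' '), '/', key_up.hex(' '))   # 90 3c 64 / 80 3c 00
```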

Of course, the MIDI protocol allows for control over more than just when a note should be played. Essentially, a message is sent each time some variable changes, whether it be note-on/off (including, of course, exactly which note it is), velocity (determined by how hard you hit the key), aftertouch (how hard the key is held down), pitchbend, pan, modulation, volume or any other MIDI-controllable function.
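
Two more examples in the same for-illustration-only vein: a control change (the modulation wheel) and a pitchbend message. The controller numbers are the standard ones defined by the spec (1 for modulation, 7 for volume, 10 for pan); the actual values here are arbitrary.

```python
CHANNEL = 0   # channels 1-16 are encoded as 0-15 in the status byte

# Control change: status 0xB0 + channel, then controller number and value.
mod_wheel = bytes([0xB0 | CHANNEL, 1, 64])   # modulation wheel at half travel

# Pitchbend: status 0xE0 + channel, then a 14-bit value split across two
# seven-bit bytes (least significant first); 8192 is the centre, 'no bend' position.
bend = 8192
pitchbend = bytes([0xE0 | CHANNEL, bend & 0x7F, (bend >> 7) & 0x7F])
```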

The protocol supports a total of 128 notes (from C five octaves below middle C through to G ten octaves higher), 16 channels (so that 16 separate devices can be controlled per signal chain, or multiple devices assigned the same channel so they respond to the same input) and 128 programs (corresponding to patches or voice/ effect setting changes). MIDI signals also include built-in clock pulses, which define the tempo of the track and allow basic timing synchronisation between equipment.
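
If you're wondering how those 128 note numbers relate to actual pitch, the standard equal-temperament mapping is easy to sketch; the only assumption here is the usual tuning of A above middle C (note 69) to 440Hz.

```python
def note_to_freq(note: int, a4: float = 440.0) -> float:
    """Standard equal-temperament mapping from MIDI note number to frequency."""
    return a4 * 2 ** ((note - 69) / 12)

print(note_to_freq(0))     # lowest MIDI note: the C five octaves below middle C (~8.2 Hz)
print(note_to_freq(60))    # middle C (~261.6 Hz)
print(note_to_freq(127))   # highest MIDI note: G (~12,543.9 Hz)
```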

The other major piece of the jigsaw is the SysEx (System Exclusive) message, designed so that manufacturers could utilise MIDI to control features specific to their own equipment. In order to control a SysEx function, a manufacturer-specific ID code is sent. Equipment which isn't set up to recognise that particular code will ignore the rest of the message, while devices that do recognise it will continue to listen.

"The MIDI protocol allows for control over more than just when a note should be played."

SysEx messages are usually used for tasks such as loading custom patches and are typically recorded into a sequencer using a 'SysEx Dump' feature on the equipment.
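
The overall shape of a SysEx message is simple enough to sketch, even though the contents are manufacturer-specific. The payload below is made up purely for illustration, and 0x7D is the ID generally reserved for non-commercial/educational use rather than any real product.

```python
SYSEX_START = 0xF0
SYSEX_END   = 0xF7

manufacturer_id = 0x7D            # reserved non-commercial ID - not a real product's code
payload = [0x01, 0x20, 0x33]      # device-specific data; every byte stays below 0x80

message = bytes([SYSEX_START, manufacturer_id, *payload, SYSEX_END])
# A device that doesn't recognise the ID simply ignores everything up to 0xF7;
# one that does keeps listening and acts on the payload.
```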

MIDI information was originally sent over a screened twisted pair cable (two signal wires plus an earthed shield to protect them from interference) terminated with 5-pin DIN plugs. However, this format has been superseded to some extent by USB connections, as we'll discuss later. No waves or varying voltages are transmitted since MIDI data is sent digitally, meaning that the signal pins either carry a voltage or none at all, corresponding to the binary logical values 1 and 0.

These binary digits (bits) are combined into eight-bit bytes, and each MIDI message is made up of one, two or three of these bytes. The protocol runs at a fixed rate of 31,250 bits per second. Each MIDI connection sends information in one direction only, meaning two cables are needed if a device is used both to send and receive data (unless you're working over USB, that is).
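
A little back-of-envelope arithmetic shows what that rate means in practice. On a DIN connection each byte is framed with a start and a stop bit, so it occupies ten bits on the wire:

```python
BAUD = 31_250                  # bits per second on a DIN MIDI connection
BITS_PER_BYTE_ON_WIRE = 10     # 8 data bits plus start and stop bits

byte_ms = 1000 * BITS_PER_BYTE_ON_WIRE / BAUD    # 0.32 ms per byte
note_on_ms = 3 * byte_ms                         # a three-byte note-on: ~0.96 ms

print(f"{byte_ms:.2f} ms per byte, {note_on_ms:.2f} ms per note-on message")
```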

In addition to the expected IN and OUT connections, most MIDI devices also have a THRU port. This simply repeats the signal received at the IN port so it can be sent on to other devices further down the chain. Devices may be connected in series and, for the largest MIDI setups, an interface with multiple output ports may be used to control more than 16 separate chained devices.
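
If you'd rather see that routing idea in software terms, here's a hedged 'soft THRU' sketch using the third-party mido library (our own choice - it isn't part of the MIDI standard and needs installing separately). It simply echoes whatever arrives at an input port straight out of an output port, which is all a hardware THRU socket does:

```python
import mido  # third-party: pip install mido python-rtmidi

# Open the default input and output ports and pass every message straight through.
with mido.open_input() as inport, mido.open_output() as outport:
    for msg in inport:        # blocks, yielding each incoming message in turn
        outport.send(msg)     # forward it unchanged to the next device in the chain
```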

Becoming a standard

The key feature of MIDI when it was launched was its efficiency: it allowed a relatively significant amount of information to be transmitted using only a small amount of data. Given the limitations of early '80s digital data transmission methods, this was essential to ensure that the reproduction of musical timing was sufficiently accurate.

Manufacturers quickly adopted MIDI and its popularity was cemented by the arrival of MIDI-compatible computer hardware (most notably the built-in MIDI ports of the Atari ST, which was released in 1985). As weaknesses or potential extra features were identified, the MIDI Manufacturers Association updated the standard regularly following its first publication.

The most notable updates - Roland MT-32 (1987), General MIDI (1991) and GM2 (1999), Roland GS (1991) and Yamaha XG (1997-99) - added further features or standards, generally without making previous ones obsolete. It's questionable just how relevant the majority of these standards are to digital musicians and producers, since most of them relate in large part to standardising the playback of music distributed in MIDI format. Unless you intend to distribute your music as MIDI files, most of them probably won't affect you.

Right on time

The most common criticisms of the MIDI protocol relate to timing issues. Although MIDI was efficient by the standards of the early '80s, it still has some undeniable flaws. There is some degree of jitter (variation in timing) present in MIDI, resulting in discernible sloppiness in recording and playback.

Perhaps even more obvious to most of us is latency, the delay between triggering a function (such as a sound) via MIDI and the function being carried out (in this case the sound being reproduced). The more information sent via MIDI, the more latency is created. It may only be in the order of milliseconds, but it's enough to become noticeable to the listener.
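
Here's a rough illustration of why busier MIDI streams feel less tight, reusing the per-byte figure worked out earlier (the ten-note chord is just an example scenario):

```python
byte_ms = 10 / 31_250 * 1000   # ~0.32 ms per byte on a DIN connection
notes_in_chord = 10

# 'Simultaneous' note-ons still have to queue up on the serial cable,
# three bytes at a time, so the last note of the chord lags the first.
chord_spread_ms = notes_in_chord * 3 * byte_ms
print(f"last note of a {notes_in_chord}-note chord arrives ~{chord_spread_ms:.1f} ms late")
```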

Even more problematic is the fact that most of us use MIDI in a computer-based studio and each link in the MIDI and audio chain could potentially add to the latency. This could either be due to software (drivers, DAWs, soft synths) or hardware (RAM, hard drives, processors) but the end result is sloppy timing. The blame cannot be laid entirely at the door of MIDI, but the weaknesses of multiple pieces of MIDI equipment combined with all the other sources of timing error can have a significant detrimental effect on the end result.

Most new MIDI equipment is supplied not only with traditional 5-pin DIN connections but with standard Type A or B USB ports that allow direct connection to your computer. However, USB is not the solution to all your MIDI timing problems. Despite the higher data transfer rates possible over USB, latency is actually higher than over a standard DIN-based MIDI connection. Furthermore, jitter is significantly higher when using MIDI over USB, leading to unpredictable inaccuracies in timing.

Beyond MIDI

It's clear that while MIDI has been massively important to the development of music technology over the last 30 years, it does come with a few major weaknesses. One heavily researched alternative, the Zeta Instrument Processor Interface (ZIPI) protocol proposed in the mid-'90s, failed to gain support from manufacturers and never saw commercial release. However, the same development team helped to create the OpenSound Control (OSC) protocol used by the likes of Native Instruments' Reaktor and Traktor and the Max/MSP and SuperCollider development environments.

OSC is a much higher-bandwidth system that overcomes many of MIDI's timing issues. Most notably, it transmits information with built-in timing messages as quickly as possible over high-bandwidth connections, rather than relying on the real-time event messages used by MIDI devices, which simply assume that timing is correct and are acted on as soon as they're received.
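
For a flavour of how differently OSC approaches things, here's a minimal, hedged sketch using the third-party python-osc library (the library choice, host, port and address pattern are all our own assumptions - OSC leaves parameter naming entirely up to the receiving application, unlike MIDI's fixed message types):

```python
from pythonosc.udp_client import SimpleUDPClient   # pip install python-osc

client = SimpleUDPClient("127.0.0.1", 9000)         # host and port of the OSC receiver
client.send_message("/synth/filter/cutoff", 0.75)   # named parameter with a float argument
```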

One significant barrier to the development of a universal protocol for contemporary music equipment is that there is so much variation between equipment. With so many different synthesis methods, programming systems, levels of user control and forms of sound manipulation available on different pieces of gear, it's unlikely that any universal system for their control is possible.

However, as computer processing and interfacing technologies have developed so rapidly since the early '80s, perhaps the solution lies not with updating or replacing MIDI but rather with placing greater onus on manufacturers and software developers to come up with their own powerful proprietary DAW-based control systems operating via existing USB, FireWire or even over Ethernet connections or wirelessly.
