Musical Instrument Digital Interface
“MIDI” redirects here. For other uses, see MIDI (disambiguation).
MIDI (Musical Instrument Digital Interface) is an industry-standard protocol that enables electronic musical instruments, computers and other equipment to communicate, control and synchronize with each other.
Note names and MIDI note numbers.
MIDI does not transmit an audio signal or media; it transmits digital "event messages" such as the pitch and intensity of musical notes to play, control signals for parameters such as volume, vibrato and panning, cues, and clock signals to set the tempo. As an electronic protocol, it is notable for its success, both in its widespread adoption throughout the industry and in remaining essentially unchanged in the face of technological developments since its introduction in 1983.
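For instance, the note numbers carried in these event messages identify pitches in semitone steps, with middle C assigned number 60 and A4 (440 Hz) assigned 69. A minimal Python sketch of this mapping (the octave convention used here, with middle C as C4, is one common choice; some manufacturers label it C3):

```python
NOTE_NAMES = ["C", "C#", "D", "D#", "E", "F", "F#", "G", "G#", "A", "A#", "B"]

def note_name(number):
    """Convert a MIDI note number (0-127) to a name such as 'A4'.

    Uses the common convention that middle C (note 60) is C4.
    """
    octave = number // 12 - 1
    return f"{NOTE_NAMES[number % 12]}{octave}"

# note_name(60) -> 'C4' (middle C); note_name(69) -> 'A4' (concert pitch, 440 Hz)
```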
By the end of the 1970s, electronic musical devices were becoming increasingly common and affordable. However, devices from different manufacturers were generally not compatible with each other and could not be interconnected. Different interfacing models included:
• analog control voltages at various standards (such as 1 volt per octave, or the logarithmic “hertz per volt”)
• analog clock, trigger and "gate" signals (both positive "V-trig" and negative "S-trig" varieties, ranging from -15 V to +15 V)
• proprietary digital interfaces such as Roland Corporation’s DCB (digital control bus) and Yamaha’s “keycode” system.
In an attempt to find a way forward from this situation, audio engineer and synthesizer designer Dave Smith of Sequential Circuits, Inc. proposed the MIDI standard in 1981 in a paper to the Audio Engineering Society. The proposal received widespread enthusiasm within the industry, and the MIDI Specification 1.0 was published in August 1983. Today, Dave Smith is generally regarded as the “Father of MIDI” and MIDI technology has been standardized and is maintained by the MIDI Manufacturers Association (MMA).
All official MIDI standards are jointly developed and published by the MIDI Manufacturers Association (MMA) in Los Angeles, California, USA (http://www.midi.org), and for Japan, the MIDI Committee of the Association of Musical Electronic Industry (AMEI) in Tokyo (http://www.amei.or.jp). The primary reference for MIDI is The Complete MIDI 1.0 Detailed Specification, document version 96.1, available only from MMA in English, or from AMEI in Japanese.
The MIDI Show Control (MSC) protocol (in the Real Time System Exclusive subset) is an industry standard ratified by the MIDI Manufacturers Association in 1991 which allows all types of media control devices to talk with each other and with computers to perform show control functions in live and canned entertainment applications. Just like musical MIDI (above), MSC does not transmit the actual show media — it simply transmits digital data providing information such as the type, timing and numbering of technical cues called during a multimedia or live theatre performance.
Almost all music recordings today use MIDI devices. MIDI is also used to control hardware, including recording devices and sound-effects modules, as well as live performance equipment such as stage lights and effects pedals.
MIDI allows computers, synthesizers, MIDI controllers, sound cards, samplers and drum machines to control one another, and to exchange system data.
MIDI was a major factor in bringing an end to the “wall of synthesizers” phenomenon in 1970s-80s rock music concerts, when keyboard instrument performers were sometimes hidden behind banks of various instruments. Following the advent of MIDI, many synthesizers were released in rack-mount versions, enabling performers to control multiple instruments from a single keyboard.
Another important result of MIDI has been the development of hardware and computer-based sequencers, which can be used to record, edit and play back performances. In the years immediately after the 1983 ratification of the MIDI specification, MIDI interfaces were released for the Apple Macintosh, Commodore 64, and the PC-DOS platform, allowing for the development of a market for powerful, inexpensive, and now-widespread computer-based MIDI sequencers.
Synchronization of MIDI sequences is made possible by MIDI timecode, an implementation of the SMPTE time code standard using MIDI messages; MIDI timecode has become the standard for digital music synchronization.
A number of music file formats have been based on the MIDI bytestream. These formats are very compact; a file as small as 10 KB can produce a full minute of music or more, because the file stores instructions on how to recreate the sound on a MIDI synthesizer rather than an exact waveform to be reproduced. A MIDI synthesizer may be built into an operating system, a sound card, or an embedded device (e.g. a hardware-based synthesizer), or it may be implemented in software. The file format stores information on what note to play and when, as well as other important information such as pitch-bend during the envelope of the note and the note's velocity.
This is advantageous for applications such as mobile phone ringtones and some video games, but it can be a disadvantage for other applications: the format cannot guarantee that an accurate waveform will be heard by the intended listener, because each MIDI synthesizer has its own methods for producing sound from the MIDI instructions provided. For example, any MIDI file played back through the Microsoft MIDI synthesizer (included in every version of Windows) should sound the same or similar, but when the same MIDI bytestream is sent to a synthesizer on a generic sound card, or to a MIDI synthesizer on another operating system, the rendered result may vary, because one sound card's synthesizer will not reproduce the exact sounds of another.
One clear example of this is how MIDI-based mobile phone ringtones sound different on a handset than when previewed on a PC. In the same way, most modern software synthesizers can handle MIDI files but may render them completely differently from one another, especially since most modern software synthesizers, such as VST instruments, allow loading and modifying patches to create different sounds for each MIDI input.
The term “MIDI sound” has often been used as a synonym for “bad sounding computer music”, but this reflects a misunderstanding: MIDI does not define the sound, only the control protocol. This is probably a result of the poor quality sound synthesis provided by many early sound cards, which relied on FM synthesis instead of wavetables to produce audio.
MIDI connector diagram
All MIDI In and MIDI Out connectors are part of a MIDI interface. A MIDI interface converts internal binary data into MIDI messages for transmission from the MIDI Out connector to another device's MIDI In connector, and converts incoming MIDI messages arriving on the MIDI In connector (from another device's MIDI Out connector) back into internal binary data. Many MIDI-compatible instruments also have a MIDI Thru connector, which can be used to connect a second instrument by passing along the MIDI data received at the first instrument's MIDI In connector. Such daisy-chaining of instruments via MIDI Thru ports becomes unnecessary when using MIDI "patch bay," "mult," or "Thru" modules or boxes, which consist of one MIDI In connector and multiple MIDI Out connectors to which multiple instruments are connected. Physically, MIDI connectors are 5-pin DIN (180°) connectors.
All MIDI-compatible instruments have a built-in MIDI interface. Some computer sound cards have a built-in MIDI interface, whereas others require an external MIDI interface connected to the computer via the game port, the newer DA-15 connector, USB, or FireWire.
MIDI message interoperability
All MIDI compatible controllers, musical instruments, and MIDI-compatible software follow the same MIDI 1.0 specification, and thus interpret any given MIDI message the same way, and so can communicate with and understand each other. For example, if a note is played on a MIDI controller, it will sound at the right pitch on any MIDI instrument whose MIDI In connector is connected to the controller’s MIDI Out connector.
How MIDI channel messages work
When a musical performance is played on a MIDI instrument (or controller) it transmits MIDI channel messages from its MIDI Out connector. A typical MIDI channel message sequence corresponding to a key being struck and released on a keyboard is:
1. The user presses the middle C key with a specific velocity (which is usually translated into the volume of the note but can also be used by the synthesizer to set characteristics of the timbre). —> The instrument sends one Note-On message.
2. The user changes the pressure applied on the key while holding it down – a technique called Aftertouch (can be repeated, optional). —> The instrument sends one or more Aftertouch messages.
3. The user releases the middle C key, again with the possibility of velocity of release controlling some parameters. —> The instrument sends one Note-Off message.
Note-On, Aftertouch, and Note-Off are all channel messages. For the Note-On and Note-Off messages, the MIDI specification defines a number (from 0–127) for every possible note pitch (C, C#, D etc.), and this number is included in the message.
Other performance parameters can be transmitted with channel messages, too. For example, if the user turns the pitch wheel on the instrument, that gesture is transmitted over MIDI using a series of Pitch Bend messages (also a channel message). The musical instrument generates the messages autonomously; all the musician has to do is play the notes (or make some other gesture that produces MIDI messages). This consistent, automated abstraction of the musical gesture could be considered the core of the MIDI standard.
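The three-step sequence above can be sketched as raw message bytes. In the following illustrative Python fragment the channel and velocity values are made up; each channel message consists of a status byte (the message type OR'd with the channel number) followed by 7-bit data bytes:

```python
CHANNEL = 0      # channels are numbered 0-15 on the wire, shown as 1-16 to users
MIDDLE_C = 60    # MIDI note number for middle C

def channel_msg(status, *data):
    """Assemble a channel message: status nibble OR'd with channel, then data bytes."""
    return bytes([status | CHANNEL, *(d & 0x7F for d in data)])

note_on    = channel_msg(0x90, MIDDLE_C, 100)  # 1. key struck, velocity 100
aftertouch = channel_msg(0xA0, MIDDLE_C, 80)   # 2. pressure changed while held
note_off   = channel_msg(0x80, MIDDLE_C := MIDDLE_C, 64)  # 3. key released, release velocity 64
```

The status values 0x90 (Note-On), 0xA0 (polyphonic aftertouch), and 0x80 (Note-Off) are defined by the MIDI 1.0 specification; everything else here is an illustrative choice.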
MIDI file formats
Standard MIDI File (SMF) Format
MIDI messages (along with timing information) can be collected and stored in a computer file system, in what is commonly called a MIDI file, or more formally, a Standard MIDI File (SMF). The SMF specification was developed by, and is maintained by, the MIDI Manufacturers Association (MMA). MIDI files are typically created using computer-based sequencing software (or sometimes a hardware-based MIDI instrument or workstation) that organizes MIDI messages into one or more parallel “tracks” for independent recording and editing. In most sequencers, each track is assigned to a specific MIDI channel and/or a specific General MIDI instrument patch. Although most current MIDI sequencer software uses proprietary “session file” formats rather than SMF, almost all sequencers provide export or “Save As…” support for the SMF format.
An SMF consists of one header chunk and one or more track chunks. There exist three different SMF formats; the format of a given SMF is specified in its file header. A Format 0 file contains a single track and represents a single song performance. Format 1 may contain any number of tracks, enabling preservation of the sequencer track structure, and also represents a single song performance. Format 2 may have any number of tracks, each representing a separate song performance. Sequencers do not commonly support Format 2.
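The header chunk just described (a 4-byte "MThd" tag, a 32-bit length of 6, then three big-endian 16-bit fields: format, number of tracks, and time division) can be parsed in a few lines. A sketch in Python, assuming a well-formed file:

```python
import struct

def read_smf_header(data):
    """Parse the header chunk at the start of a Standard MIDI File."""
    tag, length = struct.unpack(">4sI", data[:8])
    if tag != b"MThd" or length != 6:
        raise ValueError("not a Standard MIDI File header")
    fmt, ntracks, division = struct.unpack(">HHH", data[8:14])
    return fmt, ntracks, division

# A minimal Format 0 header: one track, 96 ticks per quarter note.
header = b"MThd" + struct.pack(">IHHH", 6, 0, 1, 96)
# read_smf_header(header) -> (0, 1, 96)
```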
Large collections of SMFs can be found on the web, most commonly with the extension .mid. These files are most frequently authored with the assumption that they will be played on General MIDI players.
MIDI Karaoke File (.KAR) Format
MIDI-Karaoke files (which use the ".kar" file extension) are an "unofficial" extension of MIDI files, used to add synchronized lyrics to standard MIDI files. SMF players play the music as they would a .mid file but do not display the lyrics unless they have specific support for .kar messages. Players with such support typically display the lyrics synchronized with the music in "follow-the-bouncing-ball" fashion, essentially turning any PC into a karaoke machine.
MIDI-Karaoke file formats are not maintained by any standardization body.
XMF File Formats
The MMA has also defined (and AMEI has approved) a new family of file formats, XMF (eXtensible Music File), some of which package SMF chunks with instrument data in DLS format (Downloadable Sounds, also an MMA/AMEI specification), to much the same effect as the MOD file format. The XMF container is a binary format (not XML-based, although the file extensions are similar). See the main article Extensible Music Format (XMF).
RIFF-RMID File Format
On Microsoft Windows, the system itself uses RIFF-based MIDI files with the .rmi extension. Note that Standard MIDI Files per se are not RIFF-compliant. A RIFF-RMID file, however, is simply a Standard MIDI File wrapped in a RIFF chunk; extracting the data part of the RIFF-RMID chunk yields a regular Standard MIDI File.
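Assuming a well-formed wrapper, unwrapping amounts to walking the little-endian RIFF chunks until the "data" sub-chunk, which holds the raw SMF bytes. A hedged Python sketch:

```python
import struct

def extract_smf(rmid_bytes):
    """Return the Standard MIDI File embedded in a RIFF-RMID file."""
    if rmid_bytes[:4] != b"RIFF" or rmid_bytes[8:12] != b"RMID":
        raise ValueError("not a RIFF-RMID file")
    pos = 12
    while pos + 8 <= len(rmid_bytes):
        tag, size = struct.unpack("<4sI", rmid_bytes[pos:pos + 8])
        if tag == b"data":
            return rmid_bytes[pos + 8:pos + 8 + size]
        pos += 8 + size + (size & 1)   # RIFF chunks are word-aligned
    raise ValueError("no 'data' chunk found")

# Wrap a minimal SMF header in a RIFF-RMID container, then unwrap it:
smf = b"MThd" + struct.pack(">IHHH", 6, 0, 1, 96)
rmid = (b"RIFF" + struct.pack("<I", 4 + 8 + len(smf))
        + b"RMID" + b"data" + struct.pack("<I", len(smf)) + smf)
```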
In Recommended Practice RP-29, the MMA defined a method for bundling one Standard MIDI File (SMF) image with one Downloadable Sounds (DLS) image. However, this method was superseded by the introduction of the Extensible Music Format (XMF), which should now be used for this purpose.
MIDI usage and applications
Main article: MIDI usage and applications
Extensions of the MIDI standard
Many extensions of the original official MIDI 1.0 spec have been standardized by MMA/AMEI. Only a few of them are described here; for more comprehensive information, see the MMA web site.
The General MIDI (GM) and General MIDI 2 (GM2) standards define a MIDI instrument's response to the receipt of a defined set of MIDI messages. As such, they allow a given conformant MIDI stream to be played on any conformant instrument. Although dependent on the basic MIDI 1.0 specification, the GM and GM2 specifications are each separate from it; it is therefore not generally safe to assume that any given MIDI message stream or MIDI file is intended to drive GM-compliant or GM2-compliant MIDI instruments. General MIDI 1 was introduced in 1991.
General MIDI 2
Later, companies in Japan's Association of Musical Electronic Industry (AMEI) developed General MIDI Level 2 (GM2), incorporating aspects of the Yamaha XG and Roland GS formats, extending the instrument palette, specifying more message responses in detail, and defining new messages for custom tuning scales and more. The GM2 specifications are maintained and published by the MMA and AMEI. General MIDI 2 was introduced in 1999.
Later still, GM2 became the basis of the instrument selection mechanism in Scalable Polyphony MIDI (SP-MIDI), a MIDI variant for mobile applications where different players may have different numbers of musical voices. SP-MIDI is a component of the 3GPP mobile phone terminal multimedia architecture, starting from release 5.
GM, GM2, and SP-MIDI are also the basis for selecting player-provided instruments in several of the MMA/AMEI XMF file formats (XMF Type 0, Type 1, and Mobile XMF), which allow extending the instrument palette with custom instruments in the Downloadable Sound (DLS) formats, addressing another major GM shortcoming.
Alternate Hardware Transports
In addition to the original 31.25 kBaud current-loop, 5-pin DIN transport, transmission of MIDI streams over USB, IEEE 1394 (also known as FireWire), and Ethernet is now common. In the long run, the IETF's RTP MIDI specification for transport of MIDI streams over Ethernet and the Internet may completely supersede the original DIN transport, since RTP MIDI is capable of providing the high-bandwidth channel that earlier alternatives to MIDI (such as ZIPI) were intended to bring. See external links below for further information.
By convention, instruments that receive MIDI use the conventional 12-pitch-per-octave equal temperament tuning system. Unfortunately, this makes many types of music inaccessible, because such music depends on a different intonation system. To address this issue in a standardized manner, the MMA ratified the MIDI Tuning Standard (MTS) in 1992. This standard allows MIDI instruments that support MTS to be tuned in any way desired, through the use of a MIDI Non-Real Time System Exclusive message.
MTS uses three bytes, which can be thought of as a three-digit number in base 128, to specify a pitch in logarithmic form: the first byte gives the integer semitone, and the remaining two bytes encode a 14-bit fraction of a semitone. The following formula gives the semitone value p used to encode a given frequency f in hertz:

p = 69 + 12 × log2(f / 440)

For a note in A440 equal temperament, this formula delivers the standard MIDI note number. Any other frequencies fill the space evenly.
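The conversion from a frequency in hertz to the three MTS data bytes (p = 69 + 12 × log2(f/440), with the integer part of p in the first byte and a 14-bit fraction of a semitone, a resolution of about 0.0061 cents, in the remaining two) can be sketched in Python:

```python
import math

def mts_bytes(freq_hz):
    """Encode a frequency in Hz as the three MIDI Tuning Standard data bytes."""
    p = 69 + 12 * math.log2(freq_hz / 440.0)   # p = 69 + 12*log2(f/440)
    semitone = int(p)                          # first byte: integer semitone
    frac14 = round((p - semitone) * 2 ** 14)   # 14-bit fraction of a semitone
    if frac14 == 2 ** 14:                      # rounding carried into next semitone
        semitone += 1
        frac14 = 0
    return semitone, (frac14 >> 7) & 0x7F, frac14 & 0x7F

# mts_bytes(440.0) -> (69, 0, 0): A440 is standard note number 69 with no offset
```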
While support for MTS is not particularly widespread in commercial hardware instruments, it is nonetheless supported by some instruments and software, for example the free software programs TiMidity and Scala (program), as well as other microtuners.
Other applications of MIDI
MIDI is also used every day as a control protocol in applications other than music, including:
• Show control
• Theatre lighting
• Special effects
• Sound design
• Recording system synchronization
• Audio processor control
• Computer networking, as demonstrated by the early first-person shooter game MIDI Maze (1987)
• Animatronics figure control
• Animation parameter control, as demonstrated by Apple Motion v2
Such non-musical applications of MIDI are possible because any device built with a standard MIDI Out connector should in theory be able to control any other device with a MIDI In port, as long as the developers of both devices have the same understanding of the semantic meaning of all the MIDI messages the sending device emits. This agreement can come about either because both follow the published MIDI specifications or, in the case of non-standard functionality, because the message meanings are agreed upon by the two manufacturers.
Beyond MIDI 1.0
Although traditional MIDI connections work well for most purposes, a number of newer message protocols and hardware transports have been proposed over the years to try to take the idea to the next level. Some of the more notable efforts include:
The Open Sound Control (OSC) protocol was developed at CNMAT. OSC has been implemented in the well-known software synthesizer Reaktor and in other projects including SuperCollider, Pure Data, Isadora, Max/MSP, Csound, VVVV, and ChucK. The Lemur Input Device, a customizable touch panel with MIDI controller-type functions, also uses OSC. OSC differs from MIDI over traditional 5-pin DIN in that it can run at broadband speeds when sent over Ethernet connections. Unfortunately, few mainstream musical applications and no standalone instruments support the protocol so far, making whole-studio interoperability problematic. OSC is not owned by any private company, but neither is it maintained by any standards organization. Since September 2007, there has been a proposal for a standardized namespace within OSC for communication between controllers, synthesizers, and hosts.
Yamaha has its mLAN protocol, which is based on the IEEE 1394 transport (also known as FireWire) and carries multiple MIDI message channels and multiple audio channels. mLAN is a proprietary protocol not maintained by any standards organization, although it is open for licensing.
Development of a major modernization of MIDI is now under discussion in the MMA. Tentatively called "High-Definition MIDI" (HD-MIDI™), this new standard would support modern high-speed transports, provide greater range and/or resolution in data values, increase the number of MIDI Channels, and support the future introduction of entirely new kinds of MIDI messages. Representatives from all sizes and types of companies are involved, from the smallest specialty show control operations to the largest musical equipment manufacturers. No technical details or projected completion dates have been announced.