05 - Midi
The Basics
The specification also defines which cables should be used to transmit MIDI data. MIDI has as many software
implementations as it has hardware ones.
At its most basic level, MIDI was conceived as a way to link two or more
synthesizers together to layer sounds. It is now used to control studio
equipment, light shows, and even factory automation.
Brief History of MIDI
As it turned out, it was a false dawn. The synthesizers of the 1970s may
have been sonically unrestricted, but in terms of playability, stability, polyphony,
and compatibility they were still very limited indeed.
Early integrated-circuit-based synthesizers from Moog, ARP, and EMS
opened the door, but it was the arrival of Japanese companies like Korg, Roland,
and Yamaha in the mid-1970s that converted potential into popularity.
Digitally Controlled Synthesizers
Early polyphonic synthesizers were expensive for the time (around $4,000). Soon Korg, Roland, and Yamaha's microprocessor-based offerings would slash prices in half, and by the turn of the decade the
polyphonic synthesizer was firmly on the map for every self-respecting keyboard
player, from hobbyists to touring professionals. The days of the Hammond organ,
the Fender Rhodes piano, and the Hohner Clavinet were coming to an end, or
so we thought.
Stability, playability, and polyphony continued to evolve in the early
1980s but compatibility remained a thorn in the side of manufacturers.
MIDI concept is born
The major manufacturers eventually agreed on standardizing a basic hardware/software interface for data exchange among
different machines. It is no exaggeration to say that MIDI fueled an incredibly
active period of hardware synthesis development during the late 1980s and
early 1990s.
MIDI and the Personal Computer
Now that computers could speak MIDI, many new software companies started
creating sequencing programs; some of the first commercial sequencers date from this period.
One of the advantages of MIDI's modular concept is that you can now
pick and choose the system components that best suit your needs. Your
favorite keyboard can be linked to any MIDI instrument you please. To
add additional synthesizers, you don't necessarily need more keyboards:
the newly developed concept of the synthesizer module saves space
and money.
What is MIDI?
The Musical Instrument Digital Interface (MIDI) allows musicians, sound and
lighting engineers, computer enthusiasts, or anybody else for that matter, to use
multimedia computers and electronic musical instruments to create, listen to, and
learn about music, by offering a common language shared between
compatible devices and software.
MIDI can be divided into 3 categories:

1. The Protocol - the language itself: how MIDI describes a musical performance.
2. The Hardware Interface - the cables and connectors that carry the data.
3. The File Formats - how MIDI manages and manipulates the data: Standard MIDI Files (SMF), music editing, and sequencing.

1. Protocol
The Protocol of MIDI is a music description language in binary form, in
which binary words describe an event of a musical performance.
All Status bytes begin with a 1 as the MSB and are the first byte
transmitted when sending any MIDI command. They serve to identify the
kind of information being sent over MIDI: the status byte tells the receiving device which
MIDI channel the event belongs to and what the event is. For example, an
event can be a Note On, Note Off, Pitch Bend, Program Change, etc.
All Data bytes begin with a 0 as the MSB, and usually 1 or 2 data bytes
follow a status byte. They represent some value associated with the status
byte. For example, when you strike a middle C on the transmitting keyboard
with fairly heavy force, the Status byte would hold a Note On and the
Data bytes would hold a note number of 60d and a velocity of
maybe about 114d.
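As a rough illustration of that middle-C example, here is a minimal Python sketch that assembles the three raw bytes of a Note On message. The channel choice and the helper name are assumptions for the example, not part of the MIDI specification text.

```python
# Sketch: building the 3-byte Note On message described above.
# Assumes MIDI channel 1 (encoded as 0 in the low nibble of the status byte).

def note_on(channel, note, velocity):
    """Return the three raw bytes of a MIDI Note On message."""
    status = 0x90 | (channel & 0x0F)   # 1001cccc -> Note On status, MSB = 1
    return bytes([status, note & 0x7F, velocity & 0x7F])  # data bytes, MSB = 0

msg = note_on(channel=0, note=60, velocity=114)  # middle C, fairly heavy force
print(msg.hex(" "))  # -> "90 3c 72"
```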
Note On command

The most significant bit (MSB) of each byte determines whether it is a Status or a Data message.
MIDI actually uses a 10-bit word when it transmits: 8 bits carry the
information (Status or Data) and the 2 extra bits are a start bit and a stop
bit that frame each byte on the serial line.
Since the MSB is only used to designate whether a byte is Status or
Data, this leaves only 7 bits for the value of the rest of the word. Seven
bits can represent values from 0 to 127d. This is why values in MIDI are
so often numbered from 0 to 127: when you adjust the controls on many MIDI
instruments you will see values ranging from 0 to 127.
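A hedged sketch of how a receiver might use the MSB to tell the two byte types apart; the function names are illustrative, not from any MIDI library.

```python
# Sketch: classifying incoming MIDI bytes by their most significant bit.

def is_status_byte(b):
    return b & 0x80 != 0        # MSB set -> Status byte

def data_value(b):
    return b & 0x7F             # lower 7 bits -> value in the range 0-127

for b in (0x90, 0x3C, 0x72):    # the Note On example from earlier
    if is_status_byte(b):
        print(f"{b:#04x}: Status byte")
    else:
        print(f"{b:#04x}: Data byte, value {data_value(b)}")
```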
Channel Voice Messages carry the events of a musical performance. The most common Channel Voice
Messages are Note On and Note Off, Pitch Bend, Program Change,
and Aftertouch.
A Note On message carries two pieces of information (besides the
MIDI channel): the key or note number and the velocity at which the note was
played. Both of these are data bytes and their values can be between 0 and
127. Below are charts for MIDI note numbers and velocity range.
MIDI NOTE Numbers

Octave    C  C#/Db    D  D#/Eb    E    F  F#/Gb    G  G#/Ab    A  A#/Bb    B
  -1      0      1    2      3    4    5      6    7      8    9     10   11
   0     12     13   14     15   16   17     18   19     20   21     22   23
   1     24     25   26     27   28   29     30   31     32   33     34   35
   2     36     37   38     39   40   41     42   43     44   45     46   47
   3     48     49   50     51   52   53     54   55     56   57     58   59
   4     60     61   62     63   64   65     66   67     68   69     70   71
   5     72     73   74     75   76   77     78   79     80   81     82   83
   6     84     85   86     87   88   89     90   91     92   93     94   95
   7     96     97   98     99  100  101    102  103    104  105    106  107
   8    108    109  110    111  112  113    114  115    116  117    118  119
   9    120    121  122    123  124  125    126  127      -    -      -    -
Velocity and Musical Notation

MIDI velocity values are commonly mapped to the musical dynamic markings ppp, pp, p, mp, mf, f, ff, and fff, from softest to loudest.
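Because each octave in the chart above starts 12 semitones higher, a note number can be converted back to the name and octave shown in the chart with simple arithmetic. A minimal Python sketch; the function name is just for illustration.

```python
# Sketch: converting a MIDI note number (0-127) to the name/octave used
# in the chart above (middle C = note 60 = C4).

NOTE_NAMES = ["C", "C#/Db", "D", "D#/Eb", "E", "F",
              "F#/Gb", "G", "G#/Ab", "A", "A#/Bb", "B"]

def note_name(number):
    octave = number // 12 - 1          # octave -1 starts at note 0
    return f"{NOTE_NAMES[number % 12]}{octave}"

print(note_name(60))   # C4 (middle C)
print(note_name(69))   # A4
print(note_name(127))  # G9, the highest MIDI note
```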
Channel Aftertouch (D0-DF 1101cccc) Like Polyphonic aftertouch except
it takes the value of the key that is being pressed the hardest and sends that
data out. All keys that are being played will respond to the data that is being
sent out by the key with the most pressure. The result is less data being
transmitted and all keys that are being played will have the same amount of
aftertouch modulation applied to them.
Pitch Wheel (E0-EF 1110cccc) Pitch bending can make the sound more
expressive, somewhat like a wind instrument. Because the human ear is very
sensitive to pitch changes, the pitch bend message contains two data bytes to
determine the bend value (2^14). This gives a resolution of 16,384 steps, which
is usually split in two, with 8,191 steps above and 8,192 steps below the center
value, 0 being the original pitch. This smooths out the stair-stepping effect that
would result with only 128 steps. Pitch bend control is a Continuous
Controller type since it continuously sends MIDI messages to update the
receiving MIDI device on the position of its controller.
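The two data bytes each contribute 7 bits, which is where the 14-bit (2^14 = 16,384 step) resolution comes from. A small sketch of the arithmetic; the helper name is an assumption for illustration.

```python
# Sketch: combining the two 7-bit data bytes of a Pitch Wheel message
# into one 14-bit value, centered so that 0 means "no bend".

def pitch_bend_amount(lsb, msb):
    raw = (msb << 7) | lsb          # 0 .. 16383 (14 bits)
    return raw - 8192               # -8192 .. +8191, 0 = original pitch

print(pitch_bend_amount(0x00, 0x40))   # 0      (wheel at rest: 64 << 7 = 8192)
print(pitch_bend_amount(0x7F, 0x7F))   # +8191  (maximum bend up)
print(pitch_bend_amount(0x00, 0x00))   # -8192  (maximum bend down)
```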
Control Change and Channel Mode (B0-BF 1011cccc) The keys are not
the only way to control the sound. There are 128 controller numbers, specified
in the first data byte (the second data byte carries the controller's value). Devices such as a modulation wheel, sustain
pedal, volume control, expression pedal, breath controller, and many more are
used to give you more control over the expressive elements of a sound. Some
are Continuous Controller types and some are Switch Controller types.
Included under this status byte are the Channel Mode messages; an example message layout is sketched below.
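For instance, a channel volume change (controller 7) or a sustain pedal press (controller 64) is sent as a three-byte message: the Control Change status byte, the controller number, then the controller value. A minimal sketch; the helper name is assumed for the example.

```python
# Sketch: a Control Change message -- status byte 1011cccc, then the
# controller number and controller value as the two data bytes.

def control_change(channel, controller, value):
    return bytes([0xB0 | (channel & 0x0F), controller & 0x7F, value & 0x7F])

print(control_change(0, 7, 100).hex(" "))   # channel 1 volume  -> "b0 07 64"
print(control_change(0, 64, 127).hex(" "))  # sustain pedal down -> "b0 40 7f"
```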
Channel Mode Messages
The Local Control on/off message is under this status byte. It enables or disables the
connection between the keyboard and its own internal sounds. When a
keyboard is connected to a DAW's sequencer, the MIDI data is received and
transmitted back out to the sending device. This can double the MIDI data in
the keyboard, since it is receiving both from the DAW and from its own keyboard.
By turning off Local Control, the device only responds to what it receives from the DAW.
Channel Mode: This status byte also gives access to the Channel Mode
functions, which determine how a device will respond to incoming and outgoing
MIDI messages. Two settings, Omni (on/off) and Poly/Mono, combine into four modes of operation.
Omni On mode implies that a device can respond to incoming MIDI data
on any channel, regardless of its own channel setting.
Omni Off mode implies that a device can only respond to its base
MIDI channel. For instance, if you set your keyboard to channel 1, it
will only receive data that is on channel 1, and transmit data on
channel 1.
Poly mode implies that a device is capable of polyphony and will
enable polyphonic playing of any MIDI channel.
Mono mode implies that a device will not play more than one note
at a time on any given channel.
Mode 1 (Omni On/Poly): rarely used, since data on any MIDI channel
will be played back on the device's base channel.
Mode 2 (Omni On/Mono): rarely used; the same as Mode 1 except it
will only play back monophonically (1 note at a time).
Mode 3 (Omni Off/Poly): the most used in today's MIDI world. The
device will respond to data on its MIDI channel and play back with
polyphony.
Mode 4 (Omni Off/Mono): rarely used; the same as above but will
play back in mono.
System Common, System Real Time and System Exclusive Messages
System Common (F0-FF 1111nnnn) System Common messages are
intended for all MIDI channels in a system, so the last 4 bits define message
types, not MIDI channels.
The System Real Time and System Exclusive messages are included under this
status byte. Most System Common messages relate to synchronization
features and are used with sequencers, since they concern time positioning,
song selection, and features on your MIDI device. Here's a look at these
messages, identified by their last 4 bits:
F0 11110000 System Exclusive message status byte
F1 11110001 MIDI Time Code Qtr. Frame status byte
F2 11110010 Song Position Pointer status byte
F3 11110011 Song Select (song#)
F4 11110100 Undefined
F5 11110101 Undefined
F6 11110110 Tune Request
F7 11110111 End of SysEx (EOX)
F8 11111000 Timing Clock
F9 11111001 Undefined
FA 11111010 Start
FB 11111011 Continue
FC 11111100 Stop
FD 11111101 Undefined
FE 11111110 Active Sensing
FF 11111111 System Reset
System Exclusive messages address devices by manufacturer. This allows
you to send functions that are only relevant to that particular device and are
not common MIDI messages. For example, custom patches for a
particular synthesizer (brand and model) can be saved as SysEx at the
beginning of a sequence and loaded back into that same brand and model
of synth at another studio. As another example, you could send all the
parameter settings of a MIDI device into patch editing software
(an editor/librarian) in order to use the computer's GUI to change
these parameters, rather than using the device's front panel LCD.
Each manufacturer has its own SysEx ID number, assigned
by the MMA (MIDI Manufacturers Association).
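A System Exclusive message is framed by the F0 and F7 status bytes listed above, with the manufacturer's ID and the device-specific data in between. A hedged sketch of that framing; the payload bytes here are arbitrary placeholders, not a real device's dump.

```python
# Sketch: the general framing of a System Exclusive message.
# F0 <manufacturer ID> <data bytes, all with MSB = 0> F7 (EOX).

def sysex(manufacturer_id, payload):
    return (bytes([0xF0, manufacturer_id])
            + bytes(b & 0x7F for b in payload)   # data bytes must keep MSB = 0
            + bytes([0xF7]))                     # End of SysEx

msg = sysex(0x41, [0x01, 0x02, 0x03])            # 0x41 is Roland's ID; payload is made up
print(msg.hex(" "))                              # -> "f0 41 01 02 03 f7"
```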
This ends the Protocol section of MIDI and, believe it or not, this is just an overview,
barely scratching the surface of the MIDI protocol. Now we'll talk about the hardware.
2. Hardware Interface
The cables are a twisted pair with a shield for noise rejection. The shield
is only grounded on one side so as not to create a noisy ground loop
between instruments.
Only pins 4 and 5 carry data. Pin 2 is the shield and is grounded only to
the MIDI out connection of a unit. Pins 1 and 3 are not used at this time
for MIDI 1.0 spec but may be used at a later date if there is a major
revision to the MIDI spec.
At its most basic level, MIDI lets the user tie in one synthesizer
with another so that both can be played from the same keyboard. One is
the transmitter, or master, generating information that is understood by the
second synth, the receiver, or slave.
Synth A (Master) -> Synth B (Slave)

For instance, when you play Synth A's keyboard, the sound of Synth B can be
layered along with Synth A. But when you play Synth B's keyboard, you will
only hear Synth B.
Daisy-Chain Network
The MIDI Thru connector receives a copy of any digital message coming
into the MIDI In connection and sends a duplicate of this information out of
the MIDI Thru port into the MIDI In of a third MIDI device. This allows the
user to have more than two MIDI devices connected at once. The MIDI Out
port of the second or third device in the chain would not work for this purpose,
because it only sends MIDI information generated by that particular synthesizer;
the MIDI Thru port is what passes the incoming MIDI In information on to the
next device.
When MIDI devices are linked together by a series of MIDI In and MIDI Thru
connections, it is referred to as a Daisy-Chain Network.
In the next example we have added a computer and MIDI interface. The
first order of business is to connect the master keyboard to the computer so
they can communicate with each other.
Next, connect the three tone generators (synthesizers without keyboards).
The MIDI Out on the MIDI interface may also act as a MIDI Thru that relays
a copy of the MIDI In information. This allows the keyboard to
communicate with the computer and with the three tone generators, using
the daisy-chain network concept, connected onward from the MIDI Thru port of the
keyboard.
One of MIDI's limitations is that daisy-chaining becomes impractical with more
than four instruments. MIDI transmission is fairly fast at 31.25 kbaud (31,250 bits
per second), but because it transmits serially (1 bit at a time)
rather than in parallel (1 byte at a time), it can get bogged down. After 4
connections, a perceptible time delay can occur. To remedy this, a star
network is used.
A delay of 7 ms is roughly the time sound takes to travel the maximum separation between members of a string
quartet. In practice, latency of 10 ms is generally imperceptible, as long as the
variation in that latency (due to bottlenecks in the MIDI data stream) is kept small.
With a data transmission rate of 31.25 kbaud, and 10 bits transmitted per byte
of MIDI data, a 3-byte Note On or Note Off message takes about 1 ms
(960 µs) to send. Since MIDI data is transmitted serially, a pair of
musical events that originally occurred at the same time must be sent one at
a time in the MIDI data stream and cannot be reproduced at exactly the same
time. Luckily, human performers almost never play two notes at exactly the same
time; notes are generally spaced at least slightly apart. This allows MIDI to
reproduce a solo musical part with quite reasonable rhythmic accuracy.
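The arithmetic behind these figures is straightforward. A quick sketch of the calculation, assuming 10 bits on the wire per MIDI byte as described above:

```python
# Sketch: how long a burst of MIDI messages takes to transmit at 31,250 bit/s,
# with 10 bits per MIDI byte (8 data bits plus start and stop bits).

BITS_PER_BYTE = 10
BAUD = 31_250

def transmit_ms(num_bytes):
    return num_bytes * BITS_PER_BYTE / BAUD * 1000

print(transmit_ms(3))        # one 3-byte Note On: 0.96 ms
print(transmit_ms(3 * 10))   # a quantized 10-note chord: 9.6 ms -- the last
                             # note sounds almost 10 ms after the first
```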
However, MIDI data being sent from a sequencer can include a number of
different parts. On a given beat, there may be a large number of musical events
that should occur virtually simultaneously - especially if the events have been
quantized. In this situation, many events will have to wait their turn to be
transmitted over MIDI. Worse, different events will be delayed by different
amounts of time (depending on how many events are queued up ahead of a given
event). This can produce a kind of progressive rhythmic smearing that may be
quite noticeable. A technique called running status is provided to help reduce
this rhythmic smearing effect by reducing the amount of data actually
transmitted in the MIDI data stream.
Running status is based on the fact that it is very common for a string of
consecutive messages to be of the same message type. For instance, when a chord
is played on a keyboard, ten successive Note On messages may be generated,
followed by ten Note Off messages. When running status is used, a status byte is
sent for a message only when the message is not of the same type as the last
message sent on the same Channel. The status byte for subsequent messages of
the same type may be omitted (only the data bytes are sent for these subsequent
messages).
The effectiveness of running status can be enhanced by sending Note On
messages with a velocity of zero in place of Note Off messages. In this case, long
strings of Note On messages will often occur. Changes in some of the MIDI
controllers or movement of the pitch bend wheel on a musical instrument can
produce a staggering number of MIDI Channel voice messages, and running
status can also help a great deal in these instances.
Most modern MIDI hardware (e.g. synths) and software therefore use running status:
once the expected number of data bytes has been sent or received,
if the next byte is not a status byte, the last status byte received is assumed to
still apply and is used to decipher the following data bytes. Typically this results in about a one-third
reduction in the number of bytes sent.
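To make the savings concrete, here is a hedged sketch that encodes a ten-note chord of Note On messages both with and without running status; the encoder is illustrative, not a reference implementation.

```python
# Sketch: encoding a stream of Note On messages with and without running status.

def encode(events, running_status=False):
    out, last_status = bytearray(), None
    for channel, note, velocity in events:
        status = 0x90 | (channel & 0x0F)            # Note On status byte
        if not running_status or status != last_status:
            out.append(status)                      # send status only when it changes
        out += bytes([note & 0x7F, velocity & 0x7F])
        last_status = status
    return bytes(out)

chord = [(0, 60 + i, 100) for i in range(10)]       # ten notes on channel 1
print(len(encode(chord)))                       # 30 bytes: status repeated each time
print(len(encode(chord, running_status=True)))  # 21 bytes: one status, then data only
```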
3. File Formats
Before we go further with file formats, this would be a good time to talk about an
addendum to the MIDI protocol that revolutionized the MIDI industry in the early 90s.
During this time, desktop musicians, multimedia producers, and game developers began
clamoring for some level of playback predictability during the exchange of Standard
MIDI Files (SMFs). Understandably, composers and arrangers wanted to ensure that
piano parts would be played with piano patches and drums wouldn't sound like violins.
General MIDI 1
In the mid 80s, Roland produced a sound module called the MT-32. While its sound
quality was less than stellar, the MT-32 filled a need for an inexpensive tone module that
could be MIDIed up to a computer to play back sequences for trade show presentations
and video games. Roland repackaged the MT on a PC card called the LAPC-1 and sold
quite a few. This helped to spawn the idea of GM.
In the late 80s, MIDI manufacturers saw that there was a huge, untapped market out
there at the consumer level. Computer games were becoming very popular and there was
a need to standardize music and sound effects. Business multi-media presentations and
amateur musicians using MIDI software to compose music all needed a way to
standardize MIDI music so that it would sound practically the same on any instrument
that it was played back on. This led to the idea of General MIDI.
The first GM module was the Roland SC-55 Sound Canvas. It did very well, so other
manufacturers started producing GM-compatible equipment. GM is now the standard for
computer sound cards, and most high-end synths have at least a bank of GM sounds in
their factory presets.
General MIDI or GM is a specification for synthesizers that imposes several
requirements beyond the MIDI standard. While MIDI itself provides a protocol which
ensures that different instruments can interoperate at a fundamental level (e.g. that
pressing keys on a MIDI keyboard will cause an attached MIDI sound module to play
musical notes), General MIDI (or GM) goes further in two ways: it requires that all GMcompatible instruments meet a certain minimal set of features, such as being able to play
at least 24 notes simultaneously (polyphony), and it attaches certain interpretations to
many parameters and control messages which were left unspecified in MIDI, such as
defining instrument sounds for each of 128 program numbers.
General MIDI was first standardized in 1991 by the MIDI Manufacturers Association
(MMA) and the Japan MIDI Standards Committee (JMSC), and has since been adopted
as an addendum to the main MIDI standard.
To be GM1 compatible, a GM1 sound-generating device (keyboard, sound module,
sound card, software program or other product) must meet the General MIDI System
Level 1 performance requirements outlined below, instantaneously upon demand, and
without additional modification or adjustment/configuration by the user.
Voices: A minimum of either 24 fully dynamically allocated voices are available
simultaneously for both melodic and percussive sounds, or 16 dynamically
allocated voices are available for melody plus 8 for percussion. All voices
respond to velocity.
Channels: All 16 MIDI Channels are supported. Each Channel can play a
variable number of voices (polyphony). Each Channel can play a different
instrument (sound/patch/timbre). Key-based percussion is always on MIDI
Channel 10.
Instruments: A minimum of 16 simultaneous and different timbres playing
various instruments. A minimum of 128 preset instruments (MIDI program
numbers) conforming to the GM1 Instrument Patch Map and 47 percussion
sounds that conform to the GM1 Percussion Key Map.
Channel Messages: Support for continuous controllers 1, 7, 10, 11, 64, 121 and
123; RPN #s 0, 1, 2; Channel Pressure, Pitch Bend.
Other Messages: Respond to the data entry controller and the RPNs (Registered
Parameter Numbers) for fine and coarse tuning and pitch bend range, as well as all
General MIDI Level 1 System Messages.
General MIDI is limited by the quality of the sound source it is played back
on. Cheaper sound cards may conform to the GM standard, but their sound quality can
be drastically inferior to a higher-end product.
The advent of GM spawned a whole new market of music programmers who
record MIDI sequences of popular music in GM format and sell them to
amateur, semi-pro, and pro musicians for use in live performance.
General MIDI 2
General MIDI 2 was adopted in 1999 and added some significant improvements
to the GM1 standard.
32-note polyphony.
MIDI channels 10 and 11 can simultaneously play percussion sounds.
256 program sounds; basically, more variations of the original 128.
GM2's introduction of Key-Based Instrument Controllers is a major step forward
in drum programming: key-based control lets you change the sound, pan, and volume of
individual drums instead of just the whole kit.
DLS Downloadable Sounds

DLS (Downloadable Sounds) allows custom instrument samples to be downloaded into a synthesizer's or sound card's
system RAM, allowing MIDI music to be freely augmented with new instrument
sounds, dialog, or special effects, thus providing a universal interactive playback
experience along with an unlimited palette of sounds. At the same time, it enables the
wavetable synthesizers in computer sound cards to deliver improved audio at no
additional cost.
DLS enables the author to completely define an instrument by combining a
recorded waveform with articulation information (attack transients and so on). An instrument
defined this way can be downloaded into any hardware device that supports the
standard and then played like any standard MIDI synthesizer. Together with MIDI, it
delivers a common playback experience (unlike GM alone), an unlimited sound palette for
both instruments and sound effects, and true audio interactivity (unlike digital audio).
XMF Extensible Music Format
XMF is a family of music-related file formats created and administered by the
MIDI Manufacturers Association in conjunction with Beatnik. XMF is based on the
idea of containing one or more existing files, such as Standard MIDI Files, DLS
instrument files, WAV or other digital audio files, etc., to create a collection of all
the resources needed to present a musical piece, an interactive web page soundtrack,
or any other piece of media using pre-produced sound elements. This file format is
actually a meta-format: a container file that points to other types of files. It loads
the GM SMF and opens an XMF-capable player (QuickTime, Windows Media Player) while
downloading any digital audio. As soon as it has enough information to start
playback, it begins to play the content. In the case of XMF files, the content is
usually a MIDI file with its associated files: DLS and other digital audio.