Published in TRACEY:
Drawing and Technology
December 2010
Drawing and Visualisation Research
http://www.lboro.ac.uk/departments/ac/tracey/
tracey@lboro.ac.uk
Advanced Media Control Through Drawing:
Using a graphics tablet to control complex audio and video data in a live context
Authors
Dr. Steve Gibson, Senior Lecturer, Interactive Media Design, School of Design, Northumbria
University, Newcastle, UK. Media Artist, Electronic Musician.
Justin Love, Interdisciplinary MSc student in Visual Art and Computer Science, University of
Victoria, Canada. Programmer, Media Artist, VJ.
Keywords
Live audio, live video, electronic drawing, electronic music, performance, VJing, MIDI, Max MSP,
interface design, transmedia, synaesthesia, graphics tablets, multimedia.
Note
This article contains a number of video files. For any figure marked as a video, please click on
the link in the Figure description and the video will play in your browser. You will need
QuickTime to play the videos - http://www.apple.com/quicktime/download/. Adobe Reader 9
is required to link to the videos from the pdf file - http://www.adobe.com/products/reader/
Summary
This paper demonstrates the results of the authors’ Wacom tablet MIDI user interface. This
application enables users’ drawing actions on a graphics tablet to control audio and video
parameters in real-time. The programming affords five degrees (x, y, pressure, x tilt, y tilt) of
concurrent control for use in any audio or video software capable of receiving and processing
MIDI data. Drawing gesture can therefore form the basis of dynamic control simultaneously in
the auditory and visual realms. This creates a play of connections between parameters in both
mediums, and illustrates a direct correspondence between drawing action and media
transformation that is immediately apparent to viewers.
The paper considers the connection between drawing technique and media control both
generally and specifically, postulating that dynamic drawing in a live context creates a
performance mode not dissimilar to performing on a musical instrument or conducting with a
baton. The use of a dynamic and physical real-time media interface re-inserts body actions into
live media performance in a compelling manner. Performers can learn to draw/play the
graphics tablet as a musical and visual instrument, creating a new and uniquely idiomatic
form of electronic drawing. The paper also discusses how to practically program the application
and presents examples of its use as a media manipulation tool.
Introduction
Electronic drawing is generally (though not exclusively) limited to pictorial or representational
drawing with a pen and graphics tablet in order to produce still images or frames for
animations in software such as Adobe Flash. Given the authors’ background in media art,
physical computing and transmedia applications, we were more interested in the idea of
repurposing the graphics tablet as media control device. We were particularly interested in the
act of electronic drawing as a means of manipulating sound and video rather than as an output
to an actual drawing, whether on-screen or for print. (This concept was initially proposed by
Donna Leishman, expert Flash animator and digital artist/writer:
http://www.6amhoover.com/).
Other 3D tablet-like devices have been used for
manipulating sound, including the popular Kaoss Pad
(http://www.korg.com/product.aspx?&pd=278) from Korg.
These devices are generally very limited in size: the Kaoss
Mini KP has the following dimensions: 4.17 in (W) x 5.08 in (D).
This version of the Kaoss pad has a very restricted area for
media control and in the context of live performance this does
not allow for dramatic gesture.
The notion of the dramatic gesture was very important for us since we wanted to use a tablet
as the only visible performance element (a Kaoss pad is usually employed as a manipulation
device in tandem with a synthesizer). It was imperative that the audience be able to identify
performer gesture with perceptible results in the media used (live sound and video in our
case). To achieve this aim a larger area for drawing would be required. Finally, another
limitation of 3D controllers such as the KAOSS pad is that they are ONLY 3D. In order to
achieve complex control of the media elements through our drawing actions, we wanted to be
able to use more than three parameters at a time.

Figure 1 - Korg KAOSS PAD, http://www.korg.com/product.aspx?&pd=278
For the above reasons we decided to use a graphics tablet as a control device for gestural
performance, and in particular we selected the Intuos 3 from Wacom (now superseded by the
Intuos 4 - http://www.wacom.com/intuos/). The Intuos 3 is 9 inches x 12 inches in dimension
and has five degrees of control. This large surface combined with five possible control
parameters made it an ideal choice. Also the fact that one uses a pen with a graphics tablet
makes it more useful as a performance instrument, since the pen is a visible object that
audience members can identify in much the same way as they would a conductor’s baton
(though the audio-visual results we aspired to were quite different from those of an orchestra).
Background: Live music performance as a visual medium
Our interest in repurposing a device such as a graphics tablet for media control stems from the
authors’ dissatisfaction with certain modes of electronic music performance. Live music has
traditionally been experienced as a partially visual medium, with the visible actions of the
performer holding the attention of the listener: “Making music involves not only the
communication of musical sounds but is also characterized by a continuously changing and
meaningful use of facial expressions, body movements, and hand gestures. Until the late
nineteenth century, music performances were almost always experienced as audio-visually
integrated activities.” (Graham, Russo and Thompson 2005, p.177)
With the advent of electro-acoustic tape-based music in academic electronic music and laptop
performance in popular electronic music, the visual reference of the performer became unstuck from the resulting sound produced in a live performance scenario. The relatively long
history of tape-based electro-acoustic music has attempted to deal with the lack of performance
spectacle through increasingly complex diffusions of the sound in space, often involving
ever-greater numbers of speakers placed strategically throughout the room. For example, the
BEAST diffusion system at Birmingham University has a large number of discrete channels of
sound available: http://www.beast.bham.ac.uk/about/index.shtml.
When used effectively this can create a sort of architectural soundscape, in which the listener
follows the sound around the speakers and thereby intuits an image of sound as an object in
space, albeit virtually. For the most part though, it is hard to avoid the conclusion that the use of
ever-increasing numbers of speakers is a rather desperate ploy to obfuscate the fact that the
audience does not easily respond to music in the absence of the visual spectacle of the
performer. As experienced listeners of diffused electro-acoustic music, the authors have been
generally unmoved by the increasingly complex
attempts to counter the visual spectacle of the
performer with a large array of speakers. In
short, in the majority of cases, the tape-music
concert is one that generally does not satisfy the
audience need for performance complexity. The
genuinely live performance event has
possibilities for variation, expression, errors, and
communication with the listeners/viewers that
are lacking in purely tape-based performances.
Pure laptop-based popular electronic music faces a similar crisis. When the performer is
hidden behind the screen, it is simply impossible to recognize or even infer what he or she
might be doing. A long-standing joke in laptop music circles asserts that they could just as
well be playing a computer game back there. As with tape-based music, the visual aspect in
laptop performance is reduced to a virtually-inferred spectacle at best. (It should be said that
in laptop performance there is at least the presence of an actual performer, and though it may
be difficult to ascertain his or her actions, at least he or she is generally doing something
live.)

Figure 2 - Classic laptop performer - David Stout, Noisefold, http://nfold.csf.edu/Pages/Bio.htm
The introduction of physically expressive performance aspects into electronic music has
increased in the past ten to fifteen years. With improvements in computer speeds and the
growing number of gesturally expressive media control devices available, electronic music has
become increasingly engaging within a genuine performance context. The laptop is still
employed live in most cases, but at least it is being controlled by a performer with some other
device that the audience can relate to as an instrument. In this regard the live electronic
performance medium with a physically present and active performer bears some
resemblance to the drawing medium, in that deliberate and unconscious gestures in time form a
basis for the artist’s input in both mediums. In both drawing and live performance errors are
allowed (or are, at the least, quite inevitable). Meandering, testing, chance, and happy accidents
in improvisational or semi-improvisational mediums such as live electronic music or DJing, are
quite relatable to doodling, dreaming and sketching in the field of drawing.
Background: The VJ as live performer
It is worth noting that the VJ faces a similar problem to that of the laptop performer: how do his
or her actions relate to the music and visual performance and how does the audience perceive
those actions as in any way corresponding to the audio-visual results? VJing generally relies
heavily on digital signal processing (DSP), in which the VJ software detects aspects of the
incoming audio (e.g. hard attack beats coming from a kick drum) and consequently applies
effects, cross-fades, or clip edits based on that information. This normally creates an obvious
connection between the audio performance and the resulting visual world, at least on the
rhythmic level. The video in Figure 3 below illustrates how these DSP tools allow for a
perceptible rhythmic connection to be automatically created between the sound and visual
worlds in a live DJ/VJ performance.
While these DSP tools provide a solid automatic connection between the audio and video in a VJ
performance, there remains the presence of the VJ performer: as with laptop music
performance, in the absence of any noticeable interface beside the laptop, the performer’s
actions are somewhat mystifying to the audience.
For the above reasons we decided that the graphics tablet served as a potent tool for uniting the
DJ (or electronic musician) and the VJ under a single interface. In addition by using the large
surface of the tablet as an interaction device we created a dynamic performance instrument that
the audience could relate to in a very physical manner.
Figure 3 - Video link: http://www.telebody.ws/TRACEY/EPI_Shanghai_02_edit.mov. Exploding, Plastic
and Inevitable Redux. Digital Art Weeks, Shelter Club Shanghai, 2008. Steve Gibson - VJ. Stefan Müller
Arisona - DJ. Video shot by Ika Arisona, edited by Steve Gibson.
Background: the graphics tablet as a performance instrument
As we began playing with the graphics tablet as a media control device it became quite clear to
us that the tablet had a kind of idiomatic playing technique that worked very well in
performance. The size of the tablet, combined with the pen as a fixed reference point, allowed us
to explore it as a genuine media instrument. The audience was often transfixed by the user
actions during live performance, confirming for us (at least anecdotally) the usefulness of the
graphics tablet as a performance tool and electronic drawing as a performance medium.
Given that both of the authors are at best naïve drawers, we developed our own technique that
is quite distinct from traditional or electronic drawing. At one point, we did consider having the
drawing actions represented as a line in real-time, so that on one projection screen the live
drawing would be projected and on a second screen the corresponding video results would be
shown. This proved cumbersome to implement and ultimately we decided the aesthetic value of
our drawings was insignificant compared to the video effects created by drawing. It is worth
noting that we have shared our software with other artists who had a more developed
background in drawing and they represented the live drawing aspect more thoroughly. In short
they allowed the audience to draw using the tablet, and then by pressing a button on the tablet
pen the user could switch to a mode in which they could apply effects to their drawing in real
time. Simply put, the approach to drawing as a tool of media manipulation clearly has multiple
and varied applications depending on the interests and abilities of the artists.
Our general approach to the tablet as a performance instrument has its roots in both our
interest in alternative interfaces, and our lack of satisfaction with laptop performance. As
described above, there is an on-going academic, theoretical and public debate about work
consumed in a live situation in which the live element is impossible to perceive by the audience
(i.e. laptop performance) or there is literally no live element (i.e. tape-based electro-acoustic
music):
The use of computers in live performance has resulted in a situation in
which cause-and-effect has effectively disappeared, for the first time
since music began. Once we started to use computers in live
performance – to interpret abstract gestures and generate sound as a
result – the age-old relationship between gesture and result became so
blurred as to be often imperceptible. (Schloss 2002)
While this may be acceptable in certain situations (e.g. DJing background music in a bar), for
most live performance situations the audience responds most favourably to the correspondence
between performer action and the result that they see or hear. The tablet provides a fixed point
of reference for the performer and audience in the pen. The use of the drawing pen allows for
both subtle and dramatic gesture. It also connects to the performer’s hand, giving the audience a
familiar referent that they will have most likely experienced in various types of live
performance situations, whether it is the hands of the keyboardist playing the piano or the hand
of the conductor waving a baton.
Initial research
Right at the outset we decided to use the graphics tablet as a MIDI device. In essence a MIDI
device allows one to communicate data to audio and some types of video software as well as
MIDI-equipped music instruments, and the data tells the receiver to play sounds or clips and to
apply effects (for more information on the MIDI specification please see
http://www.midi.org/techspecs/index.php). Our initial research into programming options for
using MIDI with a Wacom tablet uncovered some pre-made applications, but in general we were
dissatisfied with both the usability and the look and feel of these solutions. For example, one
such application that we considered was the Tablet2MIDI
(http://www.livelab.dk/tablet2midi.php) interface developed by Livelab in Denmark. This
program allows complex mapping of MIDI data to multiple parts of the graphics tablet. The
essential problem with this model is that it uses a complex menu-driven user-interface that is
not intuitive. In addition the application is Windows-only, and given the fact that the authors
and much of the digital media community are Mac users, we determined that this off-the-shelf
solution was not ultimately viable. If we wanted a solution that would meet our needs and
the needs of a more general digital media community, it would be necessary to develop our own
alternative application.
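Under the hood, a MIDI control change is just a three-byte message. The following is a minimal Python sketch of the kind of messages such an interface emits; the channel and controller numbers are illustrative assumptions, not the authors' actual mappings, and this is not the application's own code:

```python
def control_change(channel, controller, value):
    """Build a 3-byte MIDI Control Change message.

    channel:    0-15  (shown to users as channels 1-16)
    controller: 0-127 (the CC number a tablet parameter is mapped to)
    value:      0-127 (the current tablet reading, scaled)
    """
    if not (0 <= channel <= 15 and 0 <= controller <= 127 and 0 <= value <= 127):
        raise ValueError("out-of-range MIDI data")
    status = 0xB0 | channel  # status bytes 0xB0-0xBF identify a Control Change
    return bytes([status, controller, value])

# e.g. pen pressure mapped to controller 7 (channel volume) on channel 1:
control_change(0, 7, 100)
```

Each of the tablet's five parameters simply drives a stream of such messages as the pen moves.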
For this reason we decided to look at creating our own interface using Max from Cycling 74
(http://cycling74.com/). Max (in conjunction with MSP and Jitter) is a graphical programming
(http://cycling74.com/). Max (in conjunction with MSP and Jitter) is a graphical programming
environment for music, audio and media. Programs are created in Max by connecting modular
graphical components called “external objects” into a network called a “patch”. In addition, Max
has an application programming interface (API) that allows users to create their own external
objects. The extensibility and modularity of Max has resulted in a large community of
developers that create and share their custom Max external objects. To implement our interface
we incorporated a Max external object designed by Olaf Matthes (http://www.akustische-kunst.org/maxmsp/) that outputs data produced by the Wacom Tablet. In the Max patch, the data
from the Wacom tablet is routed and mapped to a series of user-definable MIDI messages that
can then be used to produce audio and video manipulations and transformations.
Figure 4 - Wacom MIDI interface, Justin Love and Steve Gibson, 2008-09.
Interface design
It was essential that the interface that we created be entirely contained in one window, with no
need for submenus or even normal file menus. Figure 4 above illustrates our basic interface
design. Side-to-side (x), up-and-down (y), pressure (p), tilt x (tx), and tilt y (ty) can each be
mapped to any of 127 possible MIDI control change parameters on 16 different MIDI channels.
In addition, control curves can be applied to each parameter. For example, a linear control curve
produces a consistent rate of change for a given parameter, whereas an exponential control
curve causes a parameter to change slowly at first and then increase rapidly towards the end.
The MIDI output can be routed to two different MIDI ports simultaneously, thus allowing
control of MIDI data via Interapplication MIDI (to another piece of MIDI software on the same
machine) or out from the port of a MIDI interface (to another computer or to a MIDI device such
as a synthesizer). The grid at the top allows the user to save up to 96 different MIDI control
change setups in one file. Figure 5 below illustrates the function of each of the interface items.
Figure 5 - Interface with explanation, Justin Love and Steve Gibson, 2008-09.
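The difference between the control curves can be sketched in a few lines of Python; the exponent value here is an illustrative assumption, not the one used in the application:

```python
def apply_curve(normalized, curve="linear", exponent=3.0):
    """Map a normalized tablet reading (0.0-1.0) to a MIDI value (0-127).

    'linear' gives a constant rate of change across the stroke;
    'exponential' changes slowly at first and then rises quickly
    toward the end of the stroke.
    """
    if curve == "exponential":
        normalized = normalized ** exponent
    return round(normalized * 127)

# halfway across the tablet:
apply_curve(0.5)                  # linear      -> 64
apply_curve(0.5, "exponential")   # exponential -> 16
```

At the extremes both curves meet (0 maps to 0, full deflection maps to 127); the curve only changes how the values in between are distributed.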
Connecting with audio and video
To connect our application with other MIDI
software, we used the Interapplication MIDI
driver which comes standard on the Mac OS.
This allowed us to instantly map tablet actions
to an effect or parameter in the audio or video
software. Naturally only five tablet
parameters can be used to control video or
audio data at the same time, but it is possible
to route the same MIDI control data to
multiple functions in the video or audio
applications. For example, simultaneous
control of audio volume, audio pan, video
opacity and video x position can be mapped
with one tablet parameter (most logically x).
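This one-to-many routing amounts to duplicating a single tablet reading across several MIDI destinations. A small sketch of the idea (the channel and controller numbers are hypothetical, not our setup):

```python
# Each tablet parameter fans out to a list of (channel, controller) pairs.
# e.g. "x" might drive audio pan plus two video parameters at once.
ROUTING = {
    "x": [(0, 10), (1, 20), (1, 21)],
    "y": [(0, 7), (1, 22)],
}

def route(parameter, value):
    """Fan one tablet reading out to every MIDI destination mapped to it."""
    return [(channel, controller, value) for channel, controller in ROUTING[parameter]]

route("x", 64)  # -> [(0, 10, 64), (1, 20, 64), (1, 21, 64)]
```

The receiving applications each pick up the messages addressed to them, so one pen gesture transforms audio and video simultaneously.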
With the grid of 96 storable configurations,
once the user gets tired of the same
configuration a new one can be loaded
instantaneously. It is possible to cue a
configuration (similar to how a DJ would cue a
new track) and press one button to
immediately activate the new setup.
Figure 6 - Mapping tablet data to audio control in Ableton Live.

The applications we use with our tablet audio-visual interaction are Ableton Live (for audio)
and Modul8 (for video). Both of these programs are ideal for live audio-visual performance as
they have extensive MIDI support and are built for non-linear performance situations in which
the user may want to change, modify or apply effects to audio-visual materials
instantaneously. Figure 6 shows how
we mapped incoming MIDI data from the Wacom tablet to control an audio effect in Ableton
Live. Figure 7 below illustrates how the same tablet parameter is mapped to multiple effects in
Modul8.
Figure 7 - Mapping tablet data to video control in Modul8.
Using the tablet data via Interapplication MIDI allows for simultaneous control of multiple audio
and video parameters, though in Figures 6 and 7 we control one audio parameter with one
tablet parameter (tilt y) and three video parameters with the same data. Naturally controls from
the remaining four tablet parameters can be assigned to Modul8 and Ableton Live parameters
allowing the user to create a constant dance of tablet drawings to audio-visual effects.
The tablet as a live performance device
With the possibility of seemingly endless
mappings of tablet data to corresponding
audio and video effects we were acutely
aware that some sort of logical mapping
structure would have to be established in
order to allow the interface to be used
effectively in a performance situation. This
harkens back to the concern that we had at
the outset: we wanted to be sure that tablet
actions had observable results so that
audience members would be able to clearly
identify the relationship between the live
drawing and the performed audio-visual
elements.
Figure 8 – Steve Gibson Virtual DJ. Live at Incubation
Conference, July 2005, Stealth Attack, Nottingham.
Photo by Jonathan Griffiths.
In the past Steve Gibson has worked on
several projects which explore 3D
interfaces and the connections between
audio and video mappings and 3D control. The most important of these projects was Virtual DJ
(http://www.telebody.ws/VirtualDJ/), in which the user is given control over an audio
environment and lighting effects by moving in 3D space. In short this project used standard
control configurations between user actions and results in the audio-visual system. For
example, raising the hands would generally play an upwards melody and with each new note a
light would change colour. This allowed the user to identify that their motions were having
observable and repeatable effects in both the audio and visual realms. In essence this simulated
the effects of synaesthesia, a condition in which persons can often see colours in response to
particular sounds, tones or musical notes: “Synesthesia is an involuntary joining in which the
real information of one sense is accompanied by a perception in another sense.” (Cytowic 1989,
p.1)
In order to achieve the illusion of synaesthesia in Virtual DJ a series of relations was built up
between user movements in 3D space and simultaneous light and sound changes. While these
were not adhered to strictly, there was enough constancy in their application that users were able
to navigate the audio-visual world with very little difficulty.
For our tablet interactions we thought of the tablet interface as a miniaturised version of the
multi-dimensional spatial interface used in Virtual DJ. Thus logical interactions could be
inferred by testing user actions with system results. For instance, drawing on the y-plane on the
tablet (up-down) could logically map to audio volume or low-pass filter (which would have a
similar effect to volume, without completely removing the sound at the bottom of the tablet)
and image opacity. Therefore we built a limited series of controls to be employed by the user on
the tablet and held to these controls throughout our performance. The videos in Figures 9 and
10 below show how three of the tablet parameters (x, y and p) were mapped to audio effects.
In developing our tablet performance we proceeded in a rather intuitive manner when it came
to mapping tablet data to audio and video effects. It soon became clear that certain effects
worked more effectively than others. In addition it became clear that the needs of audio balance
and visual clarity had to be considered in equal measure to the fluidity of the tablet interaction.
Video compositing had to be carefully considered in order to achieve clarity in both the
interaction and the video domain. Too many similar-looking video files would produce an
unseemly mass of incongruous visual objects that the performer and viewers could not
distinguish between. Therefore for the most part we used similar types of video and audio files
mapped to the same tablet parameter. The x plane for example was linked to drum-based audio
and text-based video almost exclusively for the final 45-minute performance. The y plane was
linked to bass sounds and much more abstract video. This had the effect of making the
interaction always logical, if not entirely predictable (given that both the drum and bass sounds,
and the text-based and abstract videos changed throughout).
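The mapping scheme just described can be summarised as a small lookup table. This is a sketch of the pairings as described above, not the actual configuration file:

```python
# The limited, constant mapping held through the final 45-minute
# performance: each tablet axis pairs one family of sounds with one
# family of visuals, so the interaction stays legible to the audience.
MAPPINGS = {
    "x": ["drum-based audio", "text-based video"],
    "y": ["bass sounds", "abstract video"],
}
```

Because the pairings stay fixed while the individual clips change, the gesture-to-result relationship remains logical without becoming entirely predictable.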
Figure 9 – Video link: http://www.telebody.ws/TRACEY/Tablet_demo1.mov. Video, Steve Gibson, 2010.
Figure 10 – Video link. http://www.telebody.ws/TRACEY/Tablet_demo2.mov. Video, Steve Gibson, 2010.
Conclusions
In summation we have found that the Wacom graphics tablet is a powerful device for
controlling live audio and video in a performance situation. The act of live drawing, though here
removed from its traditional reference to a produced drawing (either on-screen or in print), is
one that allows for dramatic gesture in a way that pressing keys on a computer keyboard or
moving a mouse could never hope to achieve. In addition the fact that the graphics tablet can
unite five degrees of control over live audio and video makes it an ideal tool to consolidate the
roles of the DJ and the VJ under one control interface. Finally we present a live example of the
tablet used in a performance situation with one of the authors controlling live audio and video
with his drawing actions (Figure 11 below). Please see Appendix 1 below for information on
how to download and use the Wacom MIDI software.
Figure 11 – Video Link: http://www.telebody.ws/TRACEY/Tablet_Demo_Split.mov. Wacom Tablet MIDI
Demo. Live performance by Steve Gibson, January 2010. Video by Steve Gibson, 2010.
Bibliography
Cytowic, R., 1989. Synesthesia: A Union of the Senses. New York: Springer-Verlag.
Faulkner, M. and D-Fuse, eds., 2006. Audio-Visual Art + VJ Culture. London: Lawrence King.
Graham, P., Russo, F.A. and Thompson, W.F., 2005. Seeing music performance: Visual influences
on perception and experience. Semiotica, 156(1/4).
Schloss, W.A., 2002. Using Contemporary Technology in Live Performance:
The Dilemma of the Performer. Journal of New Music Research, Vol. 31, No. 1.
Web Resources and Examples
Ableton Live website: http://www.ableton.com/
Adobe Reader website - http://www.adobe.com/products/reader/
Birmingham ElectroAcoustic Sound Theatre website: http://www.beast.bham.ac.uk
Cycling 74 Max MSP website: http://cycling74.com/
Garagecube Modul8 website: http://www.garagecube.com/modul8/index.php
Steve Gibson’s Virtual DJ website: http://www.telebody.ws/VirtualDJ/
Korg Kaoss Pad website: http://www.korg.com/product.aspx?&pd=278
Donna Leishman’s 6amhoover website: http://www.6amhoover.com/
Live Lab Tablet 2 MIDI website: http://www.livelab.dk/tablet2midi.php
MIDI Manufacturers Association website - http://www.midi.org/techspecs/index.php
Noisefold website - http://nfold.csf.edu/Pages/Noisefold.htm
Olaf Matthes’ external objects for Max/MSP - http://www.akustische-kunst.org/maxmsp/
QuickTime website – http://www.apple.com/quicktime/download/
Rogue Science Wacom MIDI software: http://www.roguescience.org/wacom2MIDI.zip
Wacom Intuos website: http://www.wacom.com/intuos/
Appendix 1: Downloading and using the Wacom MIDI software
The Wacom MIDI software can be downloaded at:
http://www.roguescience.org/wacom2MIDI.zip.
At present the software is Macintosh native and works only with Wacom tablets. It will function
with many different models of the Wacom tablet. To use the Wacom MIDI software you will
need a basic knowledge of MIDI and access to MIDI software. Trial versions of both Ableton Live
(http://www.ableton.com/downloads) and Modul8 (http://www.garagecube.com/modul8) can
be downloaded free of charge. You will also need to make sure the IAC driver is activated in
Applications/Utilities/Audio MIDI Setup as follows.
Figure 12 - Using the IAC MIDI driver in Audio MIDI Setup
If you are using Ableton Live you will also need to ensure that Live is receiving MIDI data from
the IAC driver. Under the Live menu/Preferences please set your MIDI to the following:
Figure 13 - Activating the IAC MIDI driver in Ableton Live
The first video (Figure 14) below demonstrates how to create a setup for controlling MIDI
software with the tablet. The second video (Figure 15) shows how to load your saved setup and
apply curves to the different tablet parameters.
If you use the software in a project please credit the authors, Steve Gibson and Justin Love.
Figure 14 – Video link: http://www.telebody.ws/TRACEY/Wacom_MIDI_Setup_Demo.mov. How to setup
the Wacom MIDI software to use with your Wacom tablet. Video by Steve Gibson, 2010.
Figure 15 – Video link: http://www.telebody.ws/TRACEY/Wacom_MIDI_Load_Demo.mov. How to load
your saved setup. Video by Steve Gibson, 2010.