Proceedings of the International Conference on New Interfaces for Musical Expression, 30 May - 1 June 2011, Oslo, Norway

MTCF: A framework for designing and coding musical tabletop applications directly in Pure Data

Carles F. Julià, Daniel Gallardo, Sergi Jordà
Universitat Pompeu Fabra, 138 Roc Boronat, Barcelona, Spain
carles.fernandez@upf.edu, daniel.gallardo@upf.edu, sergi.jorda@upf.edu

ABSTRACT
In the past decade we have seen a growing presence of tabletop systems applied to music, lately with some products even becoming commercially available and being used by professional musicians in concerts. The development of this type of application requires several demanding technical skills, such as input processing, graphical design, real-time sound generation and interaction design, and because of this complexity these applications are usually developed by multidisciplinary groups.

In this paper we present the Musical Tabletop Coding Framework (MTCF), a framework for designing and coding musical tabletop applications using Pure Data (Pd), the graphical programming language for digital sound processing. With this framework we try to simplify the creation of this type of interface by removing the need for any programming skills other than those of Pd.

Keywords
Pure Data, tabletop, tangible, framework

1. INTRODUCTION
In the past decade we have seen a proliferation of musical tabletops. Currently, so many "tangible musical tables" are being developed that it becomes difficult to track every new proposal (Kaltenbrunner maintains a website devoted to tangible music that includes a quite exhaustive list of devices: http://modin.yuri.at/tangibles/).

Independently of the relevant differences that exist between these systems, scholars tend to agree on the benefits of interacting with large-scale tangible and multi-touch devices. Their vast screens make them excellent candidates for collaborative interaction and shared control [2][4], while favoring at the same time real-time, multidimensional and explorative interaction, which makes them especially suited for both novice and expert users [6]. The same author also states that the visual feedback possibilities of this type of interface make them ideal for understanding and monitoring complex mechanisms, such as the several simultaneous musical processes that can take place in an interactive digital system for music performance [5].

This growing tabletop popularity, clearly in the musical domain but also in other fields, has increased the publicly available information for the rapid development and prototyping of these types of interfaces. Online communities of DIY builders such as the NUIGroup (http://nuigroup.com) collect large knowledge bases of resources, and many easy-to-follow tutorials are publicly available [12]. Building this type of hardware has indeed become easier and more affordable than ever, allowing practically anyone to experiment with tabletop computing.

On the software side, several well-known open-source solutions also exist, both for the tracking of multi-touch fingers, such as the NUIGroup's Community Core Vision (http://ccv.nuigroup.com/), and for the combined tracking of fingers and objects tagged with fiducial markers, such as reacTIVision [1]. These and other existing software tools greatly simplify the programming of the input component, essential for this type of interface, but this solves only one part of the problem. The visual feedback and the graphical user interface, which often involve problems specific to tabletop computing, such as aligning the projector output with the camera input or correcting the distortion that results from the use of mirrors, still have to be programmed manually. Not to mention the underlying musical engine, our main reason after all for developing this type of application.

Taking these considerations into account, it may be difficult to acquire the skills required to program both the visual interface and the audio component, or even to find a single programming language or framework that supports these two components well.
A simple solution to this last problem, as presented in previous papers such as [4][3], is to divide the project into two different applications: one focused on the visual feedback and another focused on the audio and music processing. However, dividing the tasks does not eliminate the need for programming on both sides. The system we present here has been designed to ease these technical difficulties.

2. MUSICAL TABLETOP CODING FRAMEWORK
The Musical Tabletop Coding Framework (MTCF) is an open-source framework for the creation of musical tabletop applications. It takes a step forward in simplifying the creation of tangible tabletop musical and audio applications by allowing developers to focus mainly on the audio and music programming and on designing the interaction at a conceptual level, because all the interface implementation is handled automatically.

MTCF provides a standalone program for the visual interface and the gesture recognition, which communicates directly with Pd [11], and which enables programmers to define the objects and their control parameters, as well as the potential relations and interactions between different objects, by simply instantiating a series of Pd abstractions. MTCF can be freely downloaded from GitHub (https://github.com/chaosct/Musical-Tabletop-Coding-Framework/downloads).
2.1 Description of the system
MTCF has been designed to be used with any type of tabletop surface that supports the detection of tagged tangible objects and multi-touch interaction, although it does not require both interaction modes. The only restriction on the hardware concerns the output protocol: its tracking system should comply with the TUIO protocol [9]. Otherwise MTCF imposes no restriction on the size or shape of the surface, allowing designs for rectangular surfaces as well as for circular ones such as the Reactable.

Our internal test hardware is the one used for the Reactable [7], with reacTIVision [8] as the tracking software (see Fig. 1). The data generated by reacTIVision (i.e. the position and orientation of all the tagged pucks and fingers) is sent to MTCF using the TUIO protocol. MTCF monitors all the incoming TUIO messages and forwards them, filtered, to Pd by means of the Open Sound Control (OSC) protocol [13]. From Pd, control messages and waveform data are also transmitted back to MTCF, which is in charge of permanently refreshing the visual display.

Figure 1: System diagram (tangibles tagged with fiducials and multitouch fingers on a diffuse-infrared-illuminated surface; the camera feeds reacTIVision, which sends TUIO to MTCF; MTCF exchanges OSC with Pure Data and drives the projector for visual feedback).
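To make the incoming half of this pipeline concrete, the following sketch listens to the TUIO stream that a tracker such as reacTIVision emits and that MTCF consumes internally. It is not part of MTCF: the python-osc package and reacTIVision's default UDP port 3333 are assumptions of the sketch, and the message layout follows the publicly documented TUIO 1.1 profiles for tagged objects (/tuio/2Dobj) and finger cursors (/tuio/2Dcur).

```python
# A minimal sketch of listening to the TUIO stream that reacTIVision emits
# and that MTCF consumes internally. Not part of MTCF; it assumes the
# python-osc package, reacTIVision's default UDP port 3333, and the
# TUIO 1.1 message layout for tagged objects and finger cursors.
from pythonosc.dispatcher import Dispatcher
from pythonosc.osc_server import BlockingOSCUDPServer


def on_2dobj(address, *args):
    # /tuio/2Dobj "set" messages carry: session id, fiducial id,
    # x, y, angle, followed by velocities and accelerations.
    if args and args[0] == "set":
        session, fiducial, x, y, angle = args[1:6]
        print(f"puck {fiducial}: x={x:.2f} y={y:.2f} angle={angle:.2f}")


def on_2dcur(address, *args):
    # /tuio/2Dcur "set" messages carry: session id, x, y,
    # followed by velocities and acceleration.
    if args and args[0] == "set":
        session, x, y = args[1:4]
        print(f"finger {session}: x={x:.2f} y={y:.2f}")


dispatcher = Dispatcher()
dispatcher.map("/tuio/2Dobj", on_2dobj)
dispatcher.map("/tuio/2Dcur", on_2dcur)

# reacTIVision sends TUIO as OSC bundles over UDP, port 3333 by default.
BlockingOSCUDPServer(("0.0.0.0", 3333), dispatcher).serve_forever()
```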
2.2 MTCF: Dealing with the input data and with the GUI
MTCF is itself implemented on top of openFrameworks (OF, http://www.openframeworks.cc/), a group of multi-platform libraries written in C++ and specially designed to assist the programming of creative applications.

MTCF also uses an external OF add-on, ofxTableGestures, which we had previously implemented with the aim of assisting multi-purpose (i.e. not necessarily musical) tabletop application design. ofxTableGestures already solves some of the typical problems that appear in the development of generic tabletop applications, such as dealing with the incoming tracking messages or correcting the distortion and alignment of the graphical output. But ofxTableGestures is meant for OF programmers, which means that using it still requires programming in C++. In that sense MTCF, built in its turn on top of ofxTableGestures, can be seen as a specialised and simplified subset of ofxTableGestures: while it does not expose all of ofxTableGestures' functionality, it simplifies the programming tasks enormously by putting everything on the Pd side. Although no understanding of how ofxTableGestures works is needed to fully exploit MTCF's potential, we next describe some of its basic features in order to give a clearer idea of the whole architecture.

ofxTableGestures is itself divided in two parts: TUIO input and graphics output. The TUIO input part processes the messages that arrive from any TUIO-compliant application (e.g. reacTIVision). Once these messages are processed, this component detects and generates gestural events for the top-level programmer. The graphics part, in turn, helps to create drawable objects while applying the distortion correction to everything that is drawn. ofxTableGestures also includes a self-contained tabletop simulator, which simulates figures and multiple-finger interaction, allowing applications to be tested without the need of a real table (see Fig. 2). When the simulator is enabled, a panel with a subset of figures is shown on the right side of the screen. These figures are labelled with the identifier that will be reported to the system in the TUIO messages. In order to maintain fidelity between the physical table and the simulator, the figures used in the simulator match in size and shape the real ones in our setup. ofxTableGestures includes six different figure shapes (circle, square, star, rounded square, pentagon and dodecahedron), which are defined in a configuration file that specifies the figure shape, the figure identifier and the figure colour.

Figure 2: Simulator screenshot.

MTCF receives data from the TUIO application, processes it, displays the graphic feedback and sends the filtered data to Pd via OSC messages. At this stage, MTCF only draws the figure shapes and the fingers' visual feedback, all in their correct positions. The remaining graphics (such as the waveforms and the relations between the figures) are drawn in a second step, according to the additional information that is sent back via OSC messages from Pd to MTCF. This will be addressed in the next section.

By default, MTCF pucks only convey three basic parameters: X position, Y position and rotation angle. Additional parameters can be enabled from Pd for any specific object. This optional additional information includes parameters resulting from the relations between pairs of pucks (the distance and angle between them) as well as parameters resulting from finger interaction on a given puck, which can have two extra widgets (object bar and finger slider) that can be activated from Pd, as shown in Fig. 3. These parameters are displayed as two semicircular lines surrounding the puck, keeping their orientation towards the centre of the table.

Figure 3: Tangibles with different feedbacks and controllers.

Object bars convey a value between 0 and 1 that can be changed by rotating the tangible. The finger slider, represented by a thinner line with a dot that can be moved with a finger, also ranges between 0 and 1. In the next section we will concentrate on the Pd side of MTCF.

2.3 Using MTCF from Pure Data
MTCF was designed to be used along with Pd, as this has become one of the most popular languages for real-time audio processing and programming. The main idea of this framework was to allow expert Pd users to interface their patches with a tangible tabletop setup. For this, MTCF provides nine Pd abstractions that communicate transparently with MTCF, and that are used to define the objects, the relations between them, and the data the programmer wants to capture from the tabletop interface. Not all of these abstractions have to be used in every patch, as this will depend on the affordances of the intended musical application interface.

Figure 4: The MTCF Pd abstractions: [basicTangibleTabletop localhost], [Object 10], [Fingers], [objDistance 20 28], [finger], [connectWave 11 12], [drawWave], [BGchanger] and [FCchanger].

Only one abstraction is mandatory, and it is responsible for all OSC communication between the Pd patch and MTCF: [basicTangibleTabletop]. Its single argument is the address of the computer running MTCF. This will typically be localhost, although changing this address can be useful in some situations, such as testing several projects (on different laptops) with only one tabletop (which runs only the visual part). One and only one instance of this object must exist in the Pd program.

2.3.1 Defining Objects and Parameters
Some additional abstractions allow us to define which physical pucks will be used in the application. Instantiating [Object n] tells the system to include the object with the id code n.

As described in the previous section, a slider plus a [0, 1] rotary parameter can be activated around any puck. The (de)activation of these extra controllers is done in Pd, by sending messages to the associated [Object]. Only when these elements are active will Pd receive this additional information.

The outlets of [Object] output the presence of the puck (a Boolean), its position, its orientation and, if activated, its slider and rotary parameter values.

Inspired by the Reactable paradigm, which allows the creation of audio processing chains by connecting different objects (such as generators and filters), MTCF also permits using the relations between different pucks and can make them explicit. However, unlike the Reactable, MTCF is not limited to the creation of modular, subtractive synthesis processing chains; any object can relate to any other object independently of their nature. This allows, for example, easily creating and fully controlling a tangible frequency modulation synthesiser, by assigning each carrier or modulator oscillator to a different physical object, or a Karplus-Strong plucked string synthesiser, by controlling the extremes of a virtual string with two separate physical objects.

On the other hand, MTCF does not yet permit dynamic patching [10], so it is not capable of producing a fully functional Reactable clone; neither was this its main objective. In MTCF, the connections between the pucks have to be made explicit by the programmer in the Pd programming phase. This is attained by using [objDistance m n], which continuously reports the status of this connection and, if it exists, the angle and distance between objects m and n. The programmer can also specify whether this distance parameter should be drawn on the table, by sending a Boolean value into the [objDistance] inlet.
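The geometry behind such a relation parameter is straightforward. As an illustration only (this is not MTCF code, and the normalised coordinate range and angle convention are assumptions of the sketch), the distance and angle that [objDistance] reports between two pucks can be derived directly from the X and Y positions each puck already conveys:

```python
# Illustration only (not MTCF code): the kind of relation parameters that
# [objDistance m n] reports, derived from the X/Y positions of two pucks.
# The normalised 0..1 coordinate range and the angle convention (radians,
# measured from the positive x axis) are assumptions for this sketch.
import math


def puck_relation(xa, ya, xb, yb):
    """Return (distance, angle) of puck B as seen from puck A."""
    dx, dy = xb - xa, yb - ya
    distance = math.hypot(dx, dy)
    angle = math.atan2(dy, dx) % (2 * math.pi)
    return distance, angle


# Two pucks placed a quarter of the surface apart along the diagonal:
# distance is about 0.35 and the angle is pi/4 (about 0.79 rad).
print(puck_relation(0.25, 0.25, 0.50, 0.50))
```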
2.3.2 Drawing Waves
Also inspired by the Reactable, MTCF can easily show the "sound waves" going from one object to another. This can be achieved by using the [connectWave] object. This abstraction takes two parameters that indicate the two object numbers between which the wave should be drawn. As indicated before, this waveform does not necessarily represent the sound going from one object into the other; it can rather represent the sound resulting from the interaction between the two combined objects, or any other sound thread in the Pd patch.

An audio inlet and an audio outlet are used to take the waveform and to act as a gate, letting the audio pass only if the two pucks (and therefore the waveform) are on the surface. This ensures that no unintended sound will be processed or displayed when its control objects are removed. Additionally, a control inlet lets the patch activate and deactivate this connection.

This way of drawing waveforms has some consequences. First, waveforms are drawn by default between pucks, which makes it difficult to draw waveforms between two arbitrary points, or from one object to the centre, as the Reactable does. This can be overcome with a simpler Pd abstraction, [drawWave], which has exactly this purpose: drawing waves between two points.

The second, but very important, consequence is that the audio connection between two physical pucks is a Pd object itself. Instead of making Pd audio connections between [Object] abstractions, the programmer must therefore use [connectWave] abstractions, which simply send the waveform information to MTCF for drawing. This can be confusing, especially when chaining multiple physical pucks to imitate an audio processing chain, since the programmer must then consider all the possible combinations (Fig. 5).

Figure 5: A processing chain example. Puck 1 is a noise generator, puck 4 is a filter, and puck 3 is an audio sink (i.e. the speakers). The programmer must consider the connections both when puck 4 is present (1 → 4 → 3) and when it is not (1 → 3). (The patch shown combines [noise~], [bp~ 400 10] and [dac~] with [connectWave 1 4], [connectWave 4 3], [connectWave 1 3], [objDistance 1 4] and [objDistance 4 3].)
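The combinational bookkeeping that Fig. 5 illustrates can be summarised as a small decision rule. The sketch below is only an illustration of that rule, not MTCF code: it lists which links of a source-filter-sink chain should carry audio, given the set of pucks currently on the surface — exactly the combinations the Pd programmer has to account for explicitly.

```python
# Illustration only (not MTCF code): the presence-dependent bookkeeping
# sketched in Fig. 5. Given which pucks are on the surface, decide which
# connections of a source -> optional filter -> sink chain should pass
# audio; each open link corresponds to one active [connectWave].
def active_connections(on_surface, source=1, filt=4, sink=3):
    """Return the (from, to) audio connections that should be open."""
    if source not in on_surface or sink not in on_surface:
        return []                                   # nothing should sound
    if filt in on_surface:
        return [(source, filt), (filt, sink)]       # 1 -> 4 -> 3
    return [(source, sink)]                         # 1 -> 3, filter bypassed


print(active_connections({1, 3, 4}))  # [(1, 4), (4, 3)]
print(active_connections({1, 3}))     # [(1, 3)]
print(active_connections({1, 4}))     # [] (no sink on the surface)
```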
2.3.3 Extra features
For more advanced interaction, additional abstractions are also provided. [Fingers] gives full information on the position of all the fingers detected on the table, while [finger] can be used to extract information about individual fingers (see Fig. 6). These abstractions can be used to control less obvious parameters.

Figure 6: A Pd structure to receive information about the several fingers on the surface: a [Fingers] abstraction feeding a cascade of [finger] abstractions.
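As one hypothetical example of such a "less obvious parameter" (this particular mapping is not described in the paper), the positions reported for all the fingers on the surface could be reduced to a pair of aggregate controls, for instance their centroid and their average spread:

```python
# A hypothetical mapping (not described in the paper) of the data exposed
# by [Fingers]: all current finger positions are reduced to two aggregate
# controls, the centroid and the average spread around it, which a Pd
# patch could then use to drive synthesis parameters.
import math


def finger_controls(fingers):
    """fingers: list of (x, y) positions in normalised 0..1 coordinates."""
    if not fingers:
        return None
    cx = sum(x for x, _ in fingers) / len(fingers)
    cy = sum(y for _, y in fingers) / len(fingers)
    spread = sum(math.hypot(x - cx, y - cy) for x, y in fingers) / len(fingers)
    return (cx, cy), spread


# Three fingers forming a triangle: centroid (0.5, 0.4), spread of about 0.37.
print(finger_controls([(0.2, 0.2), (0.8, 0.2), (0.5, 0.8)]))
```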
Two additional abstractions can be used for visual purposes: [BGchanger] and [FCchanger], which respectively allow changing the background colour of the tabletop and the colour of the fingers' trailing shadows. Changing colours, for example according to audio features, can create very compelling effects.

3. CONCLUSIONS
The experience we have gained so far from using MTCF in two short half-day workshops indicates that MTCF is not only a very valuable tool for the quick development and prototyping of musical tabletop applications, but also an interesting system for fostering discussion and brainstorming about concepts of software synthesis control and interaction.

We are also aware that many issues can still be improved. While Pd experts quickly understand the framework's mechanisms and take full profit from it, producing interesting results in very short times, a few advanced users missed some higher-level control possibilities. At its current stage, MTCF is clearly oriented towards real-time sound synthesis and processing control, and it lacks higher-level and more structural controls that could communicate with Pd entities such as data arrays or sequences. In the near future we therefore plan to include more graphical interface features, probably making more extensive use of multi-touch interaction, in order to be able to control time-oriented and structured data such as envelopes or sequences of events.

4. ACKNOWLEDGMENTS
This work has been partially supported by TEC2010-11599-E (Ministerio de Ciencia e Innovación, Gobierno de España) and by Microsoft Research Cambridge.

5. REFERENCES
[1] R. Bencina, M. Kaltenbrunner, and S. Jordà. Improved topological fiducial tracking in the reacTIVision system. In Computer Vision and Pattern Recognition Workshops (CVPR Workshops), page 99. IEEE, 2005.
[2] Y. Fernaeus, J. Tholander, and M. Jonsson. Beyond representations: towards an action-centric perspective on tangible interaction. International Journal of Arts and Technology, 1(3):249–267, 2008.
[3] L. Fyfe, S. Lynch, C. Hull, and S. Carpendale. SurfaceMusic: Mapping virtual touch-based instruments to physical models. In Proceedings of the 2010 Conference on New Interfaces for Musical Expression, pages 360–363, Sydney, Australia, June 2010.
[4] J. Hochenbaum, O. Vallis, D. Diakopoulos, J. Murphy, and A. Kapur. Designing expressive musical interfaces for tabletop surfaces. In Proceedings of the 2010 Conference on New Interfaces for Musical Expression, pages 315–318, Sydney, Australia, June 2010.
[5] S. Jordà. Sonigraphical instruments: from FMOL to the reacTable. In Proceedings of the 2003 Conference on New Interfaces for Musical Expression (NIME '03), pages 70–76, Singapore, 2003. National University of Singapore.
[6] S. Jordà. On stage: the reactable and other musical tangibles go real. International Journal of Arts and Technology, 1:268–287, 2008.
[7] S. Jordà, G. Geiger, M. Alonso, and M. Kaltenbrunner. The reacTable: exploring the synergy between live music performance and tabletop tangible interfaces. In Proceedings of the 1st International Conference on Tangible and Embedded Interaction, pages 139–146. ACM, 2007.
[8] M. Kaltenbrunner and R. Bencina. reacTIVision: a computer-vision framework for table-based tangible interaction. In Proceedings of the 1st International Conference on Tangible and Embedded Interaction, pages 69–74. ACM, 2007.
[9] M. Kaltenbrunner, T. Bovermann, R. Bencina, and E. Costanza. TUIO: a protocol for table-top tangible user interfaces. In Proceedings of the 6th International Workshop on Gesture in Human-Computer Interaction and Simulation (GW 2005), Vannes, France, 2005.
[10] M. Kaltenbrunner, G. Geiger, and S. Jordà. Dynamic patches for live musical performance. In Proceedings of the 2004 Conference on New Interfaces for Musical Expression (NIME '04), pages 19–22, Singapore, 2004. National University of Singapore.
[11] M. Puckette. Pure Data: another integrated computer music environment. In Proceedings of the Second Intercollege Computer Music Concerts, pages 37–41, 1996.
[12] J. Schöning, P. Brandl, F. Daiber, F. Echtler, O. Hilliges, J. Hook, M. Löchtefeld, N. Motamedi, L. Muller, P. Olivier, et al. Multi-touch surfaces: A technical guide. Technical Reports of the Technical University of Munich, 2008.
[13] M. Wright and A. Freed. Open Sound Control: A new protocol for communicating with sound synthesizers. In Proceedings of the 1997 International Computer Music Conference, pages 101–104, 1997.