Abstract
This paper presents ImmertableApp, an innovative multimodal interface based on tangible interaction in which audio editing is managed through physical controllers. The system is composed of two main components: a tangible tabletop interface on which sound parameters can be changed by manipulating physical controllers, and a graphic editor interface, running on a tablet device, for configuring the controllers and their corresponding parameters. In this way, ImmertableApp adds to a tangible musical interface the possibilities of software interfaces: it can be personalized and adapted to different users, whether experts or beginners, and can therefore be used as a didactic tool for different concepts related to sound synthesis. The system underwent an early assessment with experts in order to obtain feedback about its utility in different fields of music education. The results of the evaluation provide the basis for interesting improvements in future versions.
1 Introduction
Since the development of the MIDI protocol in 1982, a great deal of sound control and generation hardware and software has appeared. The most popular systems for real-time interaction are hardware devices, which handle real-time interaction without difficulty but are very constrained in terms of use and interaction by the specific design of the device: the number of potentiometers, buttons, actuators and so on. Software-based systems, on the other hand, although much more powerful, are always limited by traditional mouse interaction. This is why, in most cases, musicians, composers and DJs prefer hardware devices to control their music creation applications. Nevertheless, when training novice users, the most commonly used applications are controlled with a mouse and keyboard, or are touch-based in the best-case scenario. Due to this virtualization of sound generation software and devices, users have lost the haptic sensation of touching the devices, which are now simply represented as images of faders and regulators projected on a screen, replicating the appearance of their physical equivalents. The relationship between the layout of the different virtual controls and the physical controls of the sound control device has therefore been lost. Non-expert users often find themselves in front of a window saturated with controls, unable to comprehend and make use of all the possibilities being offered. Moreover, in a graphical interface users usually interact with just one control at a time, while with a physical device it is possible to interact with several controls simultaneously.
In order to fill this gap, in this work we propose the use of tangible interaction as a way of recovering the possibilities that physical controls offer for sound editing and generation, with the added advantages of digital information, processing and visualization features. The proposed system is flexible and configurable through a graphical editor, and makes it easy to develop educational musical applications oriented to different kinds of users: children, people with physical or cognitive disabilities, the elderly, educators, therapists and so on.
The remainder of the paper is organized as follows. Section 2 presents the state of the art in tangible interaction and music. In Sect. 3, the whole environment is described, giving details about both main components: the graphical editor and the tangible environment. Next, Sect. 4 presents the results obtained from an early evaluation with experts, and finally, Sect. 5 focuses on the conclusions and the lines of future work that emerged from the assessment.
2 Background and Related Work
In recent years, new interaction techniques have arisen. These interaction styles go beyond the keyboard, mouse or touch screen, and are nowadays being used in different areas in order to achieve more natural communication between computer systems and users. Tangible User Interfaces (TUIs) are a particular case of natural interaction that joins the physical and digital worlds [1] through the manipulation of everyday objects for controlling and representing digital information.
Within TUIs, digitally augmented tabletops or interactive surfaces [2–4] opened the door to new, more natural and social ways of interacting with computers, since the interactive space of the surfaces is especially suitable for the development of educational and collaborative applications. In fact, most of the works based on physical object manipulation show important benefits for users who have difficulty accessing conventional technologies, such as very young children [5], users with disabilities [6] and elderly people [7].
In recent years, tabletop solutions have been used in different areas and scopes. In particular, they offer an interesting tool for the design of musical interfaces, whether for performance, composition or control of digital musical instruments [8, 9]. In this area, one of the best-known tangible interfaces for music creation and performance is the Reactable [4, 10]. The instrument is based on a tabletop, and the manipulation of specific objects on the surface allows performers to combine different elements, such as synthesizers, effects, sample loops or control elements, to create a new composition. Although it offers many possibilities, it was developed for DJs and expert musicians, and it is controlled with unintuitive controllers. In a similar way, AudioCubes [11] is a tangible user interface that allows users to explore and create dynamically changing sound. Each cube implements a DSP and a sound generator, or different musical sound processing techniques. The relative position of the cubes in space determines the interaction between the generator and processing cubes. However, the hardware complexity is very high, since each cube contains a DSP and an infrared communication module for audio exchange between modules, and the creation process is not intuitive.
Creative musical expression is also the aim of Nielsen et al. [12], who use modular robotics to create a platform oriented to the music creation process and the control of the Ableton Live© software. In this case, the music is prerecorded, and depending on the position of the physical building blocks, different sounds or styles are played. There is no sound generation or sound synthesis process. A validation of the system showed that "utilizing music technology in music education and out of school, may give children, or adults, the opportunity to be musically creative and create understanding of musical phenomena and structures".
More recently, Potidis and Spyrou presented Spyractable [13], an evolution of the Reactable with a different interface and a different arrangement of the objects, devoted to the sound synthesis process. The objects are "Reactable-like" tokens, and no didactic or learning approach is presented.
From the analysis of these works it can be deduced that, although there are several precedents for musical applications with tangible interaction, in most cases the target users are expert musicians, composers or DJs, and almost all of them use cube-shaped objects to control the sound. There are also other tabletop interactive systems for learning or playing with music, developed especially for children and novice users. Most of them are based on cubes or "pucks" [14, 15], objects representing instruments [16], cards [17] or tactile interaction [18]. However, whether aimed at experts or at children, none of them uses the standard controllers of music editing to manage the sound parameters, and none of them allows the musical environment to be configured and personalized in advance. ImmertableApp, instead, proposes to harness the potential of object manipulation for teaching the didactic aspects of music, adapting it to different user profiles and levels of knowledge in an easy and natural way.
3 Description of the Environment
The ImmertableApp environment is composed of two main components (see Fig. 1): a graphic music configuration editor and the tabletop tangible application with physical controllers.
Final users of the tabletop application (usually children and students) can create and modify sounds by manipulating the tangible musical controllers on the interactive surface. Each controller has a fiducial marker that allows it to be recognized by the reacTIVision library [19] and processed by the ToyVision framework [20]. In this way, manipulations on the surface of the tabletop generate variations in the sound synthesis parameters, which are processed and can be heard in real time through the sound synthesis engine, accessed via an API implemented with the PureData application [21]. The ways in which the controllers modify the different parameters and sound properties are set in advance (usually by teachers) using the graphic editor interface developed for a tablet device.
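Neither the ToyVision API nor the Pure Data patch is listed in the paper, but the flow from a controller manipulation to an audible change can be illustrated with a minimal sketch. The snippet below assumes a hypothetical callback fired by the tracking layer and a Pure Data patch listening for OSC messages on UDP port 9000; all names, addresses and ranges are illustrative.

```python
# Minimal sketch of the event pipeline (hypothetical names and values).
# Assumes: pip install python-osc, and a Pure Data patch receiving OSC
# messages on UDP port 9000 (e.g. via [netreceive -u -b 9000]--[oscparse]).
from pythonosc.udp_client import SimpleUDPClient

pd = SimpleUDPClient("127.0.0.1", 9000)  # assumed Pure Data endpoint

def on_fiducial_update(controller_id: int, angle_rad: float) -> None:
    """Hypothetical callback fired when the vision layer reports that a
    tracked knob controller has been rotated on the surface."""
    # Map the knob rotation (0..2*pi) to a pitch in Hz (assumed range).
    pitch_hz = 40.0 + (angle_rad / (2 * 3.141592653589793)) * (2000.0 - 40.0)
    pd.send_message("/synth/pitch", pitch_hz)  # hypothetical OSC address
```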
In the following subsections the different components of the system are explained in detail. First, the graphical editor interface is described; then the tangible interactive environment, detailing the tabletop, the physical controllers and the software that manages them all.
3.1 The Graphic Editor Interface
The graphic editor interface is an interactive touch application that runs on a tablet device (Android or iOS). Using drag-and-drop interaction, the user can configure the interactive environment of the musical application: he/she can define the different control objects involved in the activity and their layout, as well as the sound properties to associate with each controller for managing the musical creation (see Fig. 2). In this way, the graphical editor makes it possible to personalize and create musical activities that provide different modalities of sound generation and editing depending on the user's knowledge, or with didactic and learning aims, for teaching musical concepts to students at different levels.
Once the configuration of the controllers and properties for a specific activity is defined, the project is saved in two different XML files. These configuration files are stored, sent to, and later processed by the ToyVision framework [20], which associates the controller manipulations with the corresponding changes in the sound and with the visualization of the activity on the tabletop interface.
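The paper does not show the schema of these XML files; purely as an illustration, a configuration of this kind might pair each controller with the sound parameter a teacher assigned to it in the editor, so that ToyVision can read the mapping back. The element and attribute names below are assumptions.

```python
# Parsing a hypothetical activity configuration (illustrative schema).
import xml.etree.ElementTree as ET

CONFIG = """
<activity name="first-sounds">
  <controller id="1" type="knob"  parameter="pitch"  min="40" max="2000"/>
  <controller id="2" type="fader" parameter="volume" min="0"  max="1"/>
</activity>
"""

root = ET.fromstring(CONFIG)
mappings = {
    int(c.get("id")): (c.get("type"), c.get("parameter"),
                       float(c.get("min")), float(c.get("max")))
    for c in root.iter("controller")
}
print(mappings)  # {1: ('knob', 'pitch', 40.0, 2000.0), 2: ('fader', ...)}
```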
The editor interface was designed taking into account the recommendations of music teachers and following the style guides for the development of touch applications. The user interface was implemented in Adobe Flex with Adobe AIR technology, in order to be multiplatform and run on either an Android or an iOS device.
3.2 The Tangible Interactive Environment
The tangible interactive surface is based on the NIKVision tabletop [5] (see Fig. 3-left), which has been adapted to meet our requirements (see Fig. 3-right).
Technically, the tabletop uses the computer vision framework reacTIVision, freely available as open-source software [19], which tracks the objects placed on the surface by means of a fiducial marker attached to the base of each controller. An infrared USB camera captures video from underneath the table and streams it to the computer station that executes the visual recognition, visualization and audio generation software. Active image projection on the table is provided by rear projection through a mirror inside the table.
For audio generation in the tabletop, ImmertableApp incorporates a system based on PureData [21] and integrated with the ToyVision framework [20] for the management of the audio events and the corresponding visualizations, through an Application Programming Interface (API) that allows programmers to easily access the information coming from the tablet editor and from the tangible controllers.
The ToyVision framework reads the XML files previously generated with the graphic editor interface, renders the corresponding visualization on the tabletop surface and converts the necessary data into the OSC format so that it can be managed by the sound engine. Figure 4 shows the tabletop surface with the visualization of the previously created background layout and an oscillator physical controller on it. The background color projected onto the surface changes according to the fundamental frequency of the sound.
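The paper does not specify how the fundamental frequency is translated into a color. One plausible sketch, assuming a logarithmic mapping of pitch across the audible range onto the hue wheel, is:

```python
# Hypothetical pitch-to-color mapping for the projected background.
import colorsys
import math

def pitch_to_rgb(f0_hz: float, f_min: float = 20.0, f_max: float = 20000.0):
    """Map a fundamental frequency to an RGB triple via hue (assumption)."""
    hue = (math.log(f0_hz) - math.log(f_min)) / (math.log(f_max) - math.log(f_min))
    hue = min(max(hue, 0.0), 1.0)  # clamp to the audible range
    r, g, b = colorsys.hsv_to_rgb(hue, 1.0, 1.0)
    return int(r * 255), int(g * 255), int(b * 255)

print(pitch_to_rgb(440.0))  # color projected while A4 is sounding
```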
As explained before, the PureData software was used to implement the sound management. This approach allows us to include a sound generator inside the system without the need for another computer or even an external sound module.
To help users understand sound generation, additive sound synthesis [22] was implemented. Each sound is generated as the weighted sum of a number of harmonics. A global envelope with a typical ADSR (attack, decay, sustain, release) pattern is then applied to the whole sound. The envelope parameters, the harmonic amplitudes, the global volume, the fundamental frequency (pitch) and the duration of the sound are the main characteristics that can be modified by the user by means of the controller objects.
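As a concrete illustration of this synthesis model, the sketch below generates a tone as a weighted sum of harmonics and shapes it with a piecewise-linear ADSR envelope. The envelope times and harmonic weights are illustrative; the paper does not report the values used in the Pure Data patch.

```python
# Additive synthesis with a global ADSR envelope (illustrative parameters).
import numpy as np

SR = 44100  # sample rate in Hz

def adsr(n: int, attack: float, decay: float, sustain: float, release: float):
    """Piecewise-linear ADSR envelope over n samples (times in seconds)."""
    a, d, r = int(attack * SR), int(decay * SR), int(release * SR)
    s = max(n - a - d - r, 0)
    env = np.concatenate([
        np.linspace(0.0, 1.0, a, endpoint=False),      # attack
        np.linspace(1.0, sustain, d, endpoint=False),  # decay
        np.full(s, sustain),                           # sustain
        np.linspace(sustain, 0.0, r),                  # release
    ])
    return env[:n]

def additive_tone(f0: float, weights, duration: float, volume: float = 0.5):
    """Weighted sum of harmonics of f0, shaped by the global envelope."""
    t = np.arange(int(duration * SR)) / SR
    wave = sum(w * np.sin(2 * np.pi * f0 * (k + 1) * t)
               for k, w in enumerate(weights))
    wave /= max(sum(abs(w) for w in weights), 1e-9)  # avoid clipping
    return volume * adsr(len(t), 0.05, 0.1, 0.7, 0.2) * wave

tone = additive_tone(440.0, [1.0, 0.5, 0.25], duration=1.0)  # A4, 3 harmonics
```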
Set of Tangible Controllers.
The tangible controllers are conceived as replicas of the physical controllers commonly used in sound devices, and they allow musical control actions with better response time and feedback than virtual systems based on tactile interaction or image recognition [23].
In the context of music editing and composition there is a well-known set of physical controllers. Therefore, in this work, the design of the interactive objects is based on the physical devices usually found in synthesis, generation, music and sound composition applications, such as faders, potentiometers, buttons and actuators. In this way, we decided to create an intuitive set of controllers, with flexibility and expressivity, for configuring the new interactive environment. Each tangible controller set (see Fig. 5) is formed by one or more controllers, which are the physical elements in charge of modifying the different sound parameters.
The objects have been prototyped using a 3D printer, and some of them include LED illumination and Arduino-based sensors and actuators. The different objects generated for sound control are listed below (see Fig. 6); a sketch of how their readings might map to parameter values follows the list:
- Oscillator: produces a periodic sound signal (wave). There are four kinds of waves: sinusoidal, square, triangular and sawtooth.
- Switch: a button with two states: active/inactive.
- Knob: a 360° rotary control that increases or decreases a digital variable according to the direction of rotation.
- Fader: a vertical or horizontal slider that regulates the value of a digital variable between a minimum and a maximum.
- Regulator: a wheel with an equilibrium position, movable between a minimum and a maximum position.
- Touch-pad: a rectangular tactile surface on which the user can control a two-dimensional variable by dragging a finger or a small piece across the surface. It allows working with harmonics and envelopes.
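Whatever its physical form, each controller ultimately delivers a reading that must be scaled to the range of the sound parameter it drives. A minimal sketch of this normalization step follows; the parameter names and ranges are assumptions, since the paper does not list them.

```python
# Hypothetical normalization of controller readings to parameter values.
PARAM_RANGES = {"pitch": (40.0, 2000.0), "volume": (0.0, 1.0)}  # assumed

def normalize(raw: float, raw_min: float, raw_max: float) -> float:
    """Clamp and scale a raw reading (vision angle, sensor value...) to [0, 1]."""
    return min(max((raw - raw_min) / (raw_max - raw_min), 0.0), 1.0)

def to_param(name: str, raw: float, raw_min: float = 0.0,
             raw_max: float = 1023.0) -> float:
    lo, hi = PARAM_RANGES[name]
    return lo + normalize(raw, raw_min, raw_max) * (hi - lo)

print(to_param("pitch", 512))  # mid-travel fader -> mid-range pitch
```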
Users make up their own spatial configurations by joining several controllers (each one has magnetized sides) and placing them over the visualization projected on the surface of the tabletop (see Fig. 7). At present, only one oscillator, which defines the basic sound, is allowed per set of controllers. All changes in the sound properties are applied to this basic sound.
4 Evaluation and Discussion: First Results
Given the didactic orientation of our system, for this first version we decided to carry out an informal evaluation of ImmertableApp with experts only, in order to obtain early feedback on the utility of the system in different fields of music education. Ten music teachers who work at different levels of education (from preschool to university, including special education) attended a two-hour evaluation session in our lab. Four researchers acted as observers in this session, taking notes and recording the comments and discussions among the experts.
The main components of ImmertableApp were tested. The assessment was conducted with several aims in mind. First, the value of the tabletop interface as a didactic tool was investigated. Second, the usability of the graphic editor was explored. Finally, the design of the graphic editor and the physical controllers was analyzed.
In the session, a 10-minute presentation first gave the music teachers a complete overview of the components. The teachers then used and played with the tabletop for another 10 min. To assess the didactic possibilities of the tool, three activities were carried out: a focus group, team work and the completion of a questionnaire.
The focus group session lasted 40 min. All the teachers analyzed and discussed the pros and cons of the system, focusing especially on the possibility of using the tabletop device in their classrooms. The teachers most interested in using the tabletop were those from the special education area, in particular those working with deaf students, autistic children or very young kids. One teacher from a school for deaf children showed great interest in continuing to work on this project because of the potential the tabletop offered, since it could allow students to "see" the music (for example, by changing the color of the surface depending on the sound being reproduced). Most of her students had partial deafness, so she found it very attractive that they could touch and manipulate the sound parameters using the controllers.
Next, the experts were divided into two groups of five people each, with the aim of defining, in 40 min, possible didactic activities (for different educational levels) to carry out with the system (editor and tabletop). The results of the two groups were quite different: one rapidly reached a solution whose complexity could easily be modulated for use from primary school to university; the other group, instead, spent almost all the time producing a single proposal, since they were continuously engaged in discussion.
Finally, the teachers were given a questionnaire to measure the value of the tabletop interface as a didactic tool. It covered different aspects: functional, pedagogical and esthetic aspects, and accessibility. The results reveal that, regarding functional aspects, the tabletop interface is perceived as easy to use and offers didactic efficacy and flexibility. Regarding pedagogical aspects, the tabletop is considered a very motivating tool, but the teachers also detected potential problems when working in classrooms with more than 10 students. This would be a problem except in special education or very young children's classes, since these usually work in small groups. As a solution, they generally proposed projecting the tabletop surface onto a whiteboard, or using it as a collaboration tool for only a small number of students (no more than 5), placing it in tutorial classes or setting it up as a rotating activity. Concerning esthetics, almost 60% of the teachers considered the tabletop attractive, but the rest had doubts about its design. Finally, almost all the teachers believe that the tabletop favors accessibility, since it takes different communication codes into account.
In order to evaluate the usability of the graphic editor, the teachers filled in a SUS questionnaire [24] (a popular method for evaluating perceived usability) about their experience while using the editor. The mean score was 76.75 (where 100 is the perfect score), indicating rather high user satisfaction with the graphic editor. However, some aspects must be improved. For example, the editor should be more intuitive to use, since almost all the teachers agreed with the fourth item: "I think that I would need the support of a technical person to be able to use this system". For a proper verification of the graphic editor's usability, a dedicated session with a specific task, such as re-creating a classroom activity, must be carried out.
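For reference, the SUS score reported above follows Brooke's standard scoring [24]: odd-numbered (positively worded) items contribute (score − 1), even-numbered (negatively worded) items contribute (5 − score), and the sum is scaled by 2.5 to a 0-100 range. The sketch below computes it for one respondent; the response values are invented, since individual answers are not reported.

```python
def sus_score(responses):
    """SUS score (0-100) from ten 1-5 Likert responses (Brooke, 1996)."""
    assert len(responses) == 10
    total = sum((r - 1) if i % 2 == 0 else (5 - r)  # odd items: r-1; even: 5-r
                for i, r in enumerate(responses))    # i is 0-based here
    return total * 2.5

print(sus_score([4, 2, 4, 3, 4, 2, 5, 2, 4, 2]))  # invented answers -> 75.0
```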
Finally, regarding the design of the physical controller objects, the teachers said they were very intuitive and very easy to manipulate, although a little small for some special cases, such as children with motor disabilities. This issue has to be studied in more depth in each classroom and group before conducting an evaluation involving children.
5 Conclusions
The system presented in this article is a step toward improving the teaching and learning of music editing, through a tangible tabletop interface, without losing the sensations felt when using physical controllers. The presented system goes beyond current work by allowing the design and personalization of new musical activities using a tablet, in an intuitive and simple way. The system is therefore formed by two components: a graphical editor and a tabletop interface with physical objects for controlling the music editing. Both parts of the system underwent an early evaluation and the results are very encouraging. The experts highlighted its potential for working with small groups of children, for students with motor or cognitive limitations, and especially for deaf children. However, this early assessment also revealed aspects susceptible to improvement, which must be verified with a more rigorous and systematic evaluation. The graphic editor has to be used by teachers for the design of their daily musical activities, and the resulting activities must later be evaluated using the tabletop and the controllers in the classroom, as in similar works that use tangible interaction with children [25–27]. Moreover, it would be very interesting to use MINUET [9], a framework for musical interface design, in order to position our work in a structured design space, to elaborate ideas and objectives when designing a new musical interface, and to guide the evaluation process. This could be a helpful tool for analyzing our development.
On the other hand, the physical controllers designed in this work only support passive interaction (they react to users' actions). In the near future, however, they could include active interaction (produced by the system). For this purpose, micro-controllers connected to electronic sensors and actuators, together with a wireless communication module, will be used [28]. In this way, new types of sensors and actuators, such as pressure sensors, capacitive sensors and vibration elements, will be included in the controllers, making the interaction richer and more motivating.
References
Ullmer, B., Ishii, H.: Emerging frameworks for tangible user interfaces. IBM Syst. J. 39(3.4), 915–931 (2000)
Smithson Martin. https://smithsonmartin.com/products/emulator-elite/. Accessed 15 Feb 2016
Smartable: Gorenje design studio. http://www.smar-table.com/en. Accessed 15 Feb 2016
Kaltenbrunner, M., Jordà, S., Geiger, G., Alonso, M.: The reacTable*: a collaborative musical instrument. In: 15th IEEE International Workshops on Enabling Technologies: Infrastructure for Collaborative Enterprises, WETICE 2006, pp. 406–411 (2006)
Marco, J., Cerezo, E., Baldassarri, S., Mazzone, E., Read, J.C.: Bringing tabletop technologies to Kindergarten children. In: Proceedings of the 23rd British HCI Group Annual Conference on People and Computers: Celebrating People and Technology, pp. 103–111. British Computer Society (2009)
Li, Y., Fontijn, W., Markopoulos, P.: A tangible tabletop game supporting therapy of children with cerebral palsy. In: Markopoulos, P., de Ruyter, B., IJsselsteijn, W.A., Rowland, D. (eds.) Fun and Games 2008. LNCS, vol. 5294, pp. 182–193. Springer, Heidelberg (2008)
Al Mahmud, A., Mubin, O., Shahid, S., Martens, J.B.: Designing and evaluating the tabletop game experience for senior citizens. In: Proceeding of the NordiCHI, 20–22 October 2008
Lyons, M.J., Mulder, A., Fels, S.: Introduction to designing and building musical interfaces. In: Proceedings of the Extended Abstracts of the 32nd Annual ACM Conference on Human Factors in Computing Systems. ACM (2014)
Morreale, F., De Angeli, A., O’Modhrain, S.: Musical interface design: an experience-oriented framework. In: Proceedings of NIME, vol. 14, pp. 467–472 (2014)
Jordà, S., Geiger, G., Alonso, M., Kaltenbrunner, M.: The reactable: exploring the synergy between live music performance and tabletop tangible interfaces. In: Proceedings of the 1st International Conference on Tangible and Embedded Interaction, pp. 139–146. ACM (2007)
Schiettecatte, B., Vanderdonckt, J.: AudioCubes: a distributed cube tangible interface based on interaction range for sound design. In: Proceedings of the 2nd International Conference on Tangible and Embedded Interaction, pp. 3–10. ACM (2008)
Nielsen, J., Bærendsen, N.K., Jessen, C.: RoboMusicKids. In: IEEE International Workshop on Digital Game and Intelligent Toy Enhanced Learning, DIGITEL 2008, pp. 149–156 (2008). doi:10.1109/DIGITEL.2008.25
Potidis, S., Spyrou, T.: Spyractable: a tangible user interface modular synthesizer. In: Kurosu, M. (ed.) HCI 2014, Part II. LNCS, vol. 8511, pp. 600–611. Springer, Heidelberg (2014)
Parra-Damborenea, J.: Reactblocks: a 3D tangible interface for music learning. Master's thesis, Pompeu Fabra University (2014)
Costanza, E., Shelley, S.B., Robinson, J.: Introducing audio d-touch: a tangible user interface for music composition and performance. In: Proceedings in Human Computer Interaction (HCI). ACM (2003)
Bischof, M., Conradi, B., Lachenmaier, P., Linde, K., Meier, M., Pötzl, P., André, E.: Xenakis: combining tangible interaction with probability-based musical composition. In: Proceedings of the 2nd International Conference on Tangible and Embedded Interaction, pp. 121–124. ACM (2008)
Francesconi, J.I., Larrea, M., Manresa-Yee, C.: Tangible music composer for children. J. Comput. Sci. Tech. 13, 84–90 (2013)
Patten, J., Recht, B., Ishii, H.: Audiopad: a tag-based interface for musical performance. In: Proceedings of the 2002 Conference on New Interfaces for Musical Expression (2002)
reacTIVision. http://www.reactivision.com. Accessed 15 Feb 2016
Marco, J., Baldassarri, S., Cerezo, E.: ToyVision: a toolkit to support the creation of innovative board-games with tangible interaction. In: Proceedings of the 7th International Conference on Tangible, Embedded and Embodied Interaction, pp. 291–298. ACM (2013)
PureData. https://puredata.info/. Accessed 15 Feb 2016
Roads, C.: The Computer Music Tutorial. MIT press, Cambridge (1996)
Schöning, J., Brandl, P., Daiber, F., Echtler, F., Hilliges, O., Hook, J., von Zadow, U.: Multi-touch surfaces: a technical guide. IEEE Tabletops Interact. Surf. 2, 11 (2008)
Brooke, J.: SUS-A quick and dirty usability scale. Usability Eval. Ind. 189(194), 4–7 (1996)
Villafuerte, L., Markova, M., Jordà, S.: Acquisition of social abilities through musical tangible user interface: children with autism spectrum condition and the reactable. In: CHI 2012 Extended Abstracts on Human Factors in Computing Systems, pp. 745–760. ACM (2012)
Chen, W.: Multitouch tabletop technology for people with autism spectrum disorder: a review of the literature. Procedia Comput. Sci. 14, 198–207 (2012)
Nikolaidou, G.N.: ComPLuS model: a new insight in pupils’ collaborative talk, actions and balance during a computer-mediated music task. Comput. Educ. 58(2), 740–765 (2012)
Marco, J., Cerezo, E., Baldassarri, S.: Lowering the threshold and raising the ceiling of tangible expressiveness in hybrid board-games. Multimedia Tools Appl. 75(1), 425–463 (2016)
Acknowledgements
The authors want to thank the ten music teachers who participated in the evaluation session for their ideas, as well as the Education Faculty professors of the University of Zaragoza, Marta Liesa, Sandra Vázquez and Ana Cristina Blasco, for their support and collaboration. This work has been partly financed by the Spanish "Ministerio de Economía y Competitividad" through project No. TIN2015-67149-C3-1R and by the "Diputación General de Aragón" through the project ImmertableApp No. 1004460/2015.