
Towards Ambient Augmented Reality with Tangible Interfaces

Abstract. Ambient Interface research has the goal of embedding technology that disappears into the user's surroundings. In many ways Augmented Reality (AR) technology is complementary to this, in that AR interfaces seamlessly enhance the real environment with virtual information overlay. The two merge together in context-aware Ambient AR applications, which allow users to easily perceive and interact with Ambient Interfaces by using AR overlay of the real world. In this paper we describe how Tangible Interaction techniques can be used for Ambient AR applications. We present a conceptual framework for Ambient Tangible AR Interfaces, a new generation of software and hardware development tools, and methods for evaluating Ambient Tangible AR Interfaces.

Mark Billinghurst, Raphaël Grasset, Hartmut Seichter, and Andreas Dünser

The Human Interface Technology Laboratory New Zealand (HIT Lab NZ), University of Canterbury, Private Bag 4800, Christchurch, New Zealand
{mark.billinghurst, raphael.grasset, hartmut.seichter, andreas.duenser}@hitlabnz.org

Keywords: Augmented Reality, Ambient Interfaces, Tangible Interfaces.

1 Introduction

One of the overarching goals of human-computer interaction is to make the computer vanish and to allow technology to invisibly assist people in their everyday real-world tasks. Over the last several decades a number of compelling visions have been presented showing how this may be achieved, such as Weiser's concept of Ubiquitous Computing [1], Norman's Invisible Computing [2] and Dourish's 'Embodied Interaction' [3]. Similar to this earlier work, Ambient Intelligence [4] has the goal of embedding technology that disappears into the user's surroundings. Ambient Intelligence (AmI) typically refers to electronic environments that are sensitive and responsive to the presence of people.
In developing invisible interfaces, providing information display back to the user is a key element. Many AmI applications use traditional screen-based or projected displays. However, one of the more interesting approaches to information display is Augmented Reality. Augmented Reality (AR) applications are those in which three-dimensional computer graphics are superimposed over real objects, typically viewed through head-mounted or handheld displays [5]. In many ways Augmented Reality technology is complementary to AmI, in that AR interfaces seamlessly enhance the user's real environment with virtual information overlay. The two merge together in context-aware Ambient AR applications, which allow users to easily perceive and interact with Ambient Interfaces by using AR overlay of the real world. For the time being, display and tracking technologies are only stepping stones towards achieving truly non-intrusive interfaces.

J.A. Jacko (Ed.): Human-Computer Interaction, Part III, HCII 2009, LNCS 5612, pp. 387–396, 2009. © Springer-Verlag Berlin Heidelberg 2009

Fig. 1. An Ambient AR interface showing real-time temperature information superimposed over the real world

Ambient AR applications are those which use AR technology to represent context information from an Ambient Interface. For example, Rauhala et al. [6] developed an AR interface which shows the temperature distribution of building walls. They embedded temperature sensors in room walls and wirelessly sent the temperature information to a mobile phone AR interface. When the user pointed their phone at a wall, the phone screen showed a virtual image of the current temperature distribution superimposed over live video of the wall. In this way AR technology provides a natural way to make visible the invisible context information captured by the Ambient Interface application.
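A system like Rauhala et al.'s can be thought of as a short pipeline from sensor readings to a colour-coded overlay. The sketch below is our own illustration, not code from their system: the function name, temperature scale and colour mapping are all assumptions chosen to show how a raw reading might become a semi-transparent AR overlay colour.

```python
def temp_to_rgba(temp_c, t_min=10.0, t_max=40.0):
    """Map a wall temperature (deg C) to a semi-transparent
    blue-to-red overlay colour for AR rendering.

    Hypothetical helper: the scale and alpha are illustrative
    choices, not taken from the Rauhala et al. system.
    """
    # Normalise into [0, 1], clamping out-of-range readings.
    t = (temp_c - t_min) / (t_max - t_min)
    t = max(0.0, min(1.0, t))
    # Cold readings render blue, hot readings red.
    red, blue = t, 1.0 - t
    alpha = 0.5  # keep the live video visible underneath
    return (red, 0.0, blue, alpha)

# Each sensor reading becomes one coloured patch on the wall overlay.
readings = {(0, 0): 18.0, (0, 1): 25.0, (1, 0): 34.0}
overlay = {cell: temp_to_rgba(t) for cell, t in readings.items()}
```

In a real deployment the readings would arrive wirelessly from the embedded sensors, and the overlay patches would be registered to the tracked wall surface.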
Although AR technology is very promising, much research remains to be conducted on how to interact with Ambient AR applications. While there has been substantial research in Augmented Reality, much of it has focused on the underlying technology (such as tracking and display devices) rather than on the user experience and interaction techniques. Interaction with AR environments has usually been limited to either passive viewing or simple browsing of virtual information registered to the real world. Few systems provide tools that let the user interact with, request, or modify this information effectively and in real time. Furthermore, even basic interaction tasks, such as manipulation, copying, annotating, and dynamically adding and deleting virtual objects in the AR scene, have been poorly addressed.

In our research we are exploring new interaction techniques for Ambient AR interfaces. In this paper we describe how Tangible Interaction concepts can be used to design Ambient AR applications. Users already know how to manipulate real-world objects, so by building interaction techniques around object manipulation very intuitive interfaces can be developed. We describe a Tangible Interface design framework and subsequently demonstrate how it can be used to support the development of Ambient AR Interfaces. Based on this we introduce our in-house authoring tools that can be used to develop Ambient AR interfaces based on Tangible AR techniques, and finally we look at the methodologies and evaluation concepts relevant for these techniques.

2 Related Work

Our research is based on earlier work in the areas of Augmented Reality, Ambient Interfaces and Tangible User Interfaces. Tangible user interfaces (TUIs) are physical objects which translate user actions into input events in the computer interface [7].
Thus, tangible user interfaces make virtual objects accessible to the user through physical proxies. TUIs can be seen as an implementation of the 'direct manipulation' concepts described by Shneiderman [8]. Though not conceived as TUIs as such, Shneiderman describes parameters like immediacy and haptic quality as important concepts for fostering physical engagement with an object, for the purpose of lowering the user's mental load. Shneiderman's idea of 'direct manipulation' addresses the abstract nature of digital interfaces as they evolve over time. Over a relatively short period, more and more features have accumulated in digital interfaces. In addition, legacy functionality was retained, and consequently legacy modes of interaction remain. In turn, new, emerging interfaces are forced to circumvent ambiguities by reinventing modes of engagement, even with tangible objects. Modes of 'direct manipulation' therefore need new interfaces, or a removal of legacy modes. Shneiderman identifies three core properties of 'direct manipulation' [8]:

• Continuous representation of the object of interest
• Physical actions or labeled button presses instead of complex syntax
• Rapid, incremental, reversible operations whose impact on the object of interest is immediately visible

Hutchins et al. [9] elaborated, on the basis of user observation, that direct manipulation achieves two main benefits for information retrieval: first, the user is relieved from interpreting the representation and is consequently able to focus on the goal rather than the process; second, the mnemonics are tied to an external instance and as such do not change modes. A direct link between object and action is therefore maintained – a crucial concept for TUIs in augmented reality. Tangibility and the need for direct manipulation can thus be identified as important concepts for interaction with ambient interfaces.
Physical representation is only a small part of those concepts. More important is the representation of flow and logic, which provides an important clue to the understanding of information. The 'Universal Constructor' developed by Frazer et al. [10] used the metaphor of urban structures as a networked system. Physical cubes represent the spatial relationship of autonomous working units interconnected as nodes. Hence, it was essential for users of the installation to understand the flow of information between the nodes: ambient meaning conveyed through the link between the real and virtual objects.

Current Tangible interfaces provide very intuitive manipulation of digital data, but limited support for viewing 3D virtual objects. For example, in the Triangles work [11], physical triangles are assembled to tell stories, but the visual representations of the stories are shown on a separate monitor distinct from the physical interface. Presentation and manipulation of 3D virtual objects on projection surfaces is difficult, particularly when trying to support multiple users each with independent viewpoints. In contrast, Augmented Reality technology can be used to merge the display space and interaction space. We therefore believe that a promising new AR interface metaphor can arise from combining the enhanced display possibilities of Augmented Reality with the intuitive manipulation of Tangible User Interfaces. We call this combination Tangible Augmented Reality. In the next section we describe the Tangible AR metaphor in more detail and show how it can be used to provide seamless interaction in Ambient Intelligent interfaces.

3 Ambient AR Interface Conceptual Framework

The combination of TUI and AR techniques provides an interaction metaphor which we call Tangible Augmented Reality (Tangible AR) [12].
Tangible AR interfaces are those in which: 1) each virtual object is registered to a physical object, and 2) the user interacts with virtual objects by manipulating the corresponding tangible objects. In the Tangible AR approach the physical objects and interactions are equally as important as the virtual imagery, and they provide an intuitive way to interact with the AR interface. One of the most important outcomes of developing the Tangible AR metaphor is that it provides a set of design guidelines that can be used to develop effective interfaces. In designing Tangible AR interfaces there are three key elements that must be considered (Figure 2):

(1) The physical elements in the system
(2) The visual and audio display elements
(3) The interaction metaphor that maps interaction with the real world to virtual object manipulation

A Tangible AR interface provides true spatial registration and presentation of 3D virtual objects anywhere in the physical environment, while at the same time allowing users to interact with this virtual content using the same techniques as they would with a real physical object. An ideal Tangible AR interface thus facilitates seamless display and interaction. This is achieved by using the design principles learned from Tangible User Interfaces, including:

• The use of physical controllers for manipulating virtual content
• Support for spatial 3D interaction techniques (such as object proximity)
• Support for both time- and space-multiplexed interaction
• Support for multi-handed interaction
• Matching the physical constraints of the object to the task requirements
• The ability to support parallel activity with multiple objects
• Collaboration between multiple participants

We can extend this framework in the context of ambient applications, as illustrated in Figure 3.
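The three-part structure above (physical elements, display elements, and an interaction metaphor mapping physical manipulation to virtual operations) can be made concrete as a small data model. The classes and names below are our own illustrative reduction, not part of any published toolkit: each physical element carries the virtual content registered to it, and the metaphor is a lookup from physical actions to operations on that content.

```python
from dataclasses import dataclass, field

@dataclass
class TangibleARElement:
    """One physical object in a Tangible AR interface (illustrative model)."""
    physical_id: str        # e.g. a fiducial marker id on the object
    virtual_content: str    # the virtual model registered to it
    pose: tuple = (0.0, 0.0, 0.0)  # tracked position of the physical object

@dataclass
class TangibleARInterface:
    elements: dict = field(default_factory=dict)
    # Interaction metaphor: physical manipulation -> virtual operation.
    metaphor: dict = field(default_factory=dict)

    def manipulate(self, physical_id, action):
        """Apply the mapped virtual operation when an object is handled."""
        element = self.elements[physical_id]
        operation = self.metaphor[action]
        return operation(element)

# Example: rotating a tangible paddle scales the virtual model registered to it.
ui = TangibleARInterface()
ui.elements["paddle"] = TangibleARElement("paddle", "house_model")
ui.metaphor["rotate"] = lambda e: f"scale {e.virtual_content}"
result = ui.manipulate("paddle", "rotate")
```

The point of the model is the separation of concerns the framework calls for: tracking updates `pose`, the display layer renders `virtual_content` at that pose, and the metaphor table is the single place where physical actions acquire virtual meaning.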
Fig. 2. Tangible AR Interface Components

Fig. 3. On the left, a traditional Tangible AR application. On the right, a Tangible Interface used with an Ambient Augmented Reality application.

In this environment, the control of a tangible interface can act on sensors or actuators embedded in the environment, with Augmented Reality providing support for visual information. For example, different tangible cubes can be shifted and rotated on the surface of a table to change the lighting intensity and colour in a room. By complementing this with Augmented Reality, the user can have a virtual preview of the final result. Using another interaction metaphor, a user could move a tangible cube on a 2D plane to control the distribution of air flow in a room from real fans. Compared to a more traditional remote command interface, a user will benefit from the more intuitive interaction of a TUI, using spatial constraints and a better control/display mapping (in our scenario, moving the cube to a certain location will concentrate airflow in that location).

We can thus re-use the previously described framework in the context of an Ambient Application. However, some inherent characteristics of Ambient Applications can be illustrated here. Firstly, the output components will not only be virtual but also physical (actuators, sensors, etc.). Secondly, Ambient AR Applications intrinsically use sparse distributions of sensors and actuators within a room, building, or urban environment. Finally, the definition and design of an efficient interaction metaphor will certainly be more challenging, since the user manipulates physical elements in close proximity while visual feedback may be provided at a distance (e.g. interaction on a table vs. sensors mounted in the room). We can thus redraw the previously presented Tangible AR metaphor incorporating these new factors, as illustrated in Figure 4.
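The control/display mapping in the airflow scenario can be sketched directly. The code below is a hedged illustration (the fan layout and the inverse-distance weighting scheme are our own assumptions, not from a deployed system): the cube's 2D table position is mapped to per-fan power so that airflow concentrates near wherever the cube is placed.

```python
def fan_powers(cube_xy, fan_positions, total_power=1.0):
    """Distribute fan power so airflow concentrates where the tangible
    cube is placed (illustrative mapping, not from a real installation).

    Fans closer to the cube's table position receive a larger share,
    using inverse-distance weighting.
    """
    weights = []
    for fx, fy in fan_positions:
        dist = ((cube_xy[0] - fx) ** 2 + (cube_xy[1] - fy) ** 2) ** 0.5
        weights.append(1.0 / (dist + 0.1))  # offset avoids division by zero
    total = sum(weights)
    return [total_power * w / total for w in weights]

# Two fans at opposite table corners; the cube is placed near the first fan.
powers = fan_powers((0.1, 0.1), [(0.0, 0.0), (1.0, 1.0)])
```

The same cube position drives both the physical actuators and the AR preview, which is exactly the tight control/display coupling the scenario argues for.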
Fig. 4. Ambient Tangible AR Interface

4 Ambient Augmented Reality – The Tools

In order to develop Ambient AR applications there is a need for tools for programmers and application developers. Higher-level AR authoring tools address the need for interactivity, and thus user input of any kind, in AR environments. They are essential in providing a pathway for interface designers to prototype or create ambient interfaces. These tools can be developed on different levels: high-level, GUI-driven, feature-rich tools and frameworks for programming environments, and low-level development libraries for computer vision or input fusion. Usually one level sits on top of the other, and the various authoring tools are geared towards a certain application domain.

4.1 Authoring Software

There are a number of software tools that have been developed for high-level AR authoring. For example, DART [13], a plug-in for Adobe Director, inherently has access to a wealth of pre-existing infrastructure. ImageTclAR [14] introduced a more rigid framework which was only capable of compile-time definition of interactions. APRIL [15], in comparison, addresses the connection between real and virtual environments with a much higher-level framework. It provides an extensible AR authoring platform based on XML descriptions. However, interactions are implemented in non-interpretive languages addressed through the XML parser.

At the HIT Lab NZ we have developed ComposAR [16], which is unique among AR authoring tools in its ability to support different levels of interaction. We followed an approach similar to the earlier work of Hampshire et al. [17], extending the notion of a fiducial marker into that of a sensor. The intermediate level of the system implements an Action-Reaction mechanism imitating a Newtonian physics paradigm. To distinguish the different levels where input and output are connected, we describe the chain of events through Sensors, Triggers and Actions.
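This Sensor → Trigger → Action chain can be sketched as a minimal event pipeline. The class names follow the paper's terminology, but the implementation below is our own illustrative reduction, not ComposAR code: a Trigger interprets the raw sensor stream and decides whether to invoke an Action, here using marker visibility as the example.

```python
class Sensor:
    """Raw live data stream (here: whether a fiducial marker is seen)."""
    def __init__(self):
        self.visible = False
    def read(self):
        return self.visible

class Trigger:
    """Interprets sensor data and decides whether to invoke an Action."""
    def __init__(self, sensor, action):
        self.sensor, self.action, self.last = sensor, action, None
    def update(self):
        value = self.sensor.read()
        if value != self.last:  # fire only when visibility changes
            self.last = value
            self.action(value)

log = []
marker = Sensor()
trigger = Trigger(marker, lambda seen: log.append("shown" if seen else "hidden"))

marker.visible = True
trigger.update()   # marker appeared -> action fires
trigger.update()   # no change -> nothing happens
marker.visible = False
trigger.update()   # marker occluded -> action fires again
```

The separation matters for ambient use: the same Trigger logic works whether the Sensor wraps a camera, a keyboard, an RSS feed, or a temperature probe.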
Sensors provide a raw live data stream into the authoring environment. All physical devices, including keyboards, mice and other conventional input devices, are sensors. The data provided by sensors is elevated to the status of 'information' only once it is interpreted by a Trigger, which evaluates the input and decides whether or not to invoke an Action. An example of this process is the monitoring of the visibility of a marker. Currently ComposAR provides some basic interaction methods based on a standard repertoire common in AR applications, including interaction based on fiducial proximity, occlusion, tilting and shaking. Through this very rough abstraction ComposAR provides a convenient way to create a broad variety of interfaces, including ambient displays or simulations. As ambient interfaces interact with low-level data, this methodology allows us to quickly create demonstrations which react to data from an RSS feed or the tilt sensor in a desktop computer. As this approach is net-transparent, displays and sensors can be meshed together.

4.2 Hardware Authoring

In addition to software authoring tools, Ambient AR interfaces need hardware tools for rapid prototyping. The design and development of Tangible AR interfaces has demonstrated the need for tools that can easily integrate physical actuators and sensors into Ambient AR applications. By combining a Tangible Interface with intelligent sensors, users can benefit from a new range of design possibilities such as kinetic movement, skin sensitivity, and sustainable power sources. In the past it was difficult to explore such designs because of the high level of hardware skills required and the difficulty of software integration with this technology. However, more affordable and intuitive solutions have emerged.
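One of the simplest such solutions is a microcontroller board that streams its sensor readings to the host PC as plain text. The line format below ("name:value") is a hypothetical convention chosen for illustration; real firmware defines its own protocol, and the lines would normally arrive through a serial library rather than a list.

```python
def parse_sensor_line(line):
    """Parse one 'name:value' reading streamed from a microcontroller.

    The line format is an assumed convention for this sketch;
    real firmware defines its own protocol.
    """
    name, _, raw = line.strip().partition(":")
    try:
        return name, float(raw)
    except ValueError:
        return None  # ignore malformed or partial lines

# With real hardware these lines would come from a serial port connection.
stream = ["temp:21.5", "light:340", "garbled??", "temp:21.7"]
readings = [r for r in map(parse_sensor_line, stream) if r is not None]
```

Tolerating malformed lines matters in practice, since serial streams are routinely read mid-line when a program attaches to a board that is already running.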
Simple programmable microcontroller boards (like the Arduino [18]) are available and can be remotely read or controlled from a standard PC. Similarly, USB or wireless components have also become simpler and easier to integrate into an electronic interface. In this context, the research community has pursued the goal of creating physical computing elements that can be easily integrated into the user interface. In this section we present a few examples from this category.

The Phidgets toolkit [19] combines a set of different independent sensors that are interfaced through a USB port. A low-level software interface allows users to connect them at run-time and access them through a .NET interface. The CALDER toolkit [20] introduces a similar approach, but also adds wireless components. The iStuff framework [21] facilitates the support of distributed sensors and actuators through a transparent network interface and an API based on system events. Papier-Mache [22] supports RFID in an event-based toolkit integrated with other technologies like computer vision tracking. It also offers high-level tools for the developer, through a visual debugging interface and monitoring tools.

Most of these toolkits are designed for developers with good programming skills and a good understanding of hardware, and so are inaccessible to many potential users. Furthermore, few of these toolkits are integrated with more general libraries for developing Augmented Reality applications. One way to make these tools more accessible is by using a visual programming interface. For example, in the Equator project, ECT has a visual interface for building applications that support a large range of physical sensors [23]. Support for AR applications has recently been added to this library (see [17]). More recently, we have also added support in ComposAR [16] for physical input devices [24].
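A common thread in these toolkits is hiding device-specific code behind a uniform interface. The sketch below is our own illustrative reduction of that idea (the class, device names and stub backends are invented, not taken from any of the toolkits above): clients ask for a named device and a server routes the request to whichever backend registered it, so application code never depends on which board a sensor lives on.

```python
class HardwareServer:
    """Routes client requests to whichever hardware backend owns a device
    (illustrative sketch of the toolkit-abstraction idea)."""
    def __init__(self):
        self._backends = {}

    def register(self, device_name, read_fn):
        # read_fn wraps a toolkit-specific call (USB sensor, networked board, ...).
        self._backends[device_name] = read_fn

    def read(self, device_name):
        return self._backends[device_name]()

server = HardwareServer()
# Stub backends standing in for real device drivers.
server.register("room_temp", lambda: 21.5)   # e.g. a USB temperature sensor
server.register("fan_speed", lambda: 0.8)    # e.g. an actuator's reported state
value = server.read("room_temp")
```

This is also essentially the client-server arrangement that lets displays and sensors be "meshed together" over a network: the registry can live in one process while clients query it remotely.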
We have been developing a generic hardware toolkit supporting a large range of physical actuators and sensors. Our toolkit, Pandora's Box, is a multi-language library which uses a client-server approach to access different types of hardware toolkits (e.g. Arduino, our in-house toolkit, T3G, etc.). This tool is being integrated into the ComposAR software toolkit, and by doing so we are creating an all-in-one solution for easy development of physical, ambient and visually augmented interfaces.

Nonetheless, a number of research issues remain. Providing a transparent interface for multiple hardware devices is still challenging and requires further development and testing. Standardization, and a more generic interface for electronic boards and sensors, will help with this issue. Research also needs to be conducted on the representation and interaction techniques for giving the user access to these sensors. Issues include: How can sensors be visually represented? How can they be easily configured? How can sensors be combined and high-level information provided to the end-user in a relevant way? Initial work has been conducted in this area, for example with flow control diagrams, but their generic nature makes them difficult to use for novice programmers or end-users.

5 Design and Evaluation of Ambient AR Systems

There is still very little knowledge regarding the proper design and evaluation of AR systems [25, 26], and few general design guidelines for AR interfaces exist. Most guidelines are rather specific suggestions for the design challenges of individual systems. One reason for this is the huge variety of AR system implementations: these systems can be realized with different hardware, tracking technologies and software frameworks. Although we have definitions for what constitutes an AR system [5], these definitions are quite broad. Deriving common design principles is therefore a challenging task.
General HCI guidelines and user-centered design guidelines can serve as a starting point for the development of more general AR design guidelines [26]. However, these must be refined to meet the specific demands of AR systems. At this stage even less is known about suitable guidelines for the development of the systems discussed in this paper. Using design guidelines derived from AR systems research, combined with knowledge from TUI design, may be a good first approach to developing such systems. However, the whole is more than the sum of its parts, so we argue that Ambient AR systems with Tangible Interfaces require a separate paradigm with respect to proper system design.

There is also a need for more research on suitable evaluation techniques for Ambient AR interfaces. We found that only a few AR-related research publications include formal or informal user evaluations. In a recent survey we estimated that only 10% of AR articles published by ACM and IEEE between 1993 and 2007 included some user evaluation [25]. Excluding informal user evaluations (evaluations which did not follow a rigorous evaluation program), the percentage is around 8%, which is similar to the findings reported by Swan et al. [27]. This relative lack of user evaluations in AR research could be due to a lack of education on how to evaluate such experiences. Again, this is even more the case for Ambient AR with Tangible Interfaces. Here it is also worthwhile to collect knowledge gathered in other disciplines and combine it with the specific demands of these systems. More general evaluation techniques and approaches used in HCI can be readily applied: in the early design stages of novel interfaces, exploratory evaluation techniques can be applied with the aim of uncovering issues that need further investigation, and in later stages these issues can be studied using more rigorous approaches.
6 Conclusion

In this paper we have described the issues arising from combining Augmented Reality with Ambient and Tangible User Interfaces. The framework introduced here is an initial step, to be evaluated further through the development of new Ambient AR interfaces. The paper also introduced novel methods of handling interaction within AR authoring tools which are valuable for new Ambient AR interfaces. Finally, we described some issues related to the empirical evaluation of these new types of interfaces, and the challenges in this area.

References

1. Weiser, M.: The computer for the 21st century. Scientific American 265, 94–104 (1991)
2. Norman, D.A.: The invisible computer. MIT Press, Cambridge (1998)
3. Dourish, P.: Embodied Interaction: Exploring the Foundations of a New Approach (1999)
4. Aarts, E., Harwig, R., Schuurmans, M.: Ambient intelligence. In: The Invisible Future: The Seamless Integration of Technology into Everyday Life. McGraw-Hill, New York (2001)
5. Azuma, R.T.: A Survey of Augmented Reality. Presence: Teleoperators and Virtual Environments 6, 355–385 (1997)
6. Rauhala, M., Gunnarsson, A.-S., Henrysson, A.: A Novel Interface to Sensor Networks using Handheld Augmented Reality. In: MobileHCI 2006, pp. 145–148. ACM, Helsinki (2006)
7. Ishii, H., Ullmer, B.: Tangible bits: towards seamless interfaces between people, bits and atoms. In: CHI 1997, pp. 234–241. ACM, New York (1997)
8. Shneiderman, B.: The future of interactive systems and the emergence of direct manipulation. Behaviour and Information Technology 1, 237–256 (1982)
9. Hutchins, E.L., Hollan, J.D., Norman, D.A.: Direct manipulation interfaces. Human-Computer Interaction 1, 311–338 (1985)
10. Frazer, J.H., Frazer, J.M., Frazer, P.A.: Intelligent physical three-dimensional modeling systems. In: Computer Graphics Conference, vol. 80, pp. 359–370 (1980)
11. Gorbet, M.G., Orth, M., Ishii, H.: Triangles: tangible interface for manipulation and exploration of digital information topography. In: CHI 1998, pp. 49–56. ACM, New York (1998)
12. Kato, H., Billinghurst, M., Poupyrev, I., Tetsutani, N., Tachibana, K.: Tangible Augmented Reality for Human Computer Interaction. Nicograph, Nagoya, Japan (2001)
13. MacIntyre, B., Gandy, M., Dow, S., Bolter, J.D.: DART: a toolkit for rapid design exploration of augmented reality experiences. In: Proceedings of the 17th Annual ACM Symposium on User Interface Software and Technology. ACM, Santa Fe (2004)
14. Owen, C., Tang, A., Xiao, F.: ImageTclAR: A Blended Script and Compiled Code Development System for Augmented Reality (2003)
15. Ledermann, F., Schmalstieg, D.: APRIL: A High-level Framework for Creating Augmented Reality Presentations. In: Proceedings of IEEE Virtual Reality 2005 (VR 2005), pp. 187–194 (2005)
16. Seichter, H., Looser, J., Billinghurst, M.: ComposAR: An Intuitive Tool for Authoring AR Applications. In: Saito, M.A.L., Oliver, B. (eds.) International Symposium on Mixed and Augmented Reality (ISMAR 2008), pp. 177–178. IEEE, Cambridge (2008)
17. Hampshire, A., Seichter, H., Grasset, R., Billinghurst, M.: Augmented Reality Authoring: Generic Context from Programmer to Designer. In: Australasian Computer-Human Interaction Conference (OZCHI 2006) (2006)
18. Arduino, http://www.arduino.cc
19. Greenberg, S., Fitchett, C.: Phidgets: easy development of physical interfaces through physical widgets. In: UIST 2001: Proceedings of the 14th Annual ACM Symposium on User Interface Software and Technology, pp. 209–218. ACM, New York (2001)
20. Lee, J.C., Avrahami, D., Hudson, S.E., Forlizzi, J., Dietz, P.H., Leigh, D.: The CALDER toolkit: wired and wireless components for rapidly prototyping interactive devices. In: DIS 2004: Proceedings of the 5th Conference on Designing Interactive Systems, pp. 167–175. ACM, New York (2004)
21. Ballagas, R., Ringel, M., Stone, M., Borchers, J.: iStuff: a physical user interface toolkit for ubiquitous computing environments. In: CHI 2003: Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, pp. 537–544. ACM, New York (2003)
22. Klemmer, S.R., Li, J., Lin, J., Landay, J.A.: Papier-Mache: toolkit support for tangible input. In: CHI 2004: Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, pp. 399–406. ACM, New York (2004)
23. Greenhalgh, C.I.S., Taylor, I.: ECT: A Toolkit to Support Rapid Construction of Ubicomp Environments. In: UbiComp 2004 (2004)
24. Hong, D., Looser, J., Seichter, H., Billinghurst, M., Woo, W.: A Sensor-based Interaction for Ubiquitous Virtual Reality Systems. In: ISUVR 2008, Korea, pp. 75–78 (2008)
25. Dünser, A., Grasset, R., Billinghurst, M.: A Survey of Evaluation Techniques Used in Augmented Reality Studies. Technical Report TR-2008-02, HIT Lab NZ (2008)
26. Dünser, A., Grasset, R., Seichter, H., Billinghurst, M.: Applying HCI principles to AR systems design. In: MRUI 2007: 2nd International Workshop at the IEEE Virtual Reality 2007 Conference, Charlotte, North Carolina, USA (2007)
27. Swan, J.E., Gabbard, J.L.: Survey of User-Based Experimentation in Augmented Reality. In: 1st International Conference on Virtual Reality, Las Vegas, Nevada (2005)