Abstract
The vision of sensor-driven applications that adapt to the environment holds great promise, but it is difficult to turn these applications into reality because device and space heterogeneity is an obstacle to interoperability and to mutual understanding between the smart devices and spaces involved. Smart Spaces provide shared knowledge about physical domains, and they inherently enable cooperative and adaptable applications by keeping track of the semantic relations between objects in the environment. In this paper, the interplay between sensor-driven objects and Smart Spaces is investigated, and a device with a tangible interface demonstrates the potential of the smart-space-based and sensor-driven computing paradigm. The proposed device is named REGALS (Reconfigurable Gesture-based Actuator and Low Range Smartifier). We show how, starting from an interaction model proposed by Niezen, REGALS can reconfigure itself to support different functions such as Smart Space creation (also called environment smartification), interaction with heterogeneous devices, and handling of semantic connections between gestures, actions, devices, and objects. This reconfiguration ability is based on the context received from the Smart Space. The paper also shows how tagged objects and natural gestures are recognized to improve the user experience, and it reports a use case and a performance evaluation of REGALS' gesture classifier.
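
To give a concrete flavor of the triple-based knowledge that a Smart Space such as Smart-M3 holds, the following minimal sketch models semantic connections between gestures, actions, and devices as (subject, predicate, object) triples and resolves a recognized gesture to a target device. It is illustrative only: the prefixes, predicates, and device names are hypothetical and do not reflect the actual SOFIA/Smart-M3 ontology, and plain Python data structures stand in for the Smart-M3 Semantic Information Broker and its KP interface.

    # Illustrative sketch: semantic connections stored as (subject, predicate, object)
    # triples, mimicking the RDF content of a Smart Space. All names are hypothetical.
    smart_space = {
        ("regals:gesture/shake",   "ex:triggersAction", "ex:action/togglePower"),
        ("regals:gesture/tilt",    "ex:triggersAction", "ex:action/adjustVolume"),
        ("ex:action/togglePower",  "ex:targetsDevice",  "ex:device/lamp1"),
        ("ex:action/adjustVolume", "ex:targetsDevice",  "ex:device/radio1"),
    }

    def query(s=None, p=None, o=None):
        """Return all triples matching the pattern; None acts as a wildcard."""
        return [(ts, tp, to) for (ts, tp, to) in smart_space
                if (s is None or ts == s)
                and (p is None or tp == p)
                and (o is None or to == o)]

    # When the gesture classifier recognizes "shake", the device asks the space
    # which action the gesture is connected to and which device that action targets.
    for _, _, action in query(s="regals:gesture/shake", p="ex:triggersAction"):
        for _, _, device in query(s=action, p="ex:targetsDevice"):
            print(f"gesture 'shake' -> {action} on {device}")

In the paper's setting, such connections are shared knowledge: any device subscribed to the Smart Space can read them, which is what allows REGALS to reconfigure its behavior from context rather than from hard-coded mappings.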
Notes
This research is being developed within the framework of the SOFIA project of the European Joint Undertaking on Embedded Systems ARTEMIS (http://www.sofia-project.eu).
References
Weiser M (1999) The computer for the 21st century. SIGMOBILE Mob Comput Commun Rev 3(3):3–11
Monekosso DN, Remagnino P, Kuno Y (2009) Intelligent environments: methods, algorithms and applications. In: Monekosso D, Kuno Y (eds) Advanced information and knowledge processing, 1st edn. Springer, Berlin, p 211. http://www.springer.com/computer/ai/book/978-1-84800-345-3
Lopez T, Ranasinghe D, Patkai B, McFarlane D (2009) Taxonomy, technology and applications of smart objects. Inform Syst Front 1387(3326):1–20
Bartolini S, Roffia L, Salmon Cinotti T, Manzaroli D, Spadini F, D'Elia A, Vergari F, Zamagni G, Di Stefano L, Franchi A, Farella E, Zappi P, Costanzo A, Montanari E (2010) Creazione automatica di ambienti intelligenti [Automatic creation of smart environments]. Patent Pending, March 2010, BO201A000117
Schmidt A (2000) Implicit human computer interaction through context. Pers Ubiquit Comput 4(2):191–199
Ryan N (2005) Smart environments for cultural heritage. In: Takao UNO (ed) Reading historical spatial information from around the world: studies of culture and civilization based on geographic information systems data
Salmeri A, Licciardi CA, Lamorte L, Valla M, Giannantonio R, Sgroi M (2009) An architecture to combine context awareness and body sensor networks for health care applications. International Conference on Smart Homes and Health Telematics. Springer, Berlin, pp 90–97
Dey A, Salber D, Abowd G (2001) A conceptual framework and a toolkit for supporting the rapid prototyping of context-aware applications. Hum Compt Interact 16(2, 3 and 4):97–166
Honkola J, Laine H, Brown R, Oliver I (2009) Cross-domain interoperability: a case study. In: International conference on smart spaces and next generation wired/wireless networking and 2nd conference on smart spaces (NEW2AN 2009 and ruSMART 2009), pp 22–31
Schilit WN (1995) A system architecture for context-aware mobile computing. PhD thesis, Columbia University
Overview of SGML resources: http://www.w3.org/MarkUp/SGML
Ranganathan A, Campbell RH (2003) A middleware for context-aware agents in ubiquitous computing environments. In: Endler M, Schmidt DC (eds) Middleware, vol 2672 of lecture notes in computer science, pp 143–161
Gruber TR (1993) A translation approach to portable ontology specifications. Knowl Acquisition 5(2):199–220
Smart-M3 public source code: http://sourceforge.net/projects/smart-m3/
Mantyjarvi J, Paternò F, Salvador Z, Santoro C (2006) Scan and tilt: towards natural interaction for mobile museum guides. In: Conference on human-computer interaction with mobile devices and services, pp 191–194
Pering T, Ballagas R, Want R (2005) Spontaneous marriages of mobile devices and interactive spaces. Commun ACM 48(9):53–59
Kranz M, Holleis P, Schmidt A (2010) Embedded interaction: interacting with the internet of things. IEEE Internet Comput 14(2):46–53
Pan G, Wu J, Zhang D, Wu Z, Yang Y, Li S (2010) GeeAir: a universal multimodal remote control device for home appliances. Pers Ubiquit Comput 14(8):723–735
Niezen G, Van der Vlist B, Hu J, Feijs L (2010) From events to goals: supporting semantic interaction in smart environments. In: The IEEE symposium on computers and communications, pp 1029–1034
Franchi A, Di Stefano L, Cinotti TS (2010) Mobile visual search using Smart-M3. In: The IEEE symposium on computers and communications, pp 1065–1070
Van der Vlist B, Niezen G, Hu J, Feijs L (2010) Semantic connections: exploring and manipulating connections in smart spaces. In: The IEEE symposium on computers and communications, pp 1–4
Vergari F, Bartolini S, Spadini F, D'Elia A, Zamagni G, Roffia L, Cinotti TS (2010) A smart space application to dynamically relate medical and environmental information. In: Design Automation & Test in Europe (DATE 2010), pp 1542–1547
Horrocks I, Kutz O, Sattler U (2006) The even more irresistible SROIQ. In: International conference of knowledge representation and reasoning, pp 57–67
http://www.gumstix.com/store/catalog/product_info.php?products_id=210
Kim L, Cho H, Park SH, Han M (2007) A tangible user interface with multimodal feedback. In: International conference on human-computer interaction, pp 94–103
Bailador G, Roggen D, Tröster G, Trivino G (2007) Real time gesture recognition using continuous time recurrent neural networks. In: International conference on body area networks (BodyNets), Article No. 15
Kela J, Korpipää P, Mäntyjärvi J, Kallio S, Savino G, Jozzo L, Marca D (2006) Accelerometer-based gesture control for a design environment. Pers Ubiquit Comput 10(5):285–299
Mantyla V-M, Mantyjarvi J, Seppanen T, Tuulari E (2000) Hand gesture recognition of a mobile device user. IEEE Int Conf MultiMed Expo 1(c):281–284
Zappi P, Milosevic B, Farella E, Benini L (2009) Hidden Markov model based gesture recognition on low-cost, low-power tangible user interfaces. Entertain Comput 1(2):75–84
Hofmann FG, Heyer P, Hommel G (1997) Velocity profile based recognition of dynamic gestures with discrete Hidden Markov models. In: Gesture and sign language in human-computer interaction, international gesture workshop. Springer, Berlin, pp 81–95. http://www.springerlink.com/content/wju4v16208336502/about/
Chambers GS, Venkatesh S, West GA, Bui HH (2004) Segmentation of intentional human gestures for sports video annotation. In: International Multimedia Modelling Conference, pp 124–130
Amstutz R, Amft O, French B, Smailagic A, Siewiorek D, Tröster G (2009) Performance analysis of an HMM-based gesture recognition using a wristwatch device. Int Conf Comput Sci Eng 02:303–309
Milosevic B, Farella E, Benini L (2010) Continuous gesture recognition for resource constrained smart objects. In: Proceedings of the fourth international conference on mobile ubiquitous computing, systems, services and technologies, UBICOMM 2010. pp 391–396
Acknowledgments
The authors would like to thank Luca Faggianelli for his work and assistance in the realization of the REGALS prototype. This work was carried out within the framework of SOFIA (2009–2011), a project of the European Joint Undertaking on Embedded Systems ARTEMIS. The project is coordinated by NOKIA and co-funded by the EU and by national authorities, including MIUR, the Italian Central Authority for Education and Research.
About this article
Cite this article
Bartolini, S., Milosevic, B., D’Elia, A. et al. Reconfigurable natural interaction in smart environments: approach and prototype implementation. Pers Ubiquit Comput 16, 943–956 (2012). https://doi.org/10.1007/s00779-011-0454-5