Abstract
This paper presents a framework for controlling remote applications through personalized multi-touch interfaces. The framework allows end-users to fully personalize the mapping between gestures and input commands. A two-tier architecture has been developed: a formal description of the original interface is automatically generated at the server side to identify the set of actions available for controlling existing applications, while the client loads the description of the target application and lets the user shape the preferred mapping between gestures and actions. Finally, the server converts the identified actions into one or more commands understandable by the original computer interface. The implementation presented in this work specifically targets handheld multi-touch devices. Test results are encouraging from both an objective and a subjective point of view; indeed, the designed framework outperformed a traditional GUI in terms of both the number of actions needed to complete a task and the average completion time.
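The two-tier idea summarized above can be sketched in a few lines of code: the client keeps a user-defined gesture-to-action table built from the interface description, and the server translates each abstract action into concrete commands for the original application. The sketch below is purely illustrative; all names (GestureMapper, ActionServer, Action) are hypothetical and are not an API defined by the paper.

```python
# Minimal, hypothetical sketch of the two-tier architecture described in the
# abstract: the client binds multi-touch gestures to abstract actions, the
# server converts actions into commands for the controlled application.
from dataclasses import dataclass


@dataclass
class Action:
    """An abstract action exposed by the server-side interface description."""
    name: str
    target_widget: str


class GestureMapper:
    """Client side: lets the user bind gestures to the available actions."""

    def __init__(self, available_actions):
        self.available = {a.name: a for a in available_actions}
        self.bindings = {}  # gesture name -> Action

    def bind(self, gesture: str, action_name: str) -> None:
        self.bindings[gesture] = self.available[action_name]

    def resolve(self, gesture: str) -> Action:
        return self.bindings[gesture]


class ActionServer:
    """Server side: converts an abstract action into concrete input commands."""

    def execute(self, action: Action) -> list:
        # In a real system these would be synthesized input events delivered
        # to the original GUI; here they are just returned as strings.
        return [f"focus {action.target_widget}", f"invoke {action.name}"]


if __name__ == "__main__":
    # Actions extracted from the original GUI (illustrative values only).
    actions = [Action("play", "btnPlay"), Action("volume_up", "sldVolume")]

    mapper = GestureMapper(actions)
    mapper.bind("double_tap", "play")                # user-defined bindings
    mapper.bind("two_finger_swipe_up", "volume_up")

    server = ActionServer()
    print(server.execute(mapper.resolve("double_tap")))
```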
Copyright information
© 2012 ICST Institute for Computer Science, Social Informatics and Telecommunications Engineering
About this paper
Cite this paper
Paravati, G., Donna Bianco, M., Sanna, A., Lamberti, F. (2012). A Multi-touch Solution to Build Personalized Interfaces for the Control of Remote Applications. In: Alvarez, F., Costa, C. (eds) User Centric Media. UCMEDIA 2010. Lecture Notes of the Institute for Computer Sciences, Social Informatics and Telecommunications Engineering, vol 60. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-35145-7_2
DOI: https://doi.org/10.1007/978-3-642-35145-7_2
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-642-35144-0
Online ISBN: 978-3-642-35145-7