Abstract
Larger tablet computers are not always easy to use in handheld configurations. Gaze input, and especially gaze gestures, provide an alternative input technology in such situations. We investigated task performance and user experience when gaze gestures were used with haptic feedback delivered either to the fingers touching the tablet or behind the ears through the eyeglass frame. The participants' task was to look at a display and complete simple two-stroke gaze gestures with one, two, or three repetitions. The results showed that the participants found feedback at both body locations equally pleasant and preferred haptic feedback to no feedback. The participants also favored feedback that was spatially congruent with the gaze movement.
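To make the task concrete, the sketch below illustrates how a two-stroke gaze gesture (a glance away from a start region and back) could be detected and confirmed with a haptic pulse. This is only a minimal illustration under stated assumptions; the class name, thresholds, timeout, and feedback callback are hypothetical and are not taken from the paper, which does not describe its implementation at this level.

```python
# Minimal sketch of a two-stroke gaze-gesture detector with a haptic trigger.
# All names, thresholds, and the feedback hook are assumptions for illustration only.

from dataclasses import dataclass


@dataclass
class GazeSample:
    x: float  # horizontal gaze coordinate on the display (pixels)
    y: float  # vertical gaze coordinate on the display (pixels)
    t: float  # timestamp in seconds


class TwoStrokeGestureDetector:
    """Detects a gaze stroke away from a start region and a stroke back."""

    def __init__(self, start_x: float, stroke_dx: float = 300.0,
                 timeout: float = 1.5, on_stroke=None):
        self.start_x = start_x      # horizontal centre of the start region
        self.stroke_dx = stroke_dx  # assumed distance (px) counted as a stroke
        self.timeout = timeout      # assumed max duration (s) for the return stroke
        self.on_stroke = on_stroke  # callback, e.g. a vibrotactile actuator driver
        self._away_since = None     # time when gaze left the start region

    def feed(self, sample: GazeSample) -> bool:
        """Feed one gaze sample; return True when a full two-stroke gesture ends."""
        offset = sample.x - self.start_x
        if self._away_since is None:
            if abs(offset) >= self.stroke_dx:       # first stroke: gaze moved away
                self._away_since = sample.t
                if self.on_stroke:
                    self.on_stroke("out")           # e.g. pulse the actuator
        else:
            if sample.t - self._away_since > self.timeout:
                self._away_since = None             # too slow: reset the gesture
            elif abs(offset) < self.stroke_dx / 2:  # second stroke: gaze returned
                self._away_since = None
                if self.on_stroke:
                    self.on_stroke("back")          # e.g. confirmation pulse
                return True
        return False
```

In such a setup the `on_stroke` callback would be the point where feedback is routed either to actuators under the fingers holding the tablet or to actuators in the eyeglass frame behind the ears.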
Acknowledgments
This work was funded by the Academy of Finland, projects Haptic Gaze Interaction (decision numbers 260026 and 260179) and Mind Picture Image (decision number 266285).
Copyright information
© 2016 Springer International Publishing Switzerland
About this paper
Cite this paper
Kangas, J., Rantala, J., Akkil, D., Isokoski, P., Majaranta, P., Raisamo, R. (2016). Both Fingers and Head are Acceptable in Sensing Tactile Feedback of Gaze Gestures. In: Bello, F., Kajimoto, H., Visell, Y. (eds) Haptics: Perception, Devices, Control, and Applications. EuroHaptics 2016. Lecture Notes in Computer Science, vol 9775. Springer, Cham. https://doi.org/10.1007/978-3-319-42324-1_10
DOI: https://doi.org/10.1007/978-3-319-42324-1_10
Publisher Name: Springer, Cham
Print ISBN: 978-3-319-42323-4
Online ISBN: 978-3-319-42324-1
eBook Packages: Computer Science (R0)