DOI: 10.1145/2857491.2857514

Wrist-worn pervasive gaze interaction

Published: 14 March 2016

Abstract

This paper addresses gaze interaction for smart home control, conducted from a wrist-worn unit. First, we asked ten people to enact the gaze movements they would propose for tasks such as opening a door or adjusting the room temperature. On the basis of their suggestions, we built and tested several versions of a prototype applying off-screen stroke input. Command prompts were given to twenty participants via text or arrow displays. The success rate achieved by the end of their first encounter with the system averaged 46%; participants took 1.28 seconds to connect with the system and 1.29 seconds to make a correct selection. Their subjective evaluations were positive with regard to the speed of the interaction. We conclude that gaze gesture input seems feasible for fast and brief remote control of smart home technology, provided that the robustness of tracking is improved.
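To make the interaction concrete, here is a minimal Python sketch of the off-screen stroke idea: gaze coordinates are streamed in a normalized display frame, and a command fires once a stroke that began on the display exits past an edge. Everything here is an illustrative assumption rather than the paper's actual prototype: the names classify_offscreen_stroke and COMMANDS, the 0..1 coordinate convention, and the margin threshold are all hypothetical.

from typing import Iterable, Optional, Tuple

# Hypothetical mapping from stroke direction to a smart-home command.
COMMANDS = {
    "up": "raise_temperature",
    "down": "lower_temperature",
    "left": "lock_door",
    "right": "open_door",
}

def classify_offscreen_stroke(
    gaze_points: Iterable[Tuple[float, float]],
    margin: float = 0.15,
) -> Optional[str]:
    """Classify a gaze trace as an off-screen stroke.

    Coordinates are normalized so the display spans [0, 1] x [0, 1],
    with y growing downward (screen convention). A stroke counts only
    if the gaze first lands on the display and then exits past an edge
    by more than `margin`, which filters out ambient glances.
    """
    touched_display = False
    for x, y in gaze_points:
        if 0.0 <= x <= 1.0 and 0.0 <= y <= 1.0:
            touched_display = True          # gaze connected with the display
            continue
        if not touched_display:
            continue                        # ignore strokes that never touched the screen
        if x < -margin:
            return COMMANDS["left"]
        if x > 1.0 + margin:
            return COMMANDS["right"]
        if y < -margin:
            return COMMANDS["up"]
        if y > 1.0 + margin:
            return COMMANDS["down"]
    return None

# A rightward stroke: starts on the display, exits past the right edge.
trace = [(0.5, 0.5), (0.9, 0.5), (1.1, 0.5), (1.3, 0.5)]
print(classify_offscreen_stroke(trace))  # -> "open_door"

Requiring the gaze to land on the display before an exit counts is one plausible way to separate deliberate strokes from incidental glances, in the spirit of the brief connect-then-select sequence timed in the study.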

Supplementary Material

MP4 File (p57-hansen.mp4)




Published In

ETRA '16: Proceedings of the Ninth Biennial ACM Symposium on Eye Tracking Research & Applications
March 2016
378 pages
ISBN:9781450341257
DOI:10.1145/2857491
Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected]


Publisher

Association for Computing Machinery

New York, NY, United States


Badges

  • Best Paper

Author Tags

  1. gaze tracking
  2. hands-free interfaces
  3. input
  4. mobility
  5. pervasive technology
  6. security and access systems
  7. smart home
  8. smartwatch
  9. ubiquitous computing

Qualifiers

  • Research-article

Conference

ETRA '16: 2016 Symposium on Eye Tracking Research and Applications
March 14-17, 2016
Charleston, South Carolina, USA

Acceptance Rates

Overall acceptance rate: 69 of 137 submissions (50%)



Cited By

  • (2024) Guiding gaze gestures on smartwatches. International Journal of Human-Computer Studies 183:C. DOI: 10.1016/j.ijhcs.2023.103196
  • (2023) An End-to-End Review of Gaze Estimation and its Interactive Applications on Handheld Mobile Devices. ACM Computing Surveys 56:2, 1-38. DOI: 10.1145/3606947
  • (2023) Universal Design of Gaze Interactive Applications for People with Special Needs. Proceedings of the 2023 Symposium on Eye Tracking Research and Applications, 1-7. DOI: 10.1145/3588015.3589666
  • (2023) Study on the brightness and graphical display object directions of the Single-Gaze-Gesture user interface. Displays 80, 102537. DOI: 10.1016/j.displa.2023.102537
  • (2022) GazeDock: Gaze-Only Menu Selection in Virtual Reality using Auto-Triggering Peripheral Menu. 2022 IEEE Conference on Virtual Reality and 3D User Interfaces (VR), 832-842. DOI: 10.1109/VR51125.2022.00105
  • (2022) "I Gave up Wearing Rings:" Insights on the Perceptions and Preferences of Wheelchair Users for Interactions With Wearables. IEEE Pervasive Computing 21:3, 92-101. DOI: 10.1109/MPRV.2022.3155952
  • (2021) Exploring Social Acceptability and Users' Preferences of Head- and Eye-Based Interaction with Mobile Devices. Proceedings of the 20th International Conference on Mobile and Ubiquitous Multimedia, 12-23. DOI: 10.1145/3490632.3490636
  • (2019) Inducing gaze gestures by static illustrations. Proceedings of the 11th ACM Symposium on Eye Tracking Research & Applications, 1-5. DOI: 10.1145/3317956.3318151
  • (2019) Gaze-based interactions in the cockpit of the future: a survey. Journal on Multimodal User Interfaces 14:1, 25-48. DOI: 10.1007/s12193-019-00309-8
  • (2018) A gaze interactive assembly instruction with pupillometric recording. Behavior Research Methods 50:4, 1723-1733. DOI: 10.3758/s13428-018-1074-z
