DOI: 10.1145/1983302.1983304

Mobile gaze-based screen interaction in 3D environments

Published: 26 May 2011

Abstract

Head-mounted eye trackers can be used both for mobile interaction and for gaze estimation. This paper presents a method that enables a user to interact with any planar digital display in a 3D environment using a head-mounted eye tracker. We also present an effective method for identifying the screens in the user's field of view, applicable to the general scenario in which multiple users interact with multiple screens. As a particular application, the technique is implemented in a home environment with two large screens and a mobile phone, where a user was able to interact with these screens through a wireless head-mounted eye tracker.
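Although this page does not reproduce the implementation, interaction with a planar display from a head-mounted tracker classically reduces to a plane-to-plane mapping: once the four corners of a screen are located in the scene-camera image (for example via fiducial markers), a gaze point estimated in scene-camera coordinates can be transferred into screen coordinates with a homography. The Python/OpenCV sketch below illustrates only that mapping step; the corner coordinates, the 1920x1080 display resolution, and the gaze_to_screen helper are hypothetical assumptions for illustration, not the authors' code.

import cv2
import numpy as np

# Hypothetical screen corners detected in the scene-camera image, in pixels,
# ordered top-left, top-right, bottom-right, bottom-left.
corners_scene = np.array([[212, 148], [581, 139], [594, 372], [201, 385]],
                         dtype=np.float32)

# The same corners in screen coordinates for an assumed 1920x1080 display.
corners_screen = np.array([[0, 0], [1920, 0], [1920, 1080], [0, 1080]],
                          dtype=np.float32)

# Plane-to-plane homography from scene-camera pixels to screen pixels.
H, _ = cv2.findHomography(corners_scene, corners_screen)

def gaze_to_screen(gaze_xy):
    """Map a gaze estimate from scene-camera pixels to screen pixels."""
    pt = np.array([[gaze_xy]], dtype=np.float32)  # shape (1, 1, 2) as expected
    mapped = cv2.perspectiveTransform(pt, H)
    return tuple(mapped[0, 0])

# A gaze sample inside the detected screen region maps to a screen position.
print(gaze_to_screen((400.0, 260.0)))

A mapped point falling outside the display bounds indicates the user is looking away from that screen, which also suggests one way such a system could decide, per fixation, which of several tracked screens (if any) is currently being attended to.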




Published In

NGCA '11: Proceedings of the 1st Conference on Novel Gaze-Controlled Applications
May 2011
71 pages
ISBN:9781450306805
DOI:10.1145/1983302
Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from permissions@acm.org

Sponsors

  • School of Computing, Blekinge Institute of Technology (BTH)
  • Tobii

Publisher

Association for Computing Machinery

New York, NY, United States

Publication History

Published: 26 May 2011


Author Tags

  1. domotics
  2. gaze-based interaction
  3. head-mounted eye tracker
  4. screen interaction

Qualifiers

  • Research-article

Conference

NGCA '11
Sponsor:
  • School of Computing, BTH



Article Metrics

  • Downloads (Last 12 months)14
  • Downloads (Last 6 weeks)1
Reflects downloads up to 03 Feb 2025

Cited By

  • (2025) The fundamentals of eye tracking part 4: Tools for conducting an eye tracking study. Behavior Research Methods 57(1). DOI: 10.3758/s13428-024-02529-7 (online 6-Jan-2025)
  • (2024) GazePointAR: A Context-Aware Multimodal Voice Assistant for Pronoun Disambiguation in Wearable Augmented Reality. Proceedings of the 2024 CHI Conference on Human Factors in Computing Systems, 1-20. DOI: 10.1145/3613904.3642230 (online 11-May-2024)
  • (2022) Research on the Multi-Screen Connection Interaction Method Based on Regular Octagon K-Value Template Matching. Symmetry 14(8), 1528. DOI: 10.3390/sym14081528 (online 26-Jul-2022)
  • (2020) Eye Tracking for Target Acquisition in Sparse Visualizations. ACM Symposium on Eye Tracking Research and Applications, 1-5. DOI: 10.1145/3379156.3391834 (online 2-Jun-2020)
  • (2020) Enhancing Mobile Voice Assistants with WorldGaze. Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems, 1-10. DOI: 10.1145/3313831.3376479 (online 21-Apr-2020)
  • (2019) THE-3DI: Tracing head and eyes for 3D interactions. Multimedia Tools and Applications. DOI: 10.1007/s11042-019-08305-6 (online 30-Oct-2019)
  • (2018) An interactive classical VR concert featuring multiple views. Proceedings of the Second African Conference for Human Computer Interaction: Thriving Communities, 1-4. DOI: 10.1145/3283458.3283501 (online 3-Dec-2018)
  • (2018) 3D gaze estimation in the scene volume with a head-mounted eye tracker. Proceedings of the Workshop on Communication by Gaze Interaction, 1-9. DOI: 10.1145/3206343.3206351 (online 15-Jun-2018)
  • (2018) Anyorbit. Proceedings of the 2018 ACM Symposium on Eye Tracking Research & Applications, 1-5. DOI: 10.1145/3204493.3209579 (online 14-Jun-2018)
  • (2018) Anyorbit. Proceedings of the 2018 ACM Symposium on Eye Tracking Research & Applications, 1-5. DOI: 10.1145/3204493.3204555 (online 14-Jun-2018)
