DOI: 10.1145/3365610.3365626
DialPlates: enabling pursuits-based user interfaces with large target numbers

Published: 26 November 2019

Abstract

    In this paper, we introduce a novel approach to smooth pursuits eye movement detection and demonstrate that it allows up to 160 targets to be distinguished. With this work we advance the well-established smooth pursuits technique, which allows gaze interaction without calibration. The approach is valuable for researchers and practitioners, since it enables novel user interfaces and applications that employ a large number of targets, for example a pursuits-based keyboard or a smart home in which many different objects can be controlled by gaze. We present findings from two studies. In particular, we compare our novel detection algorithm, which is based on linear regression, with the correlation method. We quantify its accuracy for around 20 targets on a single circle and for up to 160 targets on multiple circles. Finally, we implemented a pursuits-based keyboard app with 108 targets as a proof of concept.
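The paper's exact formulation is not reproduced in this excerpt, but the two detector families it compares can be sketched as follows: the classic Pursuits correlation method scores each on-screen target by the Pearson correlation between the gaze trace and that target's trajectory over a sliding window, while a regression-family detector fits the gaze coordinates as a linear function of the target coordinates and scores the residual of the fit. All function names, the residual-based scoring, and the selection threshold below are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def correlation_score(gaze, target):
    # Classic Pursuits-style detector: Pearson correlation between
    # gaze and target coordinates, computed per axis over the window.
    # The gaze must follow the target on both axes, so take the
    # weaker of the two correlations as the score.
    rx = np.corrcoef(gaze[:, 0], target[:, 0])[0, 1]
    ry = np.corrcoef(gaze[:, 1], target[:, 1])[0, 1]
    return min(rx, ry)

def regression_error(gaze, target):
    # Regression-family detector (illustrative): fit
    # gaze = a * target + b per axis by least squares and score by
    # the RMS residual; a lower error means the gaze trace is better
    # explained as a linear transform of this target's trajectory.
    errors = []
    for axis in (0, 1):
        a, b = np.polyfit(target[:, axis], gaze[:, axis], 1)
        residual = gaze[:, axis] - (a * target[:, axis] + b)
        errors.append(np.sqrt(np.mean(residual ** 2)))
    return max(errors)

def select_target(gaze, trajectories, corr_threshold=0.8):
    # Pick the candidate trajectory that best matches the gaze
    # window; return None if no target correlates strongly enough.
    scores = [correlation_score(gaze, t) for t in trajectories]
    best = int(np.argmax(scores))
    return best if scores[best] >= corr_threshold else None
```

In practice a detector of either family would run these scores over a sliding window of recent gaze samples and require the winning score to persist for several frames before committing to a selection, which is what keeps false activations low as the number of simultaneously moving targets grows.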



      Published In

      MUM '19: Proceedings of the 18th International Conference on Mobile and Ubiquitous Multimedia
      November 2019
      462 pages
      ISBN:9781450376242
      DOI:10.1145/3365610

      Publisher

      Association for Computing Machinery

      New York, NY, United States


      Author Tags

      1. distinguishable pursuit targets
      2. gaze-only text entry
      3. linear regression
      4. smooth pursuit detection

      Qualifiers

      • Research-article

      Conference

      MUM 2019

      Acceptance Rates

      Overall Acceptance Rate 190 of 465 submissions, 41%

