
Guiding gaze gestures on smartwatches: Introducing fireworks

Published: 14 March 2024
    Abstract

    Smartwatches enable interaction anytime and anywhere, with both digital and augmented physical objects. However, situations with busy hands can prevent user input. To address this limitation, we propose Fireworks, a hands-free alternative that lets smartwatch users trigger commands through gaze gestures supported by post-activation guidance. Fireworks activates a command by guiding users to follow a target moving from the screen center to the edge, mimicking real-life fireworks. We present the experimental design and evaluation of two Fireworks instances. The first design employs temporal parallelization, displaying a few dynamic targets at once during microinteractions (e.g., snoozing a notification while cooking). The second design displays targets sequentially to support more commands (e.g., 20 commands), suited to scenarios beyond microinteractions (e.g., turning on lights in a smart home). Results show that Fireworks’ single straight gestures enable faster and more accurate command selection than state-of-the-art baselines, namely Orbits and Stroke. Additionally, participants expressed a clear preference for Fireworks’ original visual guidance.
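
    The abstract describes selecting a command by following one of several targets that move from the screen center to the edge. The sketch below is not from the paper; the window length, correlation threshold, sampling rate, and target paths are illustrative assumptions. It only shows how such pursuit-style selection can be decided by correlating the recent gaze trajectory with each target's trajectory, in the spirit of smooth-pursuit techniques such as Pursuits [44].

    # Minimal sketch (assumed parameters, not the authors' implementation):
    # pick the outward-moving target whose trajectory best correlates with
    # the recent gaze trajectory.
    import numpy as np

    def axis_corr(a, b):
        """Pearson correlation along one coordinate axis; 0 if either signal is constant."""
        if np.std(a) == 0 or np.std(b) == 0:
            return 0.0
        return float(np.corrcoef(a, b)[0, 1])

    def pursuit_score(gaze, target):
        """Mean per-axis correlation between two (N, 2) trajectories."""
        return (axis_corr(gaze[:, 0], target[:, 0]) + axis_corr(gaze[:, 1], target[:, 1])) / 2.0

    def select_command(gaze_window, target_windows, threshold=0.8):
        """Index of the moving target the gaze most plausibly follows, or None."""
        scores = [pursuit_score(gaze_window, t) for t in target_windows]
        best = int(np.argmax(scores))
        return best if scores[best] >= threshold else None

    # Example: four hypothetical commands launched from the screen center towards the edge.
    n = 30                                             # ~0.5 s of gaze samples at 60 Hz (assumed)
    t = np.linspace(0.0, 1.0, n)[:, None]
    directions = np.array([[1, 1], [1, -1], [-1, 1], [-1, -1]], dtype=float) / np.sqrt(2)
    targets = [t * d for d in directions]              # straight center-to-edge paths
    gaze = targets[2] + np.random.normal(0, 0.03, (n, 2))   # noisy pursuit of target 2
    print(select_command(gaze, targets))               # -> 2 (with high probability)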

    Highlights

    For guiding a few gestures, system-paced guidance is not necessary.
    Pursuits are better suited than saccades for Single Straight Gaze Gestures.
    Delay and visual feedforward to activate Fireworks do not impact performance.
    Fireworks can be used with long and short words, without constraining command names.
    A Rapid Serial Visual Presentation can lead to errors depending on the nature of the content (a scheduling sketch follows below).
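
    The second Fireworks design presents command labels one after another (a Rapid Serial Visual Presentation-like sequence) before each target departs towards the edge. The hypothetical scheduling sketch below illustrates this sequential presentation; the 400 ms onset asynchrony, the command names, and the two display cycles are assumptions for the example, not values reported in the paper.

    import itertools
    import time

    COMMANDS = ["Lights on", "Lights off", "Play music", "Snooze"]   # hypothetical command set
    SOA_S = 0.4   # stimulus onset asynchrony between successive labels (assumed value)

    def rsvp_schedule(commands, soa_s=SOA_S, cycles=2):
        """Yield (onset_time, label) pairs for a sequential, one-at-a-time presentation."""
        start = time.monotonic()
        sequence = itertools.islice(itertools.cycle(commands), cycles * len(commands))
        for i, label in enumerate(sequence):
            yield start + i * soa_s, label

    for onset, label in rsvp_schedule(COMMANDS):
        print(f"t = {onset:.2f}s: show '{label}', then launch its center-to-edge target")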

    References

    [1]
    Almoctar H., Irani P., Peysakhovich V., Hurter C., PathWord: A multimodal password entry method for ad-hoc authentication based on digits’ shape and smooth pursuit eye movements, in: Proceedings of the 20th ACM International Conference on Multimodal Interaction, in: ICMI ’18, Association for Computing Machinery, Boulder, CO, USA, 2018, pp. 268–277.
    [2]
    Bangor A., Kortum P., Miller J., Determining what individual SUS scores mean: Adding an adjective rating scale, J. Usability Stud. 4 (3) (2009) 114–123.
    [3]
    Cymek D.H., Venjakob A.C., Ruff S., Lutz O.H.-M., Hofmann S., Roetting M., Entering PIN codes by smooth pursuit eye movements, J. Eye Mov. Res. 7 (4) (2014),.
    [4]
    Delamare W., Coutrix C., Nigay L., Designing guiding systems for gesture-based interaction, in: Proceedings of the 7th ACM SIGCHI Symposium on Engineering Interactive Computing Systems, in: EICS ’15, Association for Computing Machinery, Duisburg, Germany, 2015, pp. 44–53,.
    [5]
    Delamare W., Han T., Irani P., Designing a gaze gesture guiding system, in: Proceedings of the 19th International Conference on Human-Computer Interaction with Mobile Devices and Services, in: MobileHCI ’17, Association for Computing Machinery, Vienna, Austria, 2017, pp. 1–13,.
    [6]
    Drewes H., Khamis M., Alt F., DialPlates: Enabling pursuits-based user interfaces with large target numbers, in: Proceedings of the 18th International Conference on Mobile and Ubiquitous Multimedia, in: MUM ’19, Association for Computing Machinery, Pisa, Italy, 2019, pp. 1–10,.
    [7]
    Drewes H., Schmidt A., Interacting with the computer using gaze gestures, in: Proceedings of the 11th IFIP TC 13 International Conference on Human-Computer Interaction - Volume Part II, in: INTERACT’07, Springer-Verlag, Rio de Janeiro, Brazil, 2007, pp. 475–488.
    [8]
    Dybdal M.L., Agustin J.S., Hansen J.P., Gaze input for mobile devices by dwell and gestures, in: Proceedings of the Symposium on Eye Tracking Research and Applications, in: ETRA ’12, Association for Computing Machinery, Santa Barbara, California, 2012, pp. 225–228,.
    [9]
    Esteves A., Velloso E., Bulling A., Gellersen H., Orbits: gaze interaction for smart watches using smooth pursuit eye movements, in: Proceedings of the 28th Annual ACM Symposium on User Interface Software & Technology - UIST ’15, ACM Press, New York, New York, USA, 2015, pp. 457–466,.
    [10]
    Esteves A., Verweij D., Suraiya L., Islam R., Lee Y., Oakley I., SmoothMoves: Smooth pursuits head movements for augmented reality, in: Proceedings of the 30th Annual ACM Symposium on User Interface Software and Technology, in: UIST ’17, Association for Computing Machinery, New York, NY, USA, 2017, pp. 167–178,.
    [11]
    Hansen J.P., Biermann F., Madsen J.A., Jonassen M., Lund H., Agustin J.S., Sztuk S., A gaze interactive textual smartwatch interface, in: Adjunct Proceedings of the 2015 ACM International Joint Conference on Pervasive and Ubiquitous Computing and Proceedings of the 2015 ACM International Symposium on Wearable Computers, in: UbiComp/ISWC’15 Adjunct, Association for Computing Machinery, Osaka, Japan, 2015, pp. 839–847,.
    [12]
    Hansen J.P., Lund H., Biermann F., Møllenbach E., Sztuk S., Agustin J.S., Wrist-worn pervasive gaze interaction, in: Proceedings of the Ninth Biennial ACM Symposium on Eye Tracking Research & Applications, in: ETRA ’16, Association for Computing Machinery, Charleston, South Carolina, 2016, pp. 57–64,.
    [13]
    Hyrskykari A., Istance H., Vickers S., Gaze gestures or dwell-based interaction?, in: Proceedings of the Symposium on Eye Tracking Research and Applications, in: ETRA ’12, Association for Computing Machinery, Santa Barbara, California, 2012, pp. 229–232,.
    [14]
    Isokoski P., Text input methods for eye trackers using off-screen targets, in: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, in: ETRA ’00, Association for Computing Machinery, Palm Beach Gardens, Florida, USA, 2000, pp. 15–21,.
    [15]
    Isomoto T., Yamanaka S., Shizuki B., Gaze-based Command Activation Technique Robust Against Unintentional Activation using Dwell-then-Gesture, 2020.
    [16]
    Istance H., Bates R., Hyrskykari A., Vickers S., Snap clutch, a moded approach to solving the Midas touch problem, in: Proceedings of the 2008 Symposium on Eye Tracking Research & Applications, in: ETRA ’08, Association for Computing Machinery, Savannah, Georgia, 2008, pp. 221–228,.
    [17]
    Istance H., Hyrskykari A.I., Supporting making fixations and the effect on gaze gesture performance, in: Proceedings of the 2017 CHI Conference on Human Factors in Computing Systems, in: CHI ’17, Association for Computing Machinery, Denver, Colorado, USA, 2017, pp. 3022–3033,.
    [18]
    Istance H., Hyrskykari A., Immonen L., Mansikkamaa S., Vickers S., Designing gaze gestures for gaming: An investigation of performance, in: Proceedings of the 2010 Symposium on Eye-Tracking Research & Applications, in: ETRA ’10, Association for Computing Machinery, Austin, Texas, 2010, pp. 323–330,.
    [19]
    Jacob R.J.K., Eye movement-based human-computer interaction techniques: toward non-command interfaces, 2003.
    [20]
    Jungwirth F., Haslgrübler M., Ferscha A., Contour-guided gaze gestures: Using object contours as visual guidance for triggering interactions, in: Proceedings of the 2018 ACM Symposium on Eye Tracking Research & Applications, in: ETRA ’18, Association for Computing Machinery, Warsaw, Poland, 2018, pp. 1–10,.
    [21]
    Juola J.F., Ward N.J., McNamara T., Visual search and reading of rapid serial presentations of letter strings, words, and text, J. Exp. Psychol. [Gen.] 111 (2) (1982) 208–227,.
    [22]
    Kangas J., Špakov O., Isokoski P., Akkil D., Rantala J., Raisamo R., Feedback for smooth pursuit gaze tracking based control, in: Proceedings of the 7th Augmented Human International Conference 2016, in: AH ’16, Association for Computing Machinery, Geneva, Switzerland, 2016, pp. 1–8,.
    [23]
    Khamis M., Oechsner C., Alt F., Bulling A., VRpursuits: Interaction in virtual reality using smooth pursuit eye movements, in: Proceedings of the 2018 International Conference on Advanced Visual Interfaces, in: AVI ’18, Association for Computing Machinery, New York, NY, USA, 2018, pp. 1–8,.
    [24]
    Khamis M., Saltuk O., Hang A., Stolz K., Bulling A., Alt F., TextPursuits: Using text for pursuits-based interaction and calibration on public displays, in: Proceedings of the 2016 ACM International Joint Conference on Pervasive and Ubiquitous Computing, in: UbiComp ’16, Association for Computing Machinery, New York, NY, USA, 2016, pp. 274–285,.
    [25]
    Kurtenbach G., Moran T.P., Buxton W., Contextual animation of gestural commands, Comput. Graph. Forum 13 (5) (1994) 305–314,.
    [26]
    Lutteroth C., Penkar M., Weber G., Gaze vs. Mouse: A fast and accurate gaze-only click alternative, in: Proceedings of the 28th Annual ACM Symposium on User Interface Software & Technology, in: UIST ’15, Association for Computing Machinery, Charlotte, NC, USA, 2015, pp. 385–394,.
    [27]
    Lutz O.H.-M., Venjakob A.C., Ruff S., SMOOVS: Towards calibration-free text entry by gaze using smooth pursuit movements, J. Eye Mov. Res. 8 (1) (2015) https://bop.unibe.ch/index.php/jemr/article/view/2394.
    [28]
    Majaranta P., Ahola U.-K., Špakov O., Fast gaze typing with an adjustable dwell time, in: Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, in: CHI ’09, Association for Computing Machinery, Boston, MA, USA, 2009, pp. 357–360.
    [29]
    Majaranta P., Bulling A., Eye tracking and eye-based human–computer interaction, in: Fairclough S.H., Gilleade K. (Eds.), Advances in Physiological Computing, in: Human–Computer Interaction Series, Springer, London, 2014, pp. 39–65,.
    [30]
    Møllenbach E., Hansen J.P., Lillholm M., Eye movements in gaze interaction, J. Eye Mov. Res. 6 (2) (2013) https://bop.unibe.ch/index.php/jemr/article/view/2354.
    [31]
    Møllenbach E., Lillholm M., Gail A., Hansen J.P., Single gaze gestures, in: Proceedings of the 2010 Symposium on Eye-Tracking Research & Applications, in: ETRA ’10, Association for Computing Machinery, Austin, Texas, 2010, pp. 177–180,.
    [32]
    Nazir T.A., O’Regan J.K., Jacobs A.M., On words and their letters, Bull. Psychonomic Soc. 29 (2) (1991) 171–174,.
    [33]
    Rozado D., Agustin J.S., Rodriguez F.B., Varona P., Gliding and saccadic gaze gesture recognition in real time, ACM Trans. Interact. Intell. Syst. 1 (2) (2012) 10:1–10:27,.
    [34]
    Rozado D., Moreno T., Agustin J.S., Rodriguez F.B., Varona P., Controlling a smartphone using gaze gestures as the input mechanism, Human–Comput. Interact. 30 (1) (2015) 34–63, DOI: 10.1080/07370024.2013.870385.
    [35]
    Rubin G.S., Turano K., Reading without saccadic eye movements, Vis. Res. 32 (5) (1992) 895–902,.
    [36]
    Schenk S., Tiefenbacher P., Rigoll G., Dorr M., SPOCK: A smooth pursuit oculomotor control kit, in: Proceedings of the 2016 CHI Conference Extended Abstracts on Human Factors in Computing Systems, in: CHI EA ’16, Association for Computing Machinery, San Jose, California, USA, 2016, pp. 2681–2687,.
    [37]
    Sidenmark L., Clarke C., Zhang X., Phu J., Gellersen H., Outline pursuits: Gaze-assisted selection of occluded objects in virtual reality, in: Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems, Association for Computing Machinery, New York, NY, USA, 2020, pp. 1–13.
    [38]
    Singh G., Delamare W., Irani P., D-SWIME: A design space for smartwatch interaction techniques supporting mobility and encumbrance, in: Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems, in: CHI ’18, Association for Computing Machinery, New York, NY, USA, 2018, pp. 1–13,.
    [39]
    Špakov O., Isokoski P., Kangas J., Akkil D., Majaranta P., PursuitAdjuster: An exploration into the design space of smooth pursuit-based widgets, in: Proceedings of the Ninth Biennial ACM Symposium on Eye Tracking Research & Applications, in: ETRA ’16, Association for Computing Machinery, Charleston, South Carolina, 2016, pp. 287–290.
    [40]
    Tuisku O., Majaranta P., Isokoski P., Räihä K.-J., Now Dasher! Dash away! longitudinal study of fast text entry by Eye Gaze, in: Proceedings of the 2008 Symposium on Eye Tracking Research & Applications, in: ETRA ’08, Association for Computing Machinery, Savannah, Georgia, 2008, pp. 19–26,.
    [41]
    Velloso E., Coutinho F.L., Kurauchi A., Morimoto C.H., Circular orbits detection for gaze interaction using 2D correlation and profile matching algorithms, in: Proceedings of the 2018 ACM Symposium on Eye Tracking Research & Applications, in: ETRA ’18, Association for Computing Machinery, Warsaw, Poland, 2018, pp. 1–9,.
    [42]
    Velloso E., Wirth M., Weichel C., Esteves A., Gellersen H., AmbiGaze: Direct control of ambient devices by gaze, in: Proceedings of the 2016 ACM Conference on Designing Interactive Systems, in: DIS ’16, Association for Computing Machinery, Brisbane, QLD, Australia, 2016, pp. 812–817,.
    [43]
    Vidal M., Bulling A., Gellersen H., Detection of smooth pursuits using eye movement shape features, in: Proceedings of the Symposium on Eye Tracking Research and Applications, in: ETRA ’12, Association for Computing Machinery, Santa Barbara, California, 2012, pp. 177–180,.
    [44]
    Vidal M., Bulling A., Gellersen H., Pursuits: Spontaneous interaction with displays based on smooth pursuit eye movement and moving targets, in: Proceedings of the 2013 ACM International Joint Conference on Pervasive and Ubiquitous Computing, in: UbiComp ’13, Association for Computing Machinery, Zurich, Switzerland, 2013, pp. 439–448,.
    [45]
    Wang B., Grossman T., BlyncSync: Enabling multimodal smartwatch gestures with synchronous touch and blink, in: Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems, in: CHI ’20, Association for Computing Machinery, Honolulu, HI, USA, 2020, pp. 1–14,.
    [46]
    Wobbrock J.O., Rubinstein J., Sawyer M., Duchowski A.T., Not typing but writing: Eye-based text entry using letter-like gestures, in: Proceedings of the Conference on Communications by Gaze Interaction, COGAIN, 2007, pp. 61–64.
    [47]
    Wolf K., Naumann A., Rohs M., Müller J., A taxonomy of microinteractions: defining microgestures based on ergonomic and scenario-dependent requirements, in: Campos P., Graham N., Jorge J., Nunes N., Palanque P., Winckler M. (Eds.), Human-Computer Interaction – INTERACT 2011, in: Lecture Notes in Computer Science, Springer, Berlin, Heidelberg, 2011, pp. 559–575,.


    Published In

    International Journal of Human-Computer Studies, Volume 183, Issue C
    March 2024, 220 pages

    Publisher

    Academic Press, Inc.

    United States

    Publication History

    Published: 14 March 2024

    Author Tags

    1. Gaze
    2. Gesture
    3. Smartwatch
    4. Fireworks
    5. Guidance

    Qualifiers

    • Research-article
