DOI: 10.1145/2901790.2901867

AmbiGaze: Direct Control of Ambient Devices by Gaze

Published: 04 June 2016

Abstract

Eye tracking offers many opportunities for direct device control in smart environments, but issues such as the need for calibration and the Midas touch problem make it impractical. In this paper, we propose AmbiGaze, a smart environment that employs the animation of targets to provide users with direct control of devices by gaze only through smooth pursuit tracking. We propose a design space of means of exposing functionality through movement and illustrate the concept through four prototypes. We evaluated the system in a user study and found that AmbiGaze enables robust gaze-only interaction with many devices, from multiple positions in the environment, in a spontaneous and comfortable manner.
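The selection principle the abstract describes, matching the user's gaze against the known motion of animated targets, can be sketched in code. The following is a hypothetical illustration of smooth-pursuit correlation matching in the style of the Pursuits technique (Vidal et al., 2013) that AmbiGaze builds on, not the authors' actual implementation; the function names, the sliding-window framing, and the 0.9 correlation threshold are all assumptions.

```python
# Sketch of smooth-pursuit target selection: each device exposes an
# animated target moving on a known trajectory, and the controller
# selects the device whose target motion correlates best with the
# user's gaze samples over a recent time window.
import math

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy) if sx and sy else 0.0

def match_target(gaze, targets, threshold=0.9):
    """gaze: list of (x, y) samples; targets: dict name -> list of (x, y).

    Correlates the horizontal and vertical components separately and
    requires both to exceed the threshold, returning the best-matching
    target name or None if no target correlates strongly enough.
    """
    best_name, best_score = None, threshold
    for name, traj in targets.items():
        cx = pearson([p[0] for p in gaze], [p[0] for p in traj])
        cy = pearson([p[1] for p in gaze], [p[1] for p in traj])
        score = min(cx, cy)  # both axes must follow the target's motion
        if score > best_score:
            best_name, best_score = name, score
    return best_name
```

Because selection depends only on the relative motion of gaze and target, not on absolute gaze position, this style of matching needs no per-user calibration, which is the property the paper exploits.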

Supplementary Material

suppl.mov (pn0505-file3.mp4)
Supplemental video




    Published In

    DIS '16: Proceedings of the 2016 ACM Conference on Designing Interactive Systems
    June 2016
    1374 pages
    ISBN:9781450340311
    DOI:10.1145/2901790


    Publisher

    Association for Computing Machinery

    New York, NY, United States


    Author Tags

    1. eye tracking
    2. smart environments
    3. smooth pursuits
    4. ubiquitous computing

    Qualifiers

    • Note

    Conference

DIS '16: Designing Interactive Systems Conference 2016
June 4 - 8, 2016
Brisbane, QLD, Australia

    Acceptance Rates

DIS '16 Paper Acceptance Rate: 107 of 418 submissions, 26%
Overall Acceptance Rate: 1,158 of 4,684 submissions, 25%


    Article Metrics

• Downloads (last 12 months): 68
• Downloads (last 6 weeks): 8
Reflects downloads up to 01 Sep 2024


    Cited By

    • (2024) Hand Me This: Exploring the Effects of Gaze-driven Animations and Hand Representations in Users’ Sense of Presence and Embodiment. Proceedings of the 2024 Symposium on Eye Tracking Research and Applications, 1-7. DOI: 10.1145/3649902.3656362. Online publication date: 4-Jun-2024.
    • (2024) Reflected Reality. Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies 7(4), 1-28. DOI: 10.1145/3631431. Online publication date: 12-Jan-2024.
    • (2024) Guiding gaze gestures on smartwatches. International Journal of Human-Computer Studies 183(C). DOI: 10.1016/j.ijhcs.2023.103196. Online publication date: 14-Mar-2024.
    • (2024) Design recommendations of target size and tracking speed under circular and square trajectories for smooth pursuit with Euclidean algorithm in eye-control system. Displays 81, 102608. DOI: 10.1016/j.displa.2023.102608. Online publication date: Jan-2024.
    • (2024) SoundOrbit: motion-correlation interaction with auditory orbital trajectories. Personal and Ubiquitous Computing. DOI: 10.1007/s00779-024-01818-4. Online publication date: 15-Jun-2024.
    • (2023) Vergence Matching: Inferring Attention to Objects in 3D Environments for Gaze-Assisted Selection. Proceedings of the 2023 CHI Conference on Human Factors in Computing Systems, 1-15. DOI: 10.1145/3544548.3580685. Online publication date: 19-Apr-2023.
    • (2022) GazeScale: Towards General Gaze-Based Interaction in Public Places. Proceedings of the 2022 International Conference on Multimodal Interaction, 591-596. DOI: 10.1145/3536221.3556588. Online publication date: 7-Nov-2022.
    • (2022) One-handed Input for Mobile Devices via Motion Matching and Orbits Controls. Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies 6(2), 1-24. DOI: 10.1145/3534624. Online publication date: 7-Jul-2022.
    • (2022) User Perception of Smooth Pursuit Target Speed. 2022 Symposium on Eye Tracking Research and Applications, 1-7. DOI: 10.1145/3517031.3529234. Online publication date: 8-Jun-2022.
    • (2022) Design requirements to improve laparoscopy via XR. 2022 IEEE Conference on Virtual Reality and 3D User Interfaces Abstracts and Workshops (VRW), 425-429. DOI: 10.1109/VRW55335.2022.00093. Online publication date: Mar-2022.
