Gazture: Design and Implementation of a Gaze based Gesture Control System on Tablets

Published: 11 September 2017

Abstract

We present Gazture, a lightweight, gaze-based, real-time gesture control system for commercial tablets. Unlike existing approaches that require dedicated hardware (e.g., a high-resolution camera), high computational overhead (a powerful CPU), or specific user behavior (keeping the head steady), Gazture provides gesture recognition based on easy-to-control user gaze input with small overhead. To achieve this goal, Gazture incorporates a two-layer structure: the first layer performs real-time gaze estimation with acceptable tracking accuracy at low overhead; the second layer implements a robust gesture recognition algorithm that compensates for gaze estimation error. To address user posture changes while using the mobile device, we design an online transfer-function-based method that converts current eye features into the corresponding eye features in a reference posture, which then facilitates efficient gaze position estimation. We implement Gazture on a Lenovo Tab3 8 Plus tablet running Android 6.0.1 and evaluate its performance in different scenarios. The evaluation results show that Gazture achieves high accuracy in gesture recognition while incurring low overhead.
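The two-layer structure described in the abstract can be sketched as a small pipeline. This is an illustrative reconstruction, not the authors' implementation: the affine form of the posture transfer function, the linear gaze-mapping coefficients, and the direction-based gesture classifier are all simplifying assumptions introduced here for clarity.

```python
def to_reference_posture(feature, transfer):
    """Map an eye-feature point from the current posture into the
    reference posture. An affine (scale, offset) transfer is assumed
    as a stand-in for the paper's online transfer function."""
    scale, offset = transfer
    return (scale * feature[0] + offset, scale * feature[1] + offset)

def estimate_gaze(feature, mapping):
    """Layer 1: convert a reference-posture eye feature to a screen
    position with a calibrated linear model (illustrative)."""
    (ax, bx), (ay, by) = mapping
    return (ax * feature[0] + bx, ay * feature[1] + by)

def recognize_gesture(gaze_points):
    """Layer 2: classify the overall gaze trajectory into a coarse
    directional gesture. Using only the net displacement tolerates
    per-point gaze estimation error."""
    (x0, y0), (x1, y1) = gaze_points[0], gaze_points[-1]
    dx, dy = x1 - x0, y1 - y0
    if abs(dx) >= abs(dy):
        return "right" if dx > 0 else "left"
    return "down" if dy > 0 else "up"
```

For example, a sequence of eye features drifting horizontally would pass through the posture transfer and gaze mapping, then be classified as a rightward gesture from the trajectory's net displacement alone.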


Cited By

  • (2023) GazeCast: Using Mobile Devices to Allow Gaze-based Interaction on Public Displays. Proceedings of the 2023 Symposium on Eye Tracking Research and Applications, pages 1-8. DOI: 10.1145/3588015.3589663. Online publication date: 30-May-2023.
  • (2022) Evaluating the Performance of Machine Learning Algorithms in Gaze Gesture Recognition Systems. IEEE Access, 10:1020-1035. DOI: 10.1109/ACCESS.2021.3136153. Online publication date: 2022.
  • (2022) Gaze-Assisted Viewport Control for 360° Video on Smartphone. Journal of Computer Science and Technology, 37(4):906-918. DOI: 10.1007/s11390-022-2037-5. Online publication date: 30-Jul-2022.
  • (2021) Gaze Gesture Recognition by Graph Convolutional Networks. Frontiers in Robotics and AI, 8. DOI: 10.3389/frobt.2021.709952. Online publication date: 5-Aug-2021.
  • (2021) Energy-Efficient Interactive 360° Video Streaming with Real-Time Gaze Tracking on Mobile Devices. 2021 IEEE 18th International Conference on Mobile Ad Hoc and Smart Systems (MASS), pages 243-251. DOI: 10.1109/MASS52906.2021.00040. Online publication date: Oct-2021.
  • (2020) Are my Apps Peeking? Comparing Nudging Mechanisms to Raise Awareness of Access to Mobile Front-facing Camera. Proceedings of the 19th International Conference on Mobile and Ubiquitous Multimedia, pages 186-190. DOI: 10.1145/3428361.3428384. Online publication date: 22-Nov-2020.
  • (2020) From perception to action using observed actions to learn gestures. User Modeling and User-Adapted Interaction. DOI: 10.1007/s11257-020-09275-3. Online publication date: 24-Aug-2020.
  • (2019) Poster: A Calibration-free Gaze based Mobile Gesture Control System. Proceedings of the 2019 International Conference on Embedded Wireless Systems and Networks, pages 222-223. DOI: 10.5555/3324320.3324354. Online publication date: 25-Feb-2019.
  • (2019) HandSense. Proceedings of the 17th Conference on Embedded Networked Sensor Systems, pages 285-297. DOI: 10.1145/3356250.3360040. Online publication date: 10-Nov-2019.
  • (2018) CapBand. Proceedings of the 16th ACM Conference on Embedded Networked Sensor Systems, pages 54-67. DOI: 10.1145/3274783.3274854. Online publication date: 4-Nov-2018.


    Published In

Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies, Volume 1, Issue 3
    September 2017
    2023 pages
    EISSN:2474-9567
    DOI:10.1145/3139486
    Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected]

    Publisher

    Association for Computing Machinery

    New York, NY, United States

    Publication History

    Published: 11 September 2017
    Accepted: 01 July 2017
    Revised: 01 May 2017
    Received: 01 February 2017
    Published in IMWUT Volume 1, Issue 3


    Qualifiers

    • Research-article
    • Research
    • Refereed

    Article Metrics

    • Downloads (Last 12 months)18
    • Downloads (Last 6 weeks)5
    Reflects downloads up to 02 Sep 2024

