DOI: 10.1145/3229434.3229452
Research article

The past, present, and future of gaze-enabled handheld mobile devices: survey and lessons learned

Published: 03 September 2018

Abstract

While first-generation mobile gaze interfaces required special-purpose hardware, recent advances in computational gaze estimation and the availability of sensor-rich and powerful devices are finally fulfilling the promise of pervasive eye tracking and eye-based interaction on off-the-shelf mobile devices. This work provides the first holistic view of the past, present, and future of eye tracking on handheld mobile devices. To this end, we discuss how research developed from building hardware prototypes to accurate gaze estimation on unmodified smartphones and tablets. We then discuss the implications by laying out 1) novel opportunities, including pervasive advertising and conducting in-the-wild eye tracking studies on handhelds, and 2) new challenges that require further research, such as visibility of the user's eyes, lighting conditions, and privacy implications. We discuss how these developments will shape MobileHCI research in the future, possibly over the next 20 years.
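The path from hardware prototypes to gaze estimation on unmodified devices largely runs through feature-based approaches: extract a 2D eye feature (e.g. the pupil-centre offset from an eye corner) from the front-facing camera, then map it to on-screen coordinates via a short per-user calibration. The following is a minimal, self-contained sketch of that calibration-and-mapping step only, assuming an affine relationship between feature and screen position; the feature values and screen geometry are illustrative, and the feature-extraction stage (face and eye detection) is omitted.

```python
def fit_affine(features, targets):
    """Least-squares fit of an affine map from 2D eye features to screen points.

    features: list of (fx, fy) eye-feature vectors observed during calibration.
    targets:  list of (sx, sy) known on-screen calibration points.
    Returns two coefficient rows [a, b, c], one per screen axis,
    so that predicted coordinate = a*fx + b*fy + c.
    """
    # Design matrix with a bias column: each row is [fx, fy, 1].
    X = [[fx, fy, 1.0] for fx, fy in features]
    return [_lstsq_3(X, [t[axis] for t in targets]) for axis in range(2)]


def _lstsq_3(X, y):
    """Solve the 3-parameter normal equations (X^T X) b = (X^T y)."""
    A = [[sum(r[i] * r[j] for r in X) for j in range(3)] for i in range(3)]
    b = [sum(r[i] * yi for r, yi in zip(X, y)) for i in range(3)]
    # Gaussian elimination with partial pivoting, then back-substitution.
    for col in range(3):
        piv = max(range(col, 3), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, 3):
            f = A[r][col] / A[col][col]
            for c in range(col, 3):
                A[r][c] -= f * A[col][c]
            b[r] -= f * b[col]
    out = [0.0, 0.0, 0.0]
    for r in (2, 1, 0):
        out[r] = (b[r] - sum(A[r][c] * out[c] for c in range(r + 1, 3))) / A[r][r]
    return out


def predict(coeffs, feature):
    """Map one eye-feature vector to a predicted (x, y) screen position."""
    fx, fy = feature
    return tuple(c[0] * fx + c[1] * fy + c[2] for c in coeffs)


if __name__ == "__main__":
    # Hypothetical 4-point calibration on a 1080x1920 portrait screen.
    feats = [(-0.2, -0.1), (0.2, -0.1), (-0.2, 0.1), (0.2, 0.1)]
    points = [(100, 200), (980, 200), (100, 1700), (980, 1700)]
    model = fit_affine(feats, points)
    print(predict(model, (0.0, 0.0)))  # near the centre of the calibrated area
```

A simple affine map like this is why calibration drift and device movement are such persistent challenges on handhelds: the fit is only valid for the head and device pose at calibration time, which is one motivation for the appearance-based, learning-driven estimators the survey covers.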


Published In

MobileHCI '18: Proceedings of the 20th International Conference on Human-Computer Interaction with Mobile Devices and Services
September 2018
552 pages
ISBN:9781450358989
DOI:10.1145/3229434
Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than the author(s) must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected].

Publisher

Association for Computing Machinery

New York, NY, United States

Publication History

Published: 03 September 2018

Badges

  • Honorable Mention

Author Tags

  1. eye tracking
  2. gaze estimation
  3. gaze interaction
  4. mobile devices
  5. smartphones
  6. tablets

Qualifiers

  • Research-article

Funding Sources

  • Bavarian State Ministry of Education, Science and the Arts
  • Saarland University, Germany

Conference

MobileHCI '18
Acceptance Rates

Overall Acceptance Rate 202 of 906 submissions, 22%


Cited By

  • (2024) PrivateGaze: Preserving User Privacy in Black-box Mobile Gaze Tracking Services. Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies 8(3), 1--28 (9 Sep 2024). DOI: 10.1145/3678595
  • (2024) Where Do You Look When Unlocking Your Phone? A Field Study of Gaze Behaviour During Smartphone Unlock. Extended Abstracts of the 2024 CHI Conference on Human Factors in Computing Systems, 1--7 (11 May 2024). DOI: 10.1145/3613905.3651094
  • (2024) Eye-tracking AD: Cutting-Edge Web Advertising on Smartphone Aligned with User's Gaze. 2024 IEEE International Conference on Pervasive Computing and Communications Workshops and other Affiliated Events (PerCom Workshops), 469--474 (11 Mar 2024). DOI: 10.1109/PerComWorkshops59983.2024.10502602
  • (2024) Evaluating Target Expansion for Eye Pointing Tasks. Interacting with Computers 36(4), 209--223 (27 Feb 2024). DOI: 10.1093/iwc/iwae004
  • (2023) An End-to-End Review of Gaze Estimation and its Interactive Applications on Handheld Mobile Devices. ACM Computing Surveys 56(2), 1--38 (15 Sep 2023). DOI: 10.1145/3606947
  • (2023) A Study on Eye Tracking for Mobile Devices Using Deep Learning. Proceedings of the 24th International Conference on Computer Systems and Technologies, 65--69 (16 Jun 2023). DOI: 10.1145/3606305.3606326
  • (2023) Investigating Privacy Perceptions and Subjective Acceptance of Eye Tracking on Handheld Mobile Devices. Proceedings of the ACM on Human-Computer Interaction 7(ETRA), 1--16 (18 May 2023). DOI: 10.1145/3591133
  • (2023) DynamicRead: Exploring Robust Gaze Interaction Methods for Reading on Handheld Mobile Devices under Dynamic Conditions. Proceedings of the ACM on Human-Computer Interaction 7(ETRA), 1--17 (18 May 2023). DOI: 10.1145/3591127
  • (2023) GazeCast: Using Mobile Devices to Allow Gaze-based Interaction on Public Displays. Proceedings of the 2023 Symposium on Eye Tracking Research and Applications (ETRA '23), 1--8 (30 May 2023). DOI: 10.1145/3588015.3589663
  • (2023) Gaze-based Interaction on Handheld Mobile Devices. Proceedings of the 2023 Symposium on Eye Tracking Research and Applications (ETRA '23), 1--4 (30 May 2023). DOI: 10.1145/3588015.3589540
