TextPursuits: using text for pursuits-based interaction and calibration on public displays

Published: 12 September 2016

Abstract

In this paper we show how reading text on a large display can be used to enable gaze interaction in public spaces. Our research is motivated by the fact that much of the content on public displays includes text. Hence, researchers and practitioners could greatly benefit from users being able to spontaneously interact, as well as to implicitly calibrate an eye tracker, while simply reading this text. In particular, we adapt Pursuits, a technique that correlates users' eye movements with moving on-screen targets. While prior work used abstract objects or dots as targets, we explore the use of Pursuits with text (read-and-pursue). Thereby we address the challenge that eye movements performed while reading interfere with the pursuit movements. Results from two user studies (N=37) show that Pursuits with text is feasible and can achieve accuracy similar to that of non-text-based pursuit approaches. While calibration is less accurate, it integrates smoothly with reading and allows the areas of the display the user is looking at to be identified.
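The core idea behind Pursuits, as described in the abstract, is to correlate the user's gaze trajectory with the trajectories of moving on-screen targets and select the target that matches best. The sketch below illustrates this correlation step only; it is not the authors' implementation. The function names, the per-axis Pearson correlation, and the 0.8 threshold are illustrative assumptions (the paper's actual window sizes and thresholds may differ).

```python
import numpy as np

def pearson(a, b):
    """Pearson correlation coefficient between two 1-D signals."""
    a = np.asarray(a, dtype=float) - np.mean(a)
    b = np.asarray(b, dtype=float) - np.mean(b)
    denom = np.sqrt((a * a).sum() * (b * b).sum())
    return 0.0 if denom == 0 else float((a * b).sum() / denom)

def pursuit_match(gaze, targets, threshold=0.8):
    """Return the index of the target whose trajectory best correlates
    with the gaze trajectory, or None if no target exceeds the
    threshold on both axes.

    gaze:    list of (x, y) gaze samples over a time window
    targets: list of trajectories, each a list of (x, y) positions
             sampled at the same timestamps as the gaze signal
    """
    gx = [p[0] for p in gaze]
    gy = [p[1] for p in gaze]
    best_idx, best_score = None, threshold
    for i, traj in enumerate(targets):
        tx = [p[0] for p in traj]
        ty = [p[1] for p in traj]
        # Correlate horizontal and vertical components separately;
        # a genuinely pursued target should match on both axes.
        score = min(pearson(gx, tx), pearson(gy, ty))
        if score > best_score:
            best_idx, best_score = i, score
    return best_idx
```

Because the decision relies only on the *shape* of the trajectories, not on absolute gaze positions, this style of matching works without prior calibration, which is what makes it attractive for spontaneous interaction with public displays.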

Supplementary Material

ZIP File (p274-khamis.zip)
Supplemental material.




    Published In

    UbiComp '16: Proceedings of the 2016 ACM International Joint Conference on Pervasive and Ubiquitous Computing
    September 2016
    1288 pages
    ISBN:9781450344616
    DOI:10.1145/2971648
    Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected]

    Publisher

    Association for Computing Machinery

    New York, NY, United States


    Author Tags

    1. gaze interaction
    2. public displays
    3. smooth pursuit
    4. text

    Qualifiers

    • Research-article

    Conference

    UbiComp '16

    Acceptance Rates

UbiComp '16 paper acceptance rate: 101 of 389 submissions (26%); overall acceptance rate: 764 of 2,912 submissions (26%)
