DOI: 10.1145/3317958.3318228
Short paper

Iris: a tool for designing contextually relevant gaze visualizations

Published: 25 June 2019
Abstract

    Advances in eye tracking technology have enabled new interaction techniques and gaze-based applications. However, the techniques for visualizing gaze information have remained relatively unchanged. We developed Iris, a tool to support the design of contextually relevant gaze visualizations. Iris lets users explore how to display different features of gaze behavior, including the current fixation point, fixation duration, and saccades. Stylistic elements such as color, opacity, and smoothness can also be adjusted to give users creative and detailed control over the design of their gaze visualization. We present the Iris system and perform a user study to examine how participants can make use of the tool to devise contextually relevant gaze visualizations for a variety of collaborative tasks. We show that color, opacity, and gaze-trail variation can be adjusted to create meaningful gaze visualizations that fit the context of use.
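
    As a concrete illustration of the parameterization described above, the sketch below maps a short sequence of gaze samples to a cursor-and-trail rendering in which fixation duration sets marker size and sample age sets trail opacity. It is a hypothetical Python/matplotlib example, not the authors' Iris implementation; the sample coordinates, the style() helper, and the scaling constants are assumptions made for illustration.

        # Minimal sketch (an assumption, not the Iris implementation) of mapping
        # gaze samples to a cursor-plus-trail visualization: fixation duration
        # sets the marker radius, sample age sets the trail opacity.
        import matplotlib.pyplot as plt

        # Hypothetical gaze samples: (x, y, fixation_duration_ms), oldest first.
        samples = [(120, 300, 180), (200, 320, 240), (310, 280, 90), (400, 350, 400)]

        def style(index, total, duration_ms, max_opacity=0.9, base_radius=8.0):
            """Return (opacity, radius) for one sample in the gaze trail."""
            age = (total - 1 - index) / max(total - 1, 1)  # 0 = newest, 1 = oldest
            opacity = max_opacity * (1.0 - 0.8 * age)      # older samples fade out
            radius = base_radius + duration_ms / 40.0      # longer fixations grow larger
            return opacity, radius

        fig, ax = plt.subplots()
        xs = [s[0] for s in samples]
        ys = [s[1] for s in samples]
        ax.plot(xs, ys, color="tab:blue", alpha=0.3, linewidth=1)  # saccade path
        for i, (x, y, duration) in enumerate(samples):
            opacity, radius = style(i, len(samples), duration)
            ax.scatter([x], [y], s=radius ** 2, color="tab:blue", alpha=opacity)
        ax.invert_yaxis()  # screen coordinates: y increases downward
        plt.show()

    Iris exposes comparable controls (color, opacity, smoothness, and trail behavior) interactively rather than in code.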





      Information

      Published In

      ETRA '19: Proceedings of the 11th ACM Symposium on Eye Tracking Research & Applications
      June 2019
      623 pages
      ISBN: 9781450367097
      DOI: 10.1145/3314111
      Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected]


      Publisher

      Association for Computing Machinery

      New York, NY, United States

      Publication History

      Published: 25 June 2019


      Author Tags

      1. design
      2. eye-tracking
      3. gaze visualizations

      Qualifiers

      • Short-paper

      Conference

      ETRA '19

      Acceptance Rates

      Overall acceptance rate: 69 of 137 submissions (50%)


      Bibliometrics & Citations

      Bibliometrics

      Article Metrics

      • Downloads (Last 12 months)11
      • Downloads (Last 6 weeks)2
      Reflects downloads up to 10 Aug 2024

      Other Metrics


      Cited By

      • (2024) The Widening Gap: The Benefits and Harms of Generative AI for Novice Programmers. Proceedings of the 2024 ACM Conference on International Computing Education Research - Volume 1, 469-486. https://doi.org/10.1145/3632620.3671116. Online publication date: 12-Aug-2024.
      • (2022) EyeBox: A Toolbox based on Python3 for Eye Movement Analysis. Procedia Computer Science 201, 166-173. https://doi.org/10.1016/j.procs.2022.03.024. Online publication date: 2022.
      • (2021) Visualizing Prediction Correctness of Eye Tracking Classifiers. ACM Symposium on Eye Tracking Research and Applications, 1-7. https://doi.org/10.1145/3448018.3457997. Online publication date: 25-May-2021.
      • (2019) Designing Interactions with Intention-Aware Gaze-Enabled Artificial Agents. Human-Computer Interaction – INTERACT 2019, 255-281. https://doi.org/10.1007/978-3-030-29384-0_17. Online publication date: 2-Sep-2019.
