DOI: 10.1145/3588015.3589844

Gazealytics: A Unified and Flexible Visual Toolkit for Exploratory and Comparative Gaze Analysis

Published: 30 May 2023

Abstract

We present a novel, web-based visual eye-tracking analytics tool called Gazealytics. Our open-source toolkit features a unified combination of gaze analytics features that support flexible exploratory analysis, along with annotation of areas of interest (AOI) and filter options based on multiple criteria to visually analyse eye tracking data across time and space. Gazealytics features coordinated views unifying spatiotemporal exploration of fixations and scanpaths for various analytical tasks. A novel matrix representation allows analysis of relationships between such spatial or temporal features. Data can be grouped across samples, user-defined AOIs or time windows of interest (TWIs) to support aggregate or filtered analysis of gaze activity. This approach exceeds the capabilities of existing systems by supporting flexible comparison between and within subjects, hypothesis generation, data analysis and communication of insights. We demonstrate in a walkthrough that Gazealytics supports multiple types of eye tracking datasets and analytical tasks.
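Gazealytics itself is web-based and its internals are not described in this abstract, but the core idea behind its matrix view, relating fixations to user-defined AOIs and summarising their relationships in a matrix, can be sketched in a few lines. The following Python snippet is a hypothetical illustration only (not Gazealytics code); the fixation tuples, the rectangular AOI definitions, and the aoi_of helper are all assumptions made for the example.

```python
# Illustrative sketch only -- NOT Gazealytics source code. It shows the general
# idea behind an AOI-based matrix view: assign fixations to areas of interest
# (AOIs) and count transitions between them. All data and names are made up.
from collections import defaultdict

# Hypothetical fixations: (timestamp in ms, x, y) in screen coordinates.
fixations = [
    (0, 120, 80), (250, 130, 90), (600, 400, 300), (900, 410, 310), (1300, 125, 85),
]

# Hypothetical rectangular AOIs: name -> (x_min, y_min, x_max, y_max).
aois = {
    "title": (100, 50, 200, 120),
    "chart": (350, 250, 500, 400),
}

def aoi_of(x, y):
    """Return the name of the first AOI containing (x, y), or None."""
    for name, (x0, y0, x1, y1) in aois.items():
        if x0 <= x <= x1 and y0 <= y <= y1:
            return name
    return None

# Label each fixation with its AOI (None means it hit no AOI).
labels = [aoi_of(x, y) for _, x, y in fixations]

# Count transitions between consecutive in-AOI fixations; each count is one
# cell (source AOI, target AOI) of a transition matrix.
transitions = defaultdict(int)
hits = [label for label in labels if label is not None]
for src, dst in zip(hits, hits[1:]):
    transitions[(src, dst)] += 1

print(dict(transitions))
# {('title', 'title'): 1, ('title', 'chart'): 1, ('chart', 'chart'): 1, ('chart', 'title'): 1}
```

The same grouping logic extends to time windows of interest (TWIs) by filtering the fixation list on timestamps before counting.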


Cited By

  • (2025) The fundamentals of eye tracking part 4: Tools for conducting an eye tracking study. Behavior Research Methods 57(1). DOI: 10.3758/s13428-024-02529-7. Online publication date: 6 January 2025.
  • (2024) VisRecall++: Analysing and Predicting Visualisation Recallability from Gaze Behaviour. Proceedings of the ACM on Human-Computer Interaction 8(ETRA), 1–18. DOI: 10.1145/3655613. Online publication date: 28 May 2024.
  • (2024) Which Experimental Design is Better Suited for VQA Tasks? Eye Tracking Study on Cognitive Load, Performance, and Gaze Allocations. Proceedings of the 2024 Symposium on Eye Tracking Research and Applications, 1–7. DOI: 10.1145/3649902.3653519. Online publication date: 4 June 2024.

Index Terms

  1. Gazealytics: A Unified and Flexible Visual Toolkit for Exploratory and Comparative Gaze Analysis

      Recommendations

      Comments

      Information & Contributors

      Information

      Published In

      ETRA '23: Proceedings of the 2023 Symposium on Eye Tracking Research and Applications
      May 2023
      441 pages
      ISBN:9798400701504
      DOI:10.1145/3588015
      Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than the author(s) must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected].

      Publisher

      Association for Computing Machinery

      New York, NY, United States

      Publication History

      Published: 30 May 2023


      Author Tags

      1. Eye tracking
      2. area of interest
      3. group-level visualisation
      4. matrix-based overview
      5. time window of interest
      6. visual analytics

      Qualifiers

      • Research-article
      • Research
      • Refereed limited

      Conference

      ETRA '23

      Acceptance Rates

      Overall Acceptance Rate 69 of 137 submissions, 50%

      Article Metrics

      • Downloads (last 12 months): 61
      • Downloads (last 6 weeks): 6
      Reflects downloads up to 20 February 2025.

