
EyeFIX: An Interactive Visual Analytics Interface for Eye Movement Analysis

Published: 27 November 2021

Abstract

Eye movements are closely related to cognitive processes and often act as a window to the brain and mind. To facilitate such analysis, we propose EyeFIX, an interactive visual analytics interface. Our interface currently focuses on processing gaze movements to extract the two most prominent event types, fixations and saccades, which are used in a large part of the eye movement literature. It provides multiple interactive widgets that allow users to choose the detection algorithm and tune the hyperparameters that control how fixations and saccades are generated. It has four major visualizations: (1) a Gaze Plot showing the gaze movement of each participant, (2) a Fixation Plot that helps identify regions of the stimulus that are of interest to a particular participant, (3) a Heat Map for studying dense fixation regions and areas the participant frequently revisited, and (4) a Timeline Visualization for a closer look at temporal regions of interest. We design a set of brushing and linking operators that let users interactively control all the visualizations to dig deeper in both the spatial and temporal domains. We demonstrate the applicability of our interface with datasets obtained from human subjects viewing naturalistic stimuli while performing various viewing tasks, including visual exploration, visual search, and prolonged visual fixation.
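The fixation/saccade extraction the abstract describes is commonly done with the classic dispersion-threshold (I-DT) algorithm of Salvucci and Goldberg. The sketch below is illustrative only, not EyeFIX's actual implementation; the function name, the (time, x, y) sample format, and the threshold defaults are assumptions, corresponding to the tunable hyperparameters the interface exposes.

```python
def idt_fixations(samples, dispersion_threshold=1.0, min_duration=0.1):
    """Group gaze samples (t, x, y) into fixations via dispersion thresholding.

    A window grows while its spatial dispersion, (max_x - min_x) + (max_y - min_y),
    stays at or below the threshold; windows shorter than min_duration seconds
    are discarded. Returns a list of (start_t, end_t, centroid_x, centroid_y).
    Gaps between consecutive fixations correspond to saccades.
    """
    fixations = []
    i, n = 0, len(samples)
    while i < n:
        j = i
        # Grow the window one sample at a time until dispersion exceeds the threshold.
        while j + 1 < n:
            window = samples[i:j + 2]
            xs = [p[1] for p in window]
            ys = [p[2] for p in window]
            if (max(xs) - min(xs)) + (max(ys) - min(ys)) > dispersion_threshold:
                break
            j += 1
        duration = samples[j][0] - samples[i][0]
        if duration >= min_duration:
            window = samples[i:j + 1]
            cx = sum(p[1] for p in window) / len(window)
            cy = sum(p[2] for p in window) / len(window)
            fixations.append((samples[i][0], samples[j][0], cx, cy))
            i = j + 1  # Resume after the detected fixation.
        else:
            i += 1  # Too short to be a fixation; slide the window start.
    return fixations
```

For example, 100 Hz gaze data that dwells near one point, jumps, and dwells near another would yield two fixations, with the jump between them treated as a saccade. Lowering `dispersion_threshold` or raising `min_duration` makes detection stricter, which is the kind of trade-off the interface's widgets let users explore interactively.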


Published In

VINCI '21: Proceedings of the 14th International Symposium on Visual Information Communication and Interaction, September 2021, 139 pages. ISBN: 9781450386470. DOI: 10.1145/3481549

Publisher

Association for Computing Machinery, New York, NY, United States


      Author Tags

      1. Cognition
      2. Eye movements
      3. Fixation data
      4. Visual analytics
      5. Visualization

      Qualifiers

      • Short-paper
      • Research
      • Refereed limited

      Conference

      VINCI 2021

      Acceptance Rates

      Overall Acceptance Rate 71 of 193 submissions, 37%
