For more than twenty years, the ACM ETRA conference has been the premier worldwide meeting place for the eye tracking community. ETRA continues to grow and change together with the eye tracking research field. This year, for the first time, ETRA is being held annually, after running biennially since its inception.
Deep learning investigation for chess player attention prediction using eye-tracking and game data
This article reports on an investigation of the use of convolutional neural networks to predict the visual attention of chess players. The visual attention model described in this article has been created to generate saliency maps that capture ...
Semantic gaze labeling for human-robot shared manipulation
Human-robot collaboration systems benefit from recognizing people's intentions. This capability is especially useful for collaborative manipulation applications, in which users operate robot arms to manipulate objects. For collaborative manipulation, ...
EyeFlow: pursuit interactions using an unmodified camera
We investigate smooth pursuit eye-movement-based interaction using an unmodified off-the-shelf RGB camera. In each pair of sequential video frames, we compute the indicative direction of the eye movement by analyzing flow vectors obtained using the ...
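As a rough illustration of the underlying idea, the direction of eye movement between two consecutive frames can be estimated from dense optical flow over an eye-region crop; the OpenCV Farnebäck flow and the magnitude threshold below are illustrative assumptions, not the authors' implementation.

```python
# Illustrative sketch (not the paper's implementation): estimate the dominant
# direction of eye movement between two consecutive grayscale eye-region crops
# using dense optical flow.
import numpy as np
import cv2

def dominant_flow_direction(prev_frame: np.ndarray, next_frame: np.ndarray) -> float:
    """Return the dominant motion direction (radians) between two eye crops."""
    flow = cv2.calcOpticalFlowFarneback(
        prev_frame, next_frame, None,
        pyr_scale=0.5, levels=3, winsize=15,
        iterations=3, poly_n=5, poly_sigma=1.2, flags=0)
    dx, dy = flow[..., 0], flow[..., 1]
    magnitude = np.hypot(dx, dy)
    # Keep the strongest-moving pixels so the iris/pupil region dominates
    # over static skin pixels.
    mask = magnitude >= np.percentile(magnitude, 90)
    return float(np.arctan2(dy[mask].mean(), dx[mask].mean()))
```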
Exploring simple neural network architectures for eye movement classification
Analysis of eye-gaze is a critical tool for studying human-computer interaction and visualization. Yet eye tracking systems only report eye-gaze on the scene by producing large volumes of coordinate time series data. To be able to use this data, we must ...
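A minimal sketch of what a "simple" architecture for this task can look like, assuming fixed-length windows of gaze velocity features and three event classes; the layer sizes and hyperparameters are illustrative, not taken from the paper.

```python
# Minimal sketch of a simple eye movement classifier over windows of gaze
# velocity samples. Window length, feature set, and architecture are assumed
# for illustration only.
import torch
import torch.nn as nn

WINDOW = 20        # samples per window (assumed)
FEATURES = 2       # e.g. horizontal and vertical gaze velocity
NUM_CLASSES = 3    # fixation, saccade, smooth pursuit

model = nn.Sequential(
    nn.Flatten(),                          # (batch, WINDOW, FEATURES) -> (batch, WINDOW*FEATURES)
    nn.Linear(WINDOW * FEATURES, 64),
    nn.ReLU(),
    nn.Linear(64, NUM_CLASSES),
)

# One training step on a dummy batch, to show the intended usage.
x = torch.randn(8, WINDOW, FEATURES)       # 8 windows of velocity samples
y = torch.randint(0, NUM_CLASSES, (8,))    # ground-truth event labels
loss = nn.CrossEntropyLoss()(model(x), y)
loss.backward()
```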
Analyzing gaze transition behavior using Bayesian mixed effects Markov models
The complex stochastic nature of eye tracking data calls for exploring sophisticated statistical models to ensure reliable inference in multi-trial eye-tracking experiments. We employ a Bayesian semi-parametric mixed-effects Markov model to compare gaze ...
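The Bayesian semi-parametric machinery itself is beyond a short snippet, but the quantity it reasons about, a per-trial transition matrix over areas of interest (AoIs), can be sketched as follows; the pooling across trials and participants is not shown.

```python
# Sketch of the raw ingredient for gaze transition analysis: an empirical,
# row-normalised AoI transition matrix for one fixation sequence.
import numpy as np

def transition_matrix(aoi_sequence, n_aois):
    """Estimate P(next AoI | current AoI) from one sequence of AoI indices."""
    counts = np.zeros((n_aois, n_aois))
    for src, dst in zip(aoi_sequence[:-1], aoi_sequence[1:]):
        counts[src, dst] += 1
    row_sums = counts.sum(axis=1, keepdims=True)
    return np.divide(counts, row_sums, out=np.zeros_like(counts), where=row_sums > 0)

print(transition_matrix([0, 0, 1, 2, 1, 0], n_aois=3))
```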
Gaze behaviour on interacted objects during hand interaction in virtual reality for eye tracking calibration
In this paper, we investigate the probability and timing of attaining gaze fixations on interacted objects during hand interaction in virtual reality, with the main purpose of implicit and continuous eye tracking re-calibration. We conducted an ...
Time- and space-efficient eye tracker calibration
One of the obstacles to bringing eye tracking technology to everyday human-computer interaction is the time-consuming calibration procedure. In this paper we investigate a novel calibration method based on smooth pursuit eye movement. The method uses ...
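The core idea of pursuit-based calibration can be sketched as fitting a mapping between raw gaze and a smoothly moving on-screen target; the correlation check and the affine model below are assumptions for illustration, not the paper's exact procedure.

```python
# Sketch of pursuit-based calibration: the user follows a smoothly moving
# target and an affine mapping from raw gaze to screen coordinates is fitted
# by least squares. Threshold and model form are illustrative assumptions.
import numpy as np

def fit_affine_calibration(raw_gaze, target, min_corr=0.6):
    """raw_gaze, target: (N, 2) arrays of synchronised samples."""
    # Only calibrate if gaze actually pursued the target (per-axis correlation).
    corr_x = np.corrcoef(raw_gaze[:, 0], target[:, 0])[0, 1]
    corr_y = np.corrcoef(raw_gaze[:, 1], target[:, 1])[0, 1]
    if min(corr_x, corr_y) < min_corr:
        raise ValueError("gaze did not follow the target closely enough")
    # Solve target ≈ [raw_gaze, 1] @ A for a 3x2 affine matrix A.
    design = np.hstack([raw_gaze, np.ones((len(raw_gaze), 1))])
    A, *_ = np.linalg.lstsq(design, target, rcond=None)
    return A  # apply later as np.hstack([gaze, [1]]) @ A
```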
Task-embedded online eye-tracker calibration for improving robustness to head motion
Remote eye trackers are widely used for screen-based interactions. They are less intrusive than head mounted eye trackers, but are generally quite sensitive to head movement. This leads to the requirement for frequent recalibration, especially in ...
Reducing calibration drift in mobile eye trackers by exploiting mobile phone usage
Automatic saliency-based recalibration is promising for addressing calibration drift in mobile eye trackers, but existing bottom-up saliency methods neglect users' goal-directed visual attention in natural behaviour. By inspecting real-life recordings of ...
Aiming for the quiet eye in biathlon
The duration of the so-called "Quiet Eye" (QE) - the final fixation before the initiation of a critical movement - seems to be linked to better perceptual-motor performances in various domains. For instance, experts show longer QE durations when ...
Eye tracking support for visual analytics systems: foundations, current applications, and research challenges
Visual analytics (VA) research provides helpful solutions for interactive visual data analysis when exploring large and complex datasets. Due to recent advances in eye tracking technology, promising opportunities arise to extend these traditional VA ...
Space-time volume visualization of gaze and stimulus
We present a method for the spatio-temporal analysis of gaze data from multiple participants in the context of a video stimulus. For such data, an overview of the recorded patterns is important to identify common viewing behavior (such as attentional ...
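A minimal sketch of the underlying data structure, assuming gaze samples from all participants are binned into an (x, y, t) density volume with NumPy; the grid resolution and stimulus extent are placeholders.

```python
# Sketch of a space-time gaze volume: bin gaze samples into an (x, y, t)
# density grid that can then be volume-rendered or sliced for analysis.
import numpy as np

def gaze_volume(samples, width, height, duration, bins=(64, 36, 100)):
    """samples: (N, 3) array of (x_px, y_px, t_seconds) from all participants."""
    volume, _ = np.histogramdd(
        samples,
        bins=bins,
        range=[(0, width), (0, height), (0, duration)])
    return volume  # shape equals `bins`, e.g. (64, 36, 100)
```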
Using developer eye movements to externalize the mental model used in code summarization tasks
Eye movements of developers are used to infer the mental cognition model (i.e., bottom-up or top-down) applied during program comprehension tasks. The cognition models examine how programmers understand source code by describing the temporary ...
Visually analyzing eye movements on natural language texts and source code snippets
In this paper, we analyze eye movement data of 26 participants using a quantitative and qualitative approach to investigate how people read natural language text in comparison to source code. In particular, we use the radial transition graph ...
Classification of strategies for solving programming problems using AoI sequence analysis
This eye tracking study examines participants' visual attention when solving algorithmic problems in the form of programming problems. The stimuli consisted of a problem statement, example output, and a set of multiple-choice questions regarding ...
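One common building block for such analyses is a distance between AoI sequences; the Levenshtein-distance sketch below is an illustrative baseline, not necessarily the paper's strategy-classification method.

```python
# Illustrative sketch: compare scanpaths encoded as strings of AoI labels
# with Levenshtein (edit) distance, a common basis for grouping viewing
# strategies.
def edit_distance(a: str, b: str) -> int:
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, start=1):
        cur = [i]
        for j, cb in enumerate(b, start=1):
            cur.append(min(prev[j] + 1,                 # deletion
                           cur[j - 1] + 1,              # insertion
                           prev[j - 1] + (ca != cb)))   # substitution
        prev = cur
    return prev[-1]

# 'P' = problem statement, 'O' = example output, 'Q' = question AoI (assumed labels)
print(edit_distance("PPOQQP", "PQQOP"))
```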
Towards a low cost and high speed mobile eye tracker
Despite recent developments in eye tracking technology, mobile eye trackers (ETs) are still expensive devices limited to a few hundred samples per second. High-speed ETs (closer to 1 kHz) can provide improved flexibility for data filtering and more ...
Get a grip: slippage-robust and glint-free gaze estimation for real-time pervasive head-mounted eye tracking
A key assumption conventionally made by flexible head-mounted eye-tracking systems is often invalid: The eye center does not remain stationary w.r.t. the eye camera due to slippage. For instance, eye-tracker slippage might happen due to head ...
Getting (more) real: bringing eye movement classification to HMD experiments with equirectangular stimuli
The classification of eye movements is a very important part of eye tracking research and has been studied since its early days. Over recent years, we have experienced an increasing shift towards more immersive experimental scenarios with the use of eye-...
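One issue specific to equirectangular stimuli is that gaze velocity should be measured on the sphere rather than in pixel space, because pixel distances are stretched near the poles; the sketch below assumes gaze is given as longitude/latitude per sample and is illustrative only.

```python
# Sketch: angular gaze speed for 360° content, computed as the great-circle
# angle between consecutive gaze directions on the unit sphere.
import numpy as np

def angular_speed_deg_s(lon_deg, lat_deg, sample_rate_hz=120.0):
    """lon_deg, lat_deg: 1-D arrays of gaze direction per sample."""
    lon, lat = np.radians(lon_deg), np.radians(lat_deg)
    # Unit gaze vectors on the sphere.
    v = np.stack([np.cos(lat) * np.cos(lon),
                  np.cos(lat) * np.sin(lon),
                  np.sin(lat)], axis=1)
    # Great-circle angle between consecutive samples, scaled by sampling rate.
    dots = np.clip(np.sum(v[:-1] * v[1:], axis=1), -1.0, 1.0)
    return np.degrees(np.arccos(dots)) * sample_rate_hz
```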
Power-efficient and shift-robust eye-tracking sensor for portable VR headsets
Photosensor oculography (PSOG) is a promising solution for reducing the computational requirements of eye tracking sensors in wireless virtual and augmented reality platforms. This paper proposes a novel machine learning-based solution for addressing ...
Monocular gaze depth estimation using the vestibulo-ocular reflex
Gaze depth estimation presents a challenge for eye tracking in 3D. This work investigates a novel approach to the problem based on eye movement mediated by the vestibulo-ocular reflex (VOR). VOR stabilises gaze on a target during head movement, with eye ...
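A highly simplified geometric reading of this idea: for near targets the VOR gain exceeds one, and the excess shrinks with target distance, so depth can be read off the eye-to-head velocity ratio. The one-parameter model below (gain ≈ 1 + r / depth, with r the eye's offset from the head's rotation axis) is an illustrative assumption, not the paper's calibrated estimator.

```python
# Toy illustration of VOR-based depth estimation under a simple geometric model.
def depth_from_vor(eye_velocity_deg_s, head_velocity_deg_s, eye_offset_m=0.10):
    gain = abs(eye_velocity_deg_s) / abs(head_velocity_deg_s)
    if gain <= 1.0:
        return float("inf")   # gain near 1 is consistent with a very distant target
    return eye_offset_m / (gain - 1.0)

print(depth_from_vor(eye_velocity_deg_s=-33.0, head_velocity_deg_s=30.0))  # ~1 m
```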
Characterizing joint attention behavior during real world interactions using automated object and gaze detection
- Pranav Venuprasad,
- Tushal Dobhal,
- Anurag Paul,
- Tu N. M. Nguyen,
- Andrew Gilman,
- Pamela Cosman,
- Leanne Chukoskie
Joint attention is an essential part of the development process of children, and impairments in joint attention are considered one of the first symptoms of autism. In this paper, we develop a novel technique to characterize joint attention in real ...
A novel gaze event detection metric that is not fooled by gaze-independent baselines
Eye movement classification algorithms are typically evaluated either in isolation (in terms of absolute values of some performance statistic), or in comparison to previously introduced approaches. In contrast to this, we first introduce and thoroughly ...
A fast approach to refraction-aware eye-model fitting and gaze prediction
By temporally integrating information about pupil contours extracted from eye images, model-based methods for glint-free gaze estimation can mitigate pupil detection noise. However, current approaches require time-consuming iterative solving of a ...
Screen corner detection using polarization camera for cross-ratio based gaze estimation
Eye tracking, which measures line of sight, is expected to advance as an intuitive and rapid input method for user interfaces, and a cross-ratio based method that calculates the point-of-gaze using homography matrices has attracted attention because it ...
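The homography step at the heart of cross-ratio style estimation can be sketched with OpenCV: map the four detected corner reflections to the physical screen corners and project the pupil centre through the resulting homography. Corner detection (done with a polarization camera in the paper) and the bias-correction terms used by full cross-ratio methods are not shown.

```python
# Sketch of the homography mapping used by cross-ratio style gaze estimation.
import numpy as np
import cv2

def point_of_gaze(corner_reflections_px, screen_w, screen_h, pupil_center_px):
    """corner_reflections_px: (4, 2) eye-image coords ordered TL, TR, BR, BL."""
    screen_corners = np.array(
        [[0, 0], [screen_w, 0], [screen_w, screen_h], [0, screen_h]],
        dtype=np.float32)
    H, _ = cv2.findHomography(
        np.asarray(corner_reflections_px, dtype=np.float32), screen_corners)
    pupil = np.asarray(pupil_center_px, dtype=np.float32).reshape(1, 1, 2)
    return cv2.perspectiveTransform(pupil, H).reshape(2)
```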
Guiding gaze: expressive models of reading and face scanning
We evaluate subtle, emotionally-driven models of eye movement animation. Two models are tested, reading and face scanning, each based on recorded gaze transition probabilities. For reading, simulated emotional mood is governed by the probability density ...
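A minimal sketch of driving gaze animation from recorded transition probabilities: sample a fixation sequence over face regions from a Markov transition matrix. The probabilities below are placeholders; in the paper they come from recordings and are modulated by simulated mood.

```python
# Sample a synthetic face-scanning sequence from an AoI transition matrix.
import numpy as np

REGIONS = ["left eye", "right eye", "nose", "mouth"]
P = np.array([  # P[i, j] = probability of fixating REGIONS[j] after REGIONS[i]
    [0.10, 0.50, 0.25, 0.15],
    [0.50, 0.10, 0.25, 0.15],
    [0.30, 0.30, 0.10, 0.30],
    [0.25, 0.25, 0.40, 0.10],
])

rng = np.random.default_rng(0)
state = 0
scanpath = [REGIONS[state]]
for _ in range(10):
    state = rng.choice(len(REGIONS), p=P[state])
    scanpath.append(REGIONS[state])
print(" -> ".join(scanpath))
```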
PrivacEye: privacy-preserving head-mounted eye tracking using egocentric scene image and eye movement features
Eyewear devices, such as augmented reality displays, increasingly integrate eye tracking, but the first-person camera required to map a user's gaze to the visual scene can pose a significant threat to user and bystander privacy. We present PrivacEye, a ...
Privacy-aware eye tracking using differential privacy
With eye tracking being increasingly integrated into virtual and augmented reality (VR/AR) head-mounted displays, preserving users' privacy is an ever more important, yet under-explored, topic in the eye tracking community. We report a large-scale ...
Differential privacy for eye-tracking data
As large eye-tracking datasets are created, data privacy is a pressing concern for the eye-tracking community. De-identifying data does not guarantee privacy because multiple datasets can be linked for inferences. A common belief is that aggregating ...
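A minimal sketch of one standard mechanism in this space, the Laplace mechanism applied to an aggregated fixation-count heatmap; the per-participant contribution bound used as the sensitivity is an illustrative assumption, not the paper's analysis.

```python
# Laplace mechanism over an aggregated fixation heatmap. Assumes each
# participant contributes at most MAX_FIXATIONS counts in total, which bounds
# the L1 sensitivity of the heatmap query.
import numpy as np

MAX_FIXATIONS = 200   # assumed per-participant contribution bound

def private_heatmap(heatmap_counts: np.ndarray, epsilon: float) -> np.ndarray:
    scale = MAX_FIXATIONS / epsilon          # Laplace scale = sensitivity / epsilon
    noise = np.random.laplace(loc=0.0, scale=scale, size=heatmap_counts.shape)
    return np.clip(heatmap_counts + noise, 0, None)   # clip negatives for display only
```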
Just gaze and wave: exploring the use of gaze and gestures for shoulder-surfing resilient authentication
Eye-gaze and mid-air gestures are promising for resisting various types of side-channel attacks during authentication. However, to date, a comparison of the different authentication modalities is missing. We investigate multiple authentication ...
Assessing surgeons' skill level in laparoscopic cholecystectomy using eye metrics
- Nishan Gunawardena,
- Michael Matscheko,
- Bernhard Anzengruber,
- Alois Ferscha,
- Martin Schobesberger,
- Andreas Shamiyeh,
- Bettina Klugsberger,
- Peter Solleder
Laparoscopic surgery has revolutionised the state of the art in surgical health care. However, its complexity puts a significant burden on the surgeon's cognitive resources, resulting in major biliary injuries. With the increasing number of laparoscopic ...