DOI: 10.1145/3448017.3457386
ETRA Conference Proceedings · Research Article · Public Access

Crossed Eyes: Domain Adaptation for Gaze-Based Mind Wandering Models

Published: 25 May 2021

Abstract

The effectiveness of user interfaces is limited by the tendency of the human mind to wander. Intelligent user interfaces can combat this by detecting when mind wandering occurs and attempting to regain user attention through a variety of intervention strategies. However, collecting data to build mind wandering detection models can be expensive, especially considering the variety of media available and potential differences in mind wandering across them. We explored the possibility of using eye gaze to build cross-domain models of mind wandering, where models trained on data from users in one domain are applied to different users in another domain. We built supervised classification models using a dataset of 132 users whose mind wandering reports were collected in response to thought probes while they completed tasks from seven different domains for six minutes each (five domains are investigated here: Illustrated Text, Narrative Film, Video Lecture, Naturalistic Scene, and Reading Text). We used global eye gaze features to build within- and cross-domain models using 5-fold user-independent cross-validation. The best-performing within-domain models yielded AUROCs ranging from .57 to .72, which were comparable to those of the cross-domain models (AUROCs of .56 to .68). Models built from coarse-grained locality features capturing the spatial distribution of gaze resulted in slightly better transfer on average (transfer ratios of .61 vs. .54 for global models) due to improved performance in certain domains. Instance-based and feature-level domain adaptation did not result in any improvements in transfer. We found that seven gaze features likely contributed to transfer, as they were among the top ten features for at least four domains. Our results indicate that gaze features are suitable for domain adaptation between similar domains, but more research is needed to improve domain adaptation between more dissimilar domains.
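The evaluation described above (user-independent 5-fold cross-validation within a domain, training on one domain and testing on another, AUROC as the metric, and a transfer ratio comparing the two) can be sketched as follows. This is a minimal illustration, not the authors' actual pipeline: scikit-learn, the RandomForestClassifier, the file name gaze_features.csv, and the column names user_id, domain, and mind_wandering are all assumptions, and the transfer-ratio definition shown is only one plausible reading of the abstract.

```python
# Hedged sketch of within- vs. cross-domain evaluation for gaze-based
# mind wandering detection. Classifier, file name, and column names are
# illustrative assumptions, not the authors' actual choices.
import numpy as np
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import GroupKFold

data = pd.read_csv("gaze_features.csv")  # hypothetical per-window feature table
feature_cols = [c for c in data.columns
                if c not in ("user_id", "domain", "mind_wandering")]


def within_domain_auroc(df, domain, n_splits=5):
    """User-independent 5-fold CV inside a single domain."""
    sub = df[df["domain"] == domain]
    X = sub[feature_cols].to_numpy()
    y = sub["mind_wandering"].to_numpy()
    groups = sub["user_id"].to_numpy()  # folds split by user, never by instance
    aurocs = []
    for train_idx, test_idx in GroupKFold(n_splits=n_splits).split(X, y, groups):
        clf = RandomForestClassifier(n_estimators=200, random_state=0)
        clf.fit(X[train_idx], y[train_idx])
        aurocs.append(roc_auc_score(y[test_idx],
                                    clf.predict_proba(X[test_idx])[:, 1]))
    return float(np.mean(aurocs))


def cross_domain_auroc(df, source, target):
    """Train on one domain, test on another.

    Because the same users completed every task, a fully user-independent
    variant would also exclude the target users from the source training data.
    """
    train, test = df[df["domain"] == source], df[df["domain"] == target]
    clf = RandomForestClassifier(n_estimators=200, random_state=0)
    clf.fit(train[feature_cols], train["mind_wandering"])
    return roc_auc_score(test["mind_wandering"],
                         clf.predict_proba(test[feature_cols])[:, 1])


def transfer_ratio(df, source, target):
    """One plausible reading of 'transfer ratio': the fraction of within-domain
    skill above chance (AUROC = .5) that is retained after transfer."""
    within = within_domain_auroc(df, target)
    cross = cross_domain_auroc(df, source, target)
    return (cross - 0.5) / (within - 0.5)
```

The abstract also reports that instance-based domain adaptation did not improve transfer. One common instance-based approach, shown here purely as a generic illustration rather than as the authors' method, is importance weighting: source-domain training examples are reweighted by an estimated density ratio p_target(x)/p_source(x), approximated with a logistic-regression domain classifier.

```python
# Generic instance-based domain adaptation via importance weighting;
# an illustrative assumption, not the specific method evaluated in the paper.
import numpy as np
from sklearn.linear_model import LogisticRegression


def importance_weights(X_source, X_target):
    """Estimate p_target(x) / p_source(x) with a domain classifier."""
    X = np.vstack([X_source, X_target])
    d = np.concatenate([np.zeros(len(X_source)), np.ones(len(X_target))])
    domain_clf = LogisticRegression(max_iter=1000).fit(X, d)
    p_target = np.clip(domain_clf.predict_proba(X_source)[:, 1], 1e-6, 1 - 1e-6)
    return p_target / (1.0 - p_target)  # odds that a source example "looks like" target


# Usage: weight the source examples when fitting the mind wandering classifier.
# weights = importance_weights(X_source, X_target_unlabeled)
# clf.fit(X_source, y_source, sample_weight=weights)
```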




Published In

ETRA '21 Full Papers: ACM Symposium on Eye Tracking Research and Applications
May 2021
122 pages
ISBN: 9781450383448
DOI: 10.1145/3448017

Publisher

Association for Computing Machinery

New York, NY, United States



Author Tags

  1. Domain Adaptation
  2. Eye Movements
  3. Mind Wandering
  4. User Modeling

Qualifiers

  • Research-article
  • Research
  • Refereed limited

Conference

ETRA '21

Acceptance Rates

Overall Acceptance Rate 69 of 137 submissions, 50%

Article Metrics

  • Downloads (Last 12 months): 184
  • Downloads (Last 6 weeks): 45
Reflects downloads up to 12 Sep 2024


Cited By

  • (2024) Uncovering and Addressing Blink-Related Challenges in Using Eye Tracking for Interactive Systems. Proceedings of the CHI Conference on Human Factors in Computing Systems, 1–23. https://doi.org/10.1145/3613904.3642086. Online publication date: 11-May-2024.
  • (2024) From the Lab to the Wild: Examining Generalizability of Video-based Mind Wandering Detection. International Journal of Artificial Intelligence in Education. https://doi.org/10.1007/s40593-024-00412-2. Online publication date: 17-Jun-2024.
  • (2023) Towards Automatic Detection of Participant Attention in Virtual Meetings. 2023 Congress in Computer Science, Computer Engineering, & Applied Computing (CSCE), 2731–2733. https://doi.org/10.1109/CSCE60160.2023.00445. Online publication date: 24-Jul-2023.
  • (2023) A social robot as your reading companion: exploring the relationships between gaze patterns and knowledge gains. Journal on Multimodal User Interfaces, 18(1), 21–41. https://doi.org/10.1007/s12193-023-00418-5. Online publication date: 12-Oct-2023.
  • (2022) Predicting Mind-Wandering with Facial Videos in Online Lectures. 2022 IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops (CVPRW), 2103–2112. https://doi.org/10.1109/CVPRW56347.2022.00228. Online publication date: Jun-2022.
  • (2022) The adaptive significance of human scleral brightness: an experimental study. Scientific Reports, 12(1). https://doi.org/10.1038/s41598-022-24403-2. Online publication date: 24-Nov-2022.
