DOI: 10.1145/2370216.2370368

Detecting eye contact using wearable eye-tracking glasses

Published: 05 September 2012

Abstract

We describe a system for detecting moments of eye contact between an adult and a child, based on a single pair of gaze-tracking glasses which are worn by the adult. Our method utilizes commercial gaze tracking technology to determine the adult's point of gaze, and combines this with computer vision analysis of video of the child's face to determine their gaze direction. Eye contact is then detected as the event of simultaneous, mutual looking at faces by the dyad. We report encouraging findings from an initial implementation and evaluation of this approach.
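
As a rough illustration of the decision rule described above, the sketch below (Python, written for this summary and not taken from the paper) treats each video frame as carrying three assumed inputs: the adult's point of gaze in scene-camera pixel coordinates from the gaze-tracking glasses, the child's face region detected in the same scene video, and a binary estimate from some child gaze-direction classifier of whether the child is looking at the adult. All names here (Frame, gaze_on_face, child_looks_at_adult) are hypothetical.

from dataclasses import dataclass
from typing import Optional, Tuple

Box = Tuple[float, float, float, float]  # (x1, y1, x2, y2) in scene-camera pixels

@dataclass
class Frame:
    adult_gaze: Tuple[float, float]   # adult's point of gaze reported by the glasses (assumed coordinates)
    child_face: Optional[Box]         # child's face detected in the scene video; None if no face was found
    child_looks_at_adult: bool        # assumed output of a child gaze-direction classifier

def gaze_on_face(gaze: Tuple[float, float], face: Box) -> bool:
    # True when the adult's point of gaze falls inside the child's face region.
    x, y = gaze
    x1, y1, x2, y2 = face
    return x1 <= x <= x2 and y1 <= y <= y2

def eye_contact(frame: Frame) -> bool:
    # Mutual, simultaneous looking: the adult's gaze is on the child's face and,
    # in the same frame, the child is judged to be looking back at the adult.
    if frame.child_face is None:
        return False
    return gaze_on_face(frame.adult_gaze, frame.child_face) and frame.child_looks_at_adult

In practice such a per-frame signal would presumably be smoothed over time (for example, requiring the condition to hold for several consecutive frames) before a moment of eye contact is reported, but the abstract does not specify that step.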

    Published In

    cover image ACM Conferences
    UbiComp '12: Proceedings of the 2012 ACM Conference on Ubiquitous Computing
    September 2012
    1268 pages
    ISBN:9781450312240
    DOI:10.1145/2370216
    Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected]

    Publisher

    Association for Computing Machinery

    New York, NY, United States

    Publication History

    Published: 05 September 2012

    Author Tags

    1. attention
    2. autism
    3. developmental disorders
    4. eye contact
    5. wearable eye-tracking

    Qualifiers

    • Research-article

    Conference

    UbiComp '12
    UbiComp '12: The 2012 ACM Conference on Ubiquitous Computing
    September 5 - 8, 2012
    Pittsburgh, Pennsylvania

    Acceptance Rates

    UbiComp '12 Paper Acceptance Rate: 58 of 301 submissions, 19%
    Overall Acceptance Rate: 764 of 2,912 submissions, 26%
