DOI: 10.1145/3208159.3208180

Real-Time Eye-Gaze Based Interaction for Human Intention Prediction and Emotion Analysis

Published: 11 June 2018

Abstract

The motion state of the human eyes and the content they attend to can express a person's cognitive and emotional status in a given situation. When observing the surroundings, the human eyes make different movements according to the objects being observed, and these movements reflect a person's attention and interest. In this paper, we capture and analyze patterns of human eye-gaze behavior and head motion and classify them into different categories. We then compute and train an eye-object movement attention model and an eye-object feature preference model on different people's eye-gaze behaviors using machine learning algorithms. These models are used to predict a person's object of interest and interaction intention according to their real-time situation. Furthermore, the eye-gaze behavior and head motion patterns can serve as a modality of non-verbal information when computing human emotional states with the PAD affective computing model. Our methodology analyzes human emotion and cognitive status from eye-gaze behavior and head motion, interprets the cognitive information that the human eyes can express, and effectively improves the efficiency of human-computer interaction in different circumstances.
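
To make the described pipeline concrete, the following minimal Python sketch (not the authors' implementation) illustrates the two stages the abstract outlines: a machine-learning classifier that predicts the gazed object of interest from gaze and head-motion features, and a lookup that maps a recognized gaze-behavior pattern to pleasure-arousal-dominance (PAD) coordinates. The feature set, training values, classifier choice, and PAD entries are illustrative assumptions, not the paper's trained models.

    import numpy as np
    from sklearn.ensemble import RandomForestClassifier

    # One row per gaze/head sample: [fixation_duration_ms, saccade_amplitude_deg,
    # revisit_count, head_yaw_deg, head_pitch_deg]; labels index the gazed object.
    # All values are made up for illustration.
    X_train = np.array([
        [420, 2.1, 3,  5.0, -2.0],
        [150, 8.4, 0, 30.0,  1.0],
        [510, 1.5, 5,  2.0,  0.5],
        [180, 7.9, 1, 25.0, -1.5],
    ])
    y_train = np.array([0, 1, 0, 1])  # 0 = object A, 1 = object B

    clf = RandomForestClassifier(n_estimators=50, random_state=0)
    clf.fit(X_train, y_train)

    # Hypothetical PAD (pleasure, arousal, dominance) coordinates per gaze-behavior
    # pattern, each component in [-1, 1] as in Mehrabian's PAD temperament model.
    PAD_BY_PATTERN = {
        "long_fixation":  ( 0.4, -0.2,  0.3),  # calm, focused interest
        "rapid_scanning": (-0.1,  0.6, -0.2),  # aroused, searching
    }

    def predict_intention_and_emotion(sample, pattern):
        """Return (predicted object id, PAD tuple) for one gaze/head sample."""
        obj = int(clf.predict(np.asarray(sample).reshape(1, -1))[0])
        return obj, PAD_BY_PATTERN[pattern]

    print(predict_intention_and_emotion([450, 1.8, 4, 3.0, 0.0], "long_fixation"))

A real system would replace the toy feature vectors with fixation and saccade statistics streamed from an eye tracker and a head-pose estimator.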

    Information

    Published In

    CGI 2018: Proceedings of Computer Graphics International 2018
    June 2018
    284 pages
    ISBN:9781450364010
    DOI:10.1145/3208159
    Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from permissions@acm.org.

    Publisher

    Association for Computing Machinery

    New York, NY, United States

    Publication History

    Published: 11 June 2018

    Author Tags

    1. Eye-gaze interaction
    2. machine learning
    3. robot and vision

    Qualifiers

    • Research-article
    • Research
    • Refereed limited

    Conference

    CGI 2018
    CGI 2018: Computer Graphics International 2018
    June 11 - 14, 2018
    Bintan Island, Indonesia

    Acceptance Rates

    CGI 2018 Paper Acceptance Rate 35 of 159 submissions, 22%;
    Overall Acceptance Rate 35 of 159 submissions, 22%

    Bibliometrics & Citations

    Article Metrics

    • Downloads (last 12 months): 124
    • Downloads (last 6 weeks): 14
    Reflects downloads up to 04 Jan 2025

    Citations

    Cited By

    • (2024) Machine Learning Techniques for Emotion Detection Using Eye Gaze Localisation. In: Machine and Deep Learning Techniques for Emotion Detection, pp. 24-60. DOI: 10.4018/979-8-3693-4143-8.ch002. Online publication date: 14-May-2024.
    • (2024) Methods for Detecting the Patient’s Pupils’ Coordinates and Head Rotation Angle for the Video Head Impulse Test (vHIT), Applicable for the Diagnosis of Vestibular Neuritis and Pre-Stroke Conditions. Computation 12(8):167. DOI: 10.3390/computation12080167. Online publication date: 18-Aug-2024.
    • (2024) Advancements in Gaze Coordinate Prediction Using Deep Learning: A Novel Ensemble Loss Approach. Applied Sciences 14(12):5334. DOI: 10.3390/app14125334. Online publication date: 20-Jun-2024.
    • (2024) Embedding Eye Gaze and Emotions into Tangible Social Media for Transforming Vandalism into Arts. 2024 International Telecommunications Conference (ITC-Egypt), pp. 836-841. DOI: 10.1109/ITC-Egypt61547.2024.10620597. Online publication date: 22-Jul-2024.
    • (2024) Physiological Data for User Experience and Quality of Experience: A Systematic Review (2018–2022). International Journal of Human–Computer Interaction, pp. 1-30. DOI: 10.1080/10447318.2024.2311972. Online publication date: 13-Feb-2024.
    • (2023) Gaze Estimation Based on Convolutional Structure and Sliding Window-Based Attention Mechanism. Sensors 23(13):6226. DOI: 10.3390/s23136226. Online publication date: 7-Jul-2023.
    • (2023) Evaluating User Interactions in Wearable Extended Reality: Modeling, Online Remote Survey, and In-Lab Experimental Methods. IEEE Access 11:77856-77872. DOI: 10.1109/ACCESS.2023.3298598. Online publication date: 2023.
    • (2023) Boosted Gaze Gesture Recognition Using Underlying Head Orientation Sequence. IEEE Access 11:43675-43689. DOI: 10.1109/ACCESS.2023.3270285. Online publication date: 2023.
    • (2023) iBEHAVE: Behaviour Analysis Using Eye Gaze Metrices. Pattern Recognition and Machine Intelligence, pp. 260-269. DOI: 10.1007/978-3-031-45170-6_27. Online publication date: 4-Dec-2023.
    • (2022) Using Gaze-based Interaction to Alleviate Situational Mobility Impairment in Extended Reality. Proceedings of the Human Factors and Ergonomics Society Annual Meeting 66(1):435-439. DOI: 10.1177/1071181322661224. Online publication date: 27-Oct-2022.
