DOI: 10.1145/3508546.3508645

Bayesian Intention Inference Based on Human Visual Signal

Published: 25 February 2022
    Abstract

    In robot-assisted technology for people with disabilities, inferring a user's latent intentions from eye movements is challenging. This paper proposes a framework that captures the user's eye-movement characteristics, infers their implicit intentions, and enables interaction between humans and assistive robots. The framework uses the I-DT algorithm to extract intentional gaze points and the DBSCAN algorithm to cluster them, identifying the subject's areas of interest and the objects within those areas, thereby building a small intent knowledge base. With this knowledge base as prior information, a Naive Bayes model infers the intention behind an eye-movement event when the intention is unknown. Empirical results from 9 subjects using the Pupil Core eye tracker show that the framework's accuracy reaches 86.7%, indicating that the proposed eye-movement event recognition and intention inference have a degree of validity and accuracy.
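The I-DT fixation extraction mentioned in the abstract can be sketched as a dispersion-threshold filter over raw gaze samples. This is a minimal illustration, not the paper's implementation: the function names, threshold values, and sample format are assumptions.

```python
def dispersion(window):
    """Spatial dispersion of a window of (x, y) gaze samples:
    (max_x - min_x) + (max_y - min_y), as in the standard I-DT formulation."""
    xs = [p[0] for p in window]
    ys = [p[1] for p in window]
    return (max(xs) - min(xs)) + (max(ys) - min(ys))

def idt_fixations(points, dispersion_max=30.0, min_samples=6):
    """Return fixation centroids from a list of (x, y) gaze samples
    recorded at a fixed rate. A window of at least min_samples samples
    whose dispersion stays below dispersion_max counts as a fixation;
    the window is grown until the threshold is exceeded."""
    fixations = []
    i, n = 0, len(points)
    while i + min_samples <= n:
        j = i + min_samples
        if dispersion(points[i:j]) <= dispersion_max:
            # Grow the window while dispersion stays under the threshold.
            while j < n and dispersion(points[i:j + 1]) <= dispersion_max:
                j += 1
            xs = [p[0] for p in points[i:j]]
            ys = [p[1] for p in points[i:j]]
            fixations.append((sum(xs) / len(xs), sum(ys) / len(ys)))
            i = j  # skip past the fixation
        else:
            i += 1  # slide past a saccade sample
    return fixations
```

On synthetic data — a tight cluster of samples, a saccade, then a second cluster — this yields one centroid per cluster, which is the "intentional gaze point" input the clustering step would then consume.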

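The Naive Bayes inference step described in the abstract can be sketched as a classifier over discrete gaze features. This is a hypothetical illustration under assumed inputs — area-of-interest labels per fixation as features, with the intent knowledge base supplying the training samples — not the authors' implementation:

```python
import math
from collections import defaultdict

class NaiveBayesIntent:
    """Laplace-smoothed Naive Bayes over discrete features
    (e.g. area-of-interest labels visited during an eye-movement event)."""

    def __init__(self):
        self.intent_counts = defaultdict(int)
        self.feature_counts = defaultdict(lambda: defaultdict(int))
        self.vocab = set()

    def fit(self, samples):
        # samples: iterable of (features, intent), features a list of AOI labels
        for features, intent in samples:
            self.intent_counts[intent] += 1
            for f in features:
                self.feature_counts[intent][f] += 1
                self.vocab.add(f)

    def predict(self, features):
        total = sum(self.intent_counts.values())
        best, best_lp = None, float("-inf")
        for intent, count in self.intent_counts.items():
            lp = math.log(count / total)  # log prior from the knowledge base
            denom = sum(self.feature_counts[intent].values()) + len(self.vocab)
            for f in features:
                # add-one smoothing so unseen features don't zero the posterior
                lp += math.log((self.feature_counts[intent][f] + 1) / denom)
            if lp > best_lp:
                best, best_lp = intent, lp
        return best
```

For example, after fitting on a few labeled gaze sequences, a new sequence of AOI labels is assigned the intent with the highest posterior log-probability.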


    Published In

    ACAI '21: Proceedings of the 2021 4th International Conference on Algorithms, Computing and Artificial Intelligence
    December 2021
    699 pages
    ISBN:9781450385053
    DOI:10.1145/3508546
    Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected]

    Publisher

    Association for Computing Machinery

    New York, NY, United States



    Author Tags

    1. Gaze
    2. Human-robot interaction
    3. Intention inference

    Qualifiers

    • Research-article
    • Research
    • Refereed limited

    Conference

    ACAI'21

    Acceptance Rates

    Overall Acceptance Rate 173 of 395 submissions, 44%
