DOI: 10.1145/2647868.2655063
Poster

Human Computer Interface for Quadriplegic People Based on Face Position/gesture Detection

Published: 03 November 2014

Abstract

This paper proposes a human-computer interface for quadriplegic people that uses a single depth camera. The nose position is used to control the cursor, and the mouth status provides the commands. Both the nose position and the mouth status are detected with a randomized decision tree algorithm. Experimental results show that the proposed interface is comfortable, easy to use, and robust, and that it outperforms existing assistive technology.
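
The control scheme described above (nose position drives the cursor, mouth status issues the commands, both detected from depth frames with a randomized decision tree) implies a simple per-frame control loop. As a hedged illustration only, the Python sketch below maps detector output onto cursor motion and clicks using a velocity-style mapping with a dead zone and a mouth-close "click"; the class names, the relative mapping, the gain and dead-zone values, and the screen-size assumption are hypothetical and are not taken from the paper.

```python
# Hypothetical sketch (not the paper's implementation): consuming per-frame
# nose position and mouth status from a depth-based detector and turning
# them into cursor motion and click events.
from dataclasses import dataclass


@dataclass
class NoseMouthFrame:
    """One frame of detector output (illustrative field names)."""
    nose_x: float      # nose position in depth-image pixel coordinates
    nose_y: float
    mouth_open: bool   # mouth status reported by the classifier


class CursorController:
    """Velocity-style mapping: the nose offset from a calibrated neutral
    position drives cursor velocity, and an open-to-closed mouth transition
    is treated as a click. All constants are assumptions for illustration."""

    def __init__(self, neutral_x: float, neutral_y: float,
                 gain: float = 0.8, dead_zone: float = 5.0):
        self.neutral = (neutral_x, neutral_y)
        self.gain = gain            # cursor pixels per pixel of nose offset
        self.dead_zone = dead_zone  # ignore small involuntary head motion
        self.cursor = [640.0, 360.0]  # start at an assumed 1280x720 screen centre
        self._mouth_was_open = False

    def update(self, frame: NoseMouthFrame) -> tuple[tuple[int, int], bool]:
        """Advance one frame; return the new cursor position and a click flag."""
        dx = frame.nose_x - self.neutral[0]
        dy = frame.nose_y - self.neutral[1]
        if abs(dx) > self.dead_zone:
            self.cursor[0] += self.gain * dx
        if abs(dy) > self.dead_zone:
            self.cursor[1] += self.gain * dy
        # Register a click on the transition from open to closed mouth.
        click = self._mouth_was_open and not frame.mouth_open
        self._mouth_was_open = frame.mouth_open
        return (int(self.cursor[0]), int(self.cursor[1])), click


if __name__ == "__main__":
    ctrl = CursorController(neutral_x=320, neutral_y=240)
    # Two synthetic frames: head turned right with the mouth open, then the
    # mouth closes, which this sketch interprets as a click.
    for f in (NoseMouthFrame(360, 240, True), NoseMouthFrame(360, 240, False)):
        pos, click = ctrl.update(f)
        print(pos, "click" if click else "")
```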


Cited By

  • (2021) Nadine the Social Robot: Three Case Studies in Everyday Life. Social Robotics, pp. 107-116. DOI: 10.1007/978-3-030-90525-5_10. Online publication date: 2 November 2021.
  • (2016) Facial Position and Expression-Based Human–Computer Interface for Persons With Tetraplegia. IEEE Journal of Biomedical and Health Informatics, 20(3), pp. 915-924. DOI: 10.1109/JBHI.2015.2412125. Online publication date: May 2016.


    Published In

    MM '14: Proceedings of the 22nd ACM international conference on Multimedia
    November 2014
    1310 pages
    ISBN:9781450330633
    DOI:10.1145/2647868
    Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from permissions@acm.org.


    Publisher

    Association for Computing Machinery

    New York, NY, United States

    Publication History

    Published: 03 November 2014


    Author Tags

    1. assistive technology
    2. computer access
    3. hands-free interface
    4. human-computer interaction (hci)
    5. quadriplegic
    6. severe disabilities
    7. vision-based

    Qualifiers

    • Poster


    Conference

    MM '14
    Sponsor:
    MM '14: 2014 ACM Multimedia Conference
    November 3-7, 2014
    Orlando, Florida, USA

    Acceptance Rates

    MM '14 paper acceptance rate: 55 of 286 submissions (19%)
    Overall acceptance rate: 2,145 of 8,556 submissions (25%)


    Article Metrics

    • Downloads (last 12 months): 3
    • Downloads (last 6 weeks): 0
    Reflects downloads up to 25 Jan 2025

