DOI: 10.1145/3316782.3322774 · PETRA Conference Proceedings · Research article

Feasibility analysis of sensor modalities to control a robot with eye and head movements for assistive tasks

Published: 05 June 2019

Abstract

Assistive robotics offers people with severe motor disabilities (e.g., tetraplegia) a way to perform everyday tasks without help. In this work, new sensor modalities for controlling a robot system are investigated to give people with tetraplegia more autonomy in everyday life. Several modalities for capturing user-related information are tested and compared. Five sensor modalities can be used to control a robot hands-free: electrooculography, video-based eye tracking, MARG sensors, video-based head tracking, and electromyography of the posterior auricular muscle. It is proposed to use head movements for continuous control and eye movements for discrete event control. The tests show that MARG sensors are the most reliable for tracking head movements, and eye-tracking glasses for capturing eye movements.
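The proposed split between continuous head control and discrete eye-event control can be sketched as follows. This is an illustrative assumption, not the authors' implementation: the pose fields, deadzone, gain, mode names, and blink-based event are all hypothetical choices made for the example.

```python
import math
from dataclasses import dataclass

@dataclass
class HeadPose:
    """Head orientation, e.g. as estimated by a MARG sensor (angles in radians)."""
    pitch: float  # nodding up/down
    yaw: float    # turning left/right

# Illustrative tuning constants (assumptions, not from the paper):
DEADZONE = math.radians(5)  # ignore small involuntary head motion
GAIN = 0.5                  # scales head angle to a velocity command

def continuous_command(pose: HeadPose) -> tuple[float, float]:
    """Map head pitch/yaw to a (vx, vy) velocity command with a deadzone."""
    def axis(angle: float) -> float:
        if abs(angle) < DEADZONE:
            return 0.0
        # Shift past the deadzone so the command ramps up smoothly from zero.
        return GAIN * (angle - math.copysign(DEADZONE, angle))
    return axis(pose.pitch), axis(pose.yaw)

MODES = ["translate", "rotate", "gripper"]

def discrete_event(mode_index: int, eye_event: bool) -> int:
    """A deliberate eye event (e.g. a long blink) cycles the control mode."""
    return (mode_index + 1) % len(MODES) if eye_event else mode_index
```

In this sketch the head continuously steers the robot while a single detected eye gesture switches which degrees of freedom the head commands, which is one plausible way to realize the continuous/discrete division the abstract describes.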


Cited By

  • (2024) From Imagination to Innovation: Using Participatory Design Fiction to Envision the Future of Accessible Gaming Wearables for Players with Upper Limb Motor Disabilities. Proceedings of the ACM on Human-Computer Interaction 8 (CHI PLAY), 1-30. DOI: 10.1145/3677073. Online publication date: 15 Oct 2024.
  • (2021) Learning to Map Degrees of Freedom for Assistive User Control. Proceedings of the 14th PErvasive Technologies Related to Assistive Environments Conference, 132-139. DOI: 10.1145/3453892.3453895. Online publication date: 29 Jun 2021.
  • (2021) Wearable Interactions for Users with Motor Impairments: Systematic Review, Inventory, and Research Implications. Proceedings of the 23rd International ACM SIGACCESS Conference on Computers and Accessibility, 1-15. DOI: 10.1145/3441852.3471212. Online publication date: 17 Oct 2021.

    Published In

    PETRA '19: Proceedings of the 12th ACM International Conference on PErvasive Technologies Related to Assistive Environments
    June 2019
    655 pages
    ISBN:9781450362320
    DOI:10.1145/3316782

    Publisher

    Association for Computing Machinery

    New York, NY, United States


    Author Tags

    1. HRI
    2. MARG sensor
    3. assistive robotics
    4. eye tracker
    5. head tracker
    6. human robot interaction
    7. sensor modalities

    Qualifiers

    • Research-article

    Funding Sources

    • German Federal Ministry of Education and Research (BMBF)

    Conference

    PETRA '19
