DOI: 10.1145/3371382.3377434
Public Access

Eye Gaze for Assistive Manipulation

Published: 01 April 2020

Abstract

A key challenge of human-robot collaboration is to build systems that balance the usefulness of autonomous robot behaviors with the benefits of direct human control. This balance is especially relevant for assistive manipulation systems, which promise to help people with disabilities more easily control wheelchair-mounted robot arms to accomplish activities of daily living. To provide useful assistance, robots must understand the user's goals and preferences for the task. Our insight is that systems can enhance this understanding by monitoring the user's natural eye gaze behavior, as psychology research has shown that eye gaze is responsive and relevant to the task. In this work, we show how using gaze enhances assistance algorithms. First, we analyze eye gaze behavior during teleoperated robot manipulation and compare it to literature results on by-hand manipulation. Then, we develop a pipeline for combining the raw eye gaze signal with the task context to build a rich signal for learning algorithms. Finally, we propose a novel use of eye gaze in which the robot avoids risky behavior by detecting when the user believes that the robot's behavior has a problem.
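The gaze-processing pipeline is described here only at a high level. As a purely illustrative sketch (not the authors' implementation), combining the raw gaze signal with task context in this way can be approximated by assigning each raw gaze sample to the nearest known task object within a distance threshold; the object names, positions, function name, and threshold below are all hypothetical.

```python
import math

def label_gaze_points(gaze_points, objects, max_dist=0.05):
    """Assign each raw 2D gaze point the label of the nearest object,
    or None when no object lies within max_dist (hypothetical threshold)."""
    labels = []
    for gx, gy in gaze_points:
        best, best_d = None, max_dist
        for name, (ox, oy) in objects.items():
            d = math.hypot(gx - ox, gy - oy)
            if d < best_d:
                best, best_d = name, d
        labels.append(best)
    return labels

# Hypothetical scene: two objects and three gaze samples.
objects = {"cup": (0.2, 0.3), "plate": (0.7, 0.5)}
gaze = [(0.21, 0.29), (0.69, 0.52), (0.95, 0.95)]
print(label_gaze_points(gaze, objects))  # ['cup', 'plate', None]
```

The resulting semantic label stream (rather than raw coordinates) is the kind of task-contextualized signal a learning algorithm could consume.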




Published In

HRI '20: Companion of the 2020 ACM/IEEE International Conference on Human-Robot Interaction
March 2020
702 pages
ISBN:9781450370578
DOI:10.1145/3371382
Permission to make digital or hard copies of part or all of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for third-party components of this work must be honored. For all other uses, contact the Owner/Author.


Publisher

Association for Computing Machinery

New York, NY, United States


Author Tags

  1. eye gaze
  2. human-robot collaboration
  3. human-robot interaction
  4. physical assistance
  5. shared control

Qualifiers

  • Abstract


Conference

HRI '20

Acceptance Rates

Overall Acceptance Rate 192 of 519 submissions, 37%

Article Metrics

  • Downloads (last 12 months): 203
  • Downloads (last 6 weeks): 27
Reflects downloads up to 15 Oct 2024


Cited By

  • (2023) Universal Design of Gaze Interactive Applications for People with Special Needs. Proceedings of the 2023 Symposium on Eye Tracking Research and Applications, 1-7. https://doi.org/10.1145/3588015.3589666. Online: 30 May 2023.
  • (2023) Stargazer: An Interactive Camera Robot for Capturing How-To Videos Based on Subtle Instructor Cues. Proceedings of the 2023 CHI Conference on Human Factors in Computing Systems, 1-16. https://doi.org/10.1145/3544548.3580896. Online: 19 Apr 2023.
  • (2023) Characterizing Eye Gaze for Assistive Device Control. 2023 International Conference on Rehabilitation Robotics (ICORR), 1-6. https://doi.org/10.1109/ICORR58425.2023.10304812. Online: 24 Sep 2023.
  • (2023) Specifying Target Objects in Robot Teleoperation Using Speech and Natural Eye Gaze. 2023 IEEE-RAS 22nd International Conference on Humanoid Robots (Humanoids), 1-7. https://doi.org/10.1109/Humanoids57100.2023.10375186. Online: 12 Dec 2023.
  • (2023) Interface using eye-gaze and tablet input for an avatar robot control in class participation support system. Computers and Electrical Engineering 111:PA. https://doi.org/10.1016/j.compeleceng.2023.108914. Online: 1 Oct 2023.
  • (2022) Perception-Motion Coupling in Active Telepresence: Human Behavior and Teleoperation Interface Design. ACM Transactions on Human-Robot Interaction 12(3), 1-24. https://doi.org/10.1145/3571599. Online: 18 Nov 2022.
  • (2022) Perceptions of the Helpfulness of Unexpected Agent Assistance. Proceedings of the 10th International Conference on Human-Agent Interaction, 41-50. https://doi.org/10.1145/3527188.3561915. Online: 5 Dec 2022.
  • (2022) Head Pose for Object Deixis in VR-Based Human-Robot Interaction. 2022 31st IEEE International Conference on Robot and Human Interactive Communication (RO-MAN), 610-617. https://doi.org/10.1109/RO-MAN53752.2022.9900631. Online: 29 Aug 2022.
  • (2022) A Dedicated Tool Frame Based Tongue Interface Layout Improves 2D Visual Guided Control of an Assistive Robotic Manipulator: A Design Parameter for Tele-Applications. IEEE Sensors Journal 22(10), 9868-9880. https://doi.org/10.1109/JSEN.2022.3164551. Online: 15 May 2022.
  • (2022) Depth-aware gaze-following via auxiliary networks for robotics. Engineering Applications of Artificial Intelligence 113:C. https://doi.org/10.1016/j.engappai.2022.104924. Online: 22 Jun 2022.
