DOI: 10.1145/2559636.2559672

Communication of intent in assistive free flyers

Published: 03 March 2014

Abstract

Assistive free-flyers (AFFs) are an emerging robotic platform with unparalleled flight capabilities that make them uniquely suited to exploration, surveillance, inspection, and telepresence tasks. However, unconstrained aerial movement can make it difficult for colocated operators, collaborators, and observers to understand an AFF's intentions, making it hard to tell whether operator instructions are being executed properly and raising safety concerns when future AFF motions are unknown or difficult to predict. To increase AFF usability when working in close proximity to users, we explore the design of natural and intuitive flight motions that may improve an AFF's ability to communicate intent while simultaneously accomplishing task goals. We propose a formalism for representing AFF flight paths as a series of motion primitives and present two studies examining the effects of modifying the trajectories and velocities of these flight primitives based on natural motion principles. Our first study found that modified flight motions can allow AFFs to communicate intent more effectively. In our second study, participants preferred interacting with an AFF that used a manipulated flight path, rated the modified flight motions as more natural, and felt safer around an AFF with modified motion. Our proposed formalism and findings highlight the importance of robot motion in achieving effective human-robot interactions.
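
The page does not reproduce the paper's formalism in detail. As a purely illustrative sketch, not the authors' implementation, the following Python fragment shows one way a flight path could be encoded as a series of motion primitives and how a primitive's velocity profile could be reshaped with a slow-in/slow-out (easing) curve, in the spirit of the natural-motion manipulations described in the abstract. All names here (MotionPrimitive, flight_path, the smoothstep easing choice) are assumptions for illustration only.

```python
# Illustrative sketch only; not taken from the paper. All names are hypothetical.
from dataclasses import dataclass
from typing import List, Tuple

Point = Tuple[float, float, float]  # (x, y, z) waypoint in meters


@dataclass
class MotionPrimitive:
    """A straight-line flight segment between two waypoints."""
    start: Point
    end: Point
    duration: float  # seconds

    def position(self, t: float, eased: bool = False) -> Point:
        """Position at local time t; optionally apply slow-in/slow-out easing."""
        s = min(max(t / self.duration, 0.0), 1.0)
        if eased:
            # Smoothstep: velocity tapers to zero at both ends of the primitive,
            # making its start and stop easier for an observer to anticipate.
            s = s * s * (3.0 - 2.0 * s)
        return tuple(a + (b - a) * s for a, b in zip(self.start, self.end))


def flight_path(primitives: List[MotionPrimitive], t: float, eased: bool = False) -> Point:
    """Evaluate a flight path (a series of primitives) at global time t."""
    for p in primitives:
        if t <= p.duration:
            return p.position(t, eased)
        t -= p.duration
    return primitives[-1].end


# Example: a two-primitive path; compare a sample with and without easing.
path = [
    MotionPrimitive(start=(0.0, 0.0, 1.0), end=(2.0, 0.0, 1.0), duration=4.0),
    MotionPrimitive(start=(2.0, 0.0, 1.0), end=(2.0, 2.0, 1.5), duration=3.0),
]
print(flight_path(path, 1.0))              # constant-velocity interpolation
print(flight_path(path, 1.0, eased=True))  # eased (slow-in/slow-out) motion
```

In this hypothetical encoding, easing each primitive is one common animation-inspired way to make upcoming starts and stops easier to anticipate; the paper's actual trajectory and velocity manipulations are described in the full text.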

Published In

HRI '14: Proceedings of the 2014 ACM/IEEE international conference on Human-robot interaction
March 2014
538 pages
ISBN: 9781450326582
DOI: 10.1145/2559636
Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than the author(s) must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from permissions@acm.org.

Publisher

Association for Computing Machinery

New York, NY, United States

Author Tags

  1. assistive free-flyer (aff)
  2. design
  3. flying robot
  4. human factors
  5. intent
  6. micro air vehicle (mav)
  7. motion
  8. usability

Qualifiers

  • Research-article

Conference

HRI '14

Acceptance Rates

HRI '14 Paper Acceptance Rate: 32 of 132 submissions, 24%
Overall Acceptance Rate: 268 of 1,124 submissions, 24%

Article Metrics

  • Downloads (last 12 months): 97
  • Downloads (last 6 weeks): 11

Reflects downloads up to 03 Sep 2024

Cited By

  • (2024) Physically Assistive Robots: A Systematic Review of Mobile and Manipulator Robots That Physically Assist People with Disabilities. Annual Review of Control, Robotics, and Autonomous Systems, 7(1), 123-147. DOI: 10.1146/annurev-control-062823-024352. Online publication date: 10-Jul-2024
  • (2024) Towards the Legibility of Multi-robot Systems. ACM Transactions on Human-Robot Interaction, 13(2), 1-32. DOI: 10.1145/3647984. Online publication date: 14-Jun-2024
  • (2024) Exploring Intended Functions of Indoor Flying Robots Interacting With Humans in Proximity. Proceedings of the CHI Conference on Human Factors in Computing Systems, 1-16. DOI: 10.1145/3613904.3642791. Online publication date: 11-May-2024
  • (2023) Exploring the Design Space of Extra-Linguistic Expression for Robots. Proceedings of the 2023 ACM Designing Interactive Systems Conference, 2689-2706. DOI: 10.1145/3563657.3595968. Online publication date: 10-Jul-2023
  • (2023) At First Light: Expressive Lights in Support of Drone-Initiated Communication. Proceedings of the 2023 CHI Conference on Human Factors in Computing Systems, 1-17. DOI: 10.1145/3544548.3581062. Online publication date: 19-Apr-2023
  • (2023) How to Communicate Robot Motion Intent: A Scoping Review. Proceedings of the 2023 CHI Conference on Human Factors in Computing Systems, 1-17. DOI: 10.1145/3544548.3580857. Online publication date: 19-Apr-2023
  • (2022) Virtual, Augmented, and Mixed Reality for HRI (VAM-HRI). Proceedings of the 2022 ACM/IEEE International Conference on Human-Robot Interaction, 1237-1240. DOI: 10.5555/3523760.3523991. Online publication date: 7-Mar-2022
  • (2022) Generative Adversarial Networks and Data Clustering for Likable Drone Design. Sensors, 22(17), 6433. DOI: 10.3390/s22176433. Online publication date: 26-Aug-2022
  • (2021) A Model for Tacit Communication in Collaborative Human-UAV Search-and-Rescue. Entropy, 23(8), 1027. DOI: 10.3390/e23081027. Online publication date: 10-Aug-2021
  • (2020) Drone Chi: Somaesthetic Human-Drone Interaction. Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems, 1-13. DOI: 10.1145/3313831.3376786. Online publication date: 21-Apr-2020
