DOI: 10.1145/3537972.3537985
Research article
Open access

RoboGroove: Creating Fluid Motion for Dancing Robotic Arms

Published: 30 June 2022

Abstract

Robotic motion has been studied for many purposes, such as fast and efficient movement, communicative gestures, and obstacle avoidance. In this study, we improve the perceived expressivity of a robot performing a task through trajectory generation. Robots have typically moved rigidly, which makes them feel mechanical rather than human; follow through and smooth movement can increase a robot's animacy and produce a more lifelike performance. In this paper, we describe the use of follow through to create dances and grooves that can match the elegance of human dancers, and we present two techniques that simulate it on a non-humanoid robotic arm. The first technique uses forward kinematics with trajectories generated from a dancer moving to a beat: we mapped the movements of a human dancer to a set of joints on the robotic arm, which lets the robot dance in real time with a human dancer or create its own smooth trajectory. The time delay with which each human body part moves is reproduced as a time delay between the robot's joints. The second technique uses impedance control with varied damping parameters to create follow through: robotic joints with low damping passively move in response to motion elsewhere in the robot, so the arm applies high damping to the actively moving joints while the rest of the arm reacts passively to that excitation, much as the human body responds to its own movement. We compared the two methods in a user study in which the robot danced to a beat and participants rated the robotic movement qualitatively and quantitatively. The results show that both methods increase the animacy and anthropomorphism of a robot dancing to a beat, with impedance control rated highest for animacy, anthropomorphism, and both full-body and individual body-part movement. Impedance control thus provides a simple way for a robot to dance like a human without changing its primary trajectory.
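To make the first technique concrete, here is a minimal sketch of its core idea: one shared beat-driven trajectory replayed per joint with a growing time delay, so the arm appears to follow through rather than move as a rigid unit. The control rate, the per-joint delays, and the sinusoidal beat_trajectory placeholder are illustrative assumptions, not the authors' implementation, which maps recorded dancer motion onto the joints.

```python
import numpy as np

# Sketch: follow-through via per-joint time delays (illustrative values).
RATE_HZ = 100                                     # assumed control rate
DELAYS_S = [0.00, 0.05, 0.10, 0.15, 0.20, 0.25]   # lag per joint, base -> wrist

def beat_trajectory(t, bpm=120.0, amplitude=0.4):
    """Placeholder beat-locked motion; the paper derives this from a
    human dancer rather than a sine wave."""
    return amplitude * np.sin(2.0 * np.pi * (bpm / 60.0) * t)

def joint_targets(t):
    """Joint angles at time t: the same groove, shifted per joint so the
    wrist trails the shoulder the way a dancer's limbs trail the torso."""
    return np.array([beat_trajectory(t - d) for d in DELAYS_S])

if __name__ == "__main__":
    for step in range(5):
        t = step / RATE_HZ
        print(f"t={t:.2f}s", joint_targets(t).round(3))
```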
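The second technique can likewise be sketched with a standard joint-space impedance law, tau = K (q_des - q) - D qdot, in which actively driven joints get stiff, well-damped gains while the remaining joints get near-zero gains so that dynamic coupling makes them swing. The active-joint mask and all gain values below are assumptions for illustration; the paper does not publish its parameters.

```python
import numpy as np

# Sketch: varied-damping impedance control (illustrative gains).
ACTIVE = np.array([True, True, False, False, False, False])  # assumed lead joints
K = np.where(ACTIVE, 300.0, 5.0)   # joint stiffness [N*m/rad]
D = np.where(ACTIVE, 30.0, 0.5)    # joint damping   [N*m*s/rad]

def impedance_torque(q, qdot, q_des):
    """tau = K (q_des - q) - D qdot. Active joints track the dance
    trajectory; low-gain joints are free to react to coupling torques,
    producing follow-through without altering the primary path."""
    return K * (q_des - q) - D * qdot
```

Because the passive response comes from the controller's compliance rather than from edits to the commanded trajectory, the primary dance path stays unchanged, which is the property the abstract credits for impedance control's higher animacy ratings.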





Published In

MOCO '22: Proceedings of the 8th International Conference on Movement and Computing
June 2022
262 pages
ISBN: 9781450387163
DOI: 10.1145/3537972
This work is licensed under a Creative Commons Attribution 4.0 International License.

Publisher

Association for Computing Machinery

New York, NY, United States


Author Tags

  1. fluid motion
  2. follow through
  3. groove
  4. impedance control
  5. robot dancing
  6. robot perception
  7. robotics

Qualifiers

  • Research-article
  • Research
  • Refereed limited

Conference

MOCO '22

Acceptance Rates

Overall Acceptance Rate: 85 of 185 submissions, 46%

Bibliometrics & Citations

Article Metrics

  • Downloads (Last 12 months): 384
  • Downloads (Last 6 weeks): 47
Reflects downloads up to 04 Feb 2025

Cited By
  • (2024) Music, body, and machine: gesture-based synchronization in human-robot musical interaction. Frontiers in Robotics and AI, vol. 11. https://doi.org/10.3389/frobt.2024.1461615
  • (2024) Dances with Drones: Spatial Matching and Perceived Agency in Improvised Movements with Drone and Human Partners. Proceedings of the 2024 CHI Conference on Human Factors in Computing Systems, 1–16. https://doi.org/10.1145/3613904.3642345
  • (2024) Mapping Music onto Robot Joints for Autonomous Choreographies: PCA-Based Approach. 2024 IEEE 8th Forum on Research and Technologies for Society and Industry Innovation (RTSI), 232–237. https://doi.org/10.1109/RTSI61910.2024.10761717
  • (2024) Multi-Point Mapping of Dancer Aesthetic Movements Onto a Robotic Arm. IEEE Access, vol. 12, 177723–177734. https://doi.org/10.1109/ACCESS.2024.3504954
  • (2024) “It’s Like Being on Stage”: Staging an Improvisational Haptic-Installed Contemporary Dance Performance. Haptics: Understanding Touch; Technology and Systems; Applications and Interaction, 507–518. https://doi.org/10.1007/978-3-031-70058-3_41
  • (2023) A PCA-based Method to Map Aesthetic Movements from Dancer to Robotic Arm. 2023 IEEE International Conference on Advanced Robotics and Its Social Impacts (ARSO), 71–77. https://doi.org/10.1109/ARSO56563.2023.10187492
