
Animation Techniques in Human-Robot Interaction User Studies: A Systematic Literature Review

Published: 03 June 2019

Abstract

There are many different ways a robot can move in human-robot interaction. One way is to use techniques from film animation to instruct the robot to move. This article is a systematic literature review of human-robot trials, pilots, and evaluations that have applied techniques from animation to move a robot. Through 27 articles, we find that animation techniques improve an individual’s interaction with robots: they improve the individual’s perception of a robot’s qualities, make it clearer what a robot intends to do, and show the robot’s state or possible emotion. Animation techniques also help people relate to robots that do not resemble a human or animal. The studies in the articles point to further areas for research, such as applying animation principles to other types of robots and situations, combining animation techniques with other modalities, and testing robots that move with animation techniques over the long term.

References

[1]
Thibault Asselborn, Wafa Johal, and Pierre Dillenbourg. 2017. Keep on moving! Exploring anthropomorphic effects of motion during idle moments. In Proceedings of the 26th IEEE International Symposium on Robot and Human Interactive Communication (RO-MAN’17). 897--902.
[2]
Association for Computing Machinery. 2018. ACM Digital Library. Retrieved from https://dl.acm.org/.
[3]
K. Baraka, S. Rosenthal, and M. Veloso. 2016. Enhancing human understanding of a mobile robot’s state and actions using expressive lights. In Proceedings of the 25th IEEE International Symposium on Robot and Human Interactive Communication (RO-MAN’16). 652--657.
[4]
Christoph Bartneck, Takayuki Kanda, Omar Mubin, and Abdullah Al Mahmud. 2009. Does the design of a robot influence its animacy and perceived intelligence? Int. J. Soc. Robotics 1, 2 (2009), 195--204.
[5]
Christoph Bartneck, Takayuki Kanda, Omar Mubin, and Abdullah Al Mahmud. 2007. The perception of animacy and intelligence based on a robot’s embodiment. In Proceedings of the 7th IEEE-RAS International Conference on Humanoid Robots. IEEE, 300--305.
[6]
Christoph Bartneck, Dana Kulić, Elizabeth Croft, and Susana Zoghbi. 2009. Measurement instruments for the anthropomorphism, animacy, likeability, perceived intelligence, and perceived safety of robots. Int. J. Soc. Robot. 1, 1 (2009), 71--81.
[7]
Christoph Bartneck, Marius Soucy, Kevin Fleuret, and Eduardo B. Sandoval. 2015. The robot engine—Making the unity 3D game engine work for HRI. In Proceedings of the 24th IEEE International Symposium on Robot and Human Interactive Communication (RO-MAN’15). 431--437.
[8]
Christoph Bartneck, Tomohiro Suzuki, Takayuki Kanda, and Tatsuya Nomura. 2006. The influence of people’s culture and prior experiences with aibo on their attitude toward robots. AI Society 21, 1--2 (2006), 217--230.
[9]
Christoph Bartneck, Michel van der Hoek, Omar Mubin, and Abdullah Al Mahmud. 2007. “Daisy, daisy, give me your answer do!” Switching off a robot. In Proceedings of the 2nd ACM/IEEE International Conference on Human-Robot Interaction (HRI’07). 217--222.
[10]
Christoph Bartneck, Marcel Verbunt, Omar Mubin, and Abdullah Al Mahmud. 2007. To kill a mockingbird robot. In Proceedings of the 2nd ACM/IEEE International Conference on Human-Robot Interaction (HRI’07). 81--87.
[11]
Anton Batliner, Christian Hacker, Stefan Steidl, Elmar Nöth, Shona D’Arcy, Martin J. Russell, and Michael Wong. 2004. “You stupid tin box”—Children interacting with the AIBO robot: A cross-linguistic emotional speech corpus. In Proceedings of the 4th International Conference on Language Resources and Evaluation (LREC’04).
[12]
Aryel Beck, Brett Stevens, Kim A. Bard, and Lola Cañamero. 2012. Emotional body language displayed by artificial agents. ACM Trans. Interact. Intell. Syst. 2, 1, Article 2 (2012), 2:1--2:29.
[13]
Maren Bennewitz, Felix Faber, Dominik Joho, Michael Schreiber, and Sven Behnke. 2005. Toward a humanoid museum guide robot that interacts with multiple persons. In Proceedings of the 5th IEEE-RAS International Conference on Humanoid Robots, 2005. 418--423.
[14]
Tanya N. Beran, Alejandro Ramirez-Serrano, Roman Kuzyk, Meghann Fior, and Sarah Nugent. 2011. Understanding how children understand robots: Perceived animism in child-robot interaction. Int. J. Hum.-Comput. Stud. 69, 7--8 (2011), 539--550.
[15]
S.-J. Blakemore, P. Boyer, M. Pachot-Clouard, A. Meltzoff, C. Segebarth, and J. Decety. 2003. The detection of contingency and animacy from simple animations in the human brain. Cereb. Cortex 13, 8 (2003), 837--844.
[16]
Margaret M. Bradley and Peter J. Lang. 1994. Measuring emotion: The self-assessment manikin and the semantic differential. J. Behav. Ther. Exper. Psych. 25, 1 (1994), 49--59.
[17]
Cynthia Breazeal. 2003. Emotion and sociable humanoid robots. Int. J. Hum.-Comput. Stud. 59, 1--2 (2003), 119--155.
[18]
Cynthia Breazeal and B. Scassellati. 1999. How to build robots that make friends and influence people. In Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems. Human and Environment Friendly Robots with High Intelligence and Emotional Quotients, vol. 2. 858--863.
[19]
David Budgen and Pearl Brereton. 2006. Performing systematic literature reviews in software engineering. In Proceedings of the 28th International Conference on Software Engineering (ICSE’06). ACM, New York, NY, 1051--1052.
[20]
Bay-Wei Chang and David Ungar. 1993. Animation: From cartoons to the user interface. In Proceedings of the 6th Annual ACM Symposium on User Interface Software and Technology (UIST’93). ACM, New York, NY, 45--55.
[21]
Lukáš Danev, Marten Hamann, Nicolas Fricke, Tobias Hollarek, and Dennys Paillacho. 2017. Development of animated facial expressions to express emotions in a robot: RobotIcon. In Proceedings of the IEEE Second Ecuador Technical Chapters Meeting (ETCM’17). 1--6.
[22]
Kerstin Dautenhahn. 2018. Some brief thoughts on the past and future of human-robot interaction. ACM Trans. Hum.-Robot Interact. 7, 1 (2018), 4:1--4:3.
[23]
Jessica Q. Dawson, Oliver S. Schneider, Joel Ferstay, Dereck Toker, Juliette Link, Shathel Haddad, and Karon MacLean. 2013. It’s alive!: Exploring the design space of a gesturing phone. In Proceedings of the Graphics Interface Conference (GI’13). Canadian Information Processing Society, Regina, Saskatchewan, Canada, 205--212. Retrieved from http://dl.acm.org/citation.cfm?id=2532129.2532164.
[24]
Frédéric Delaunay, Joachim de Greeff, and Tony Belpaeme. 2010. A study of a retro-projected robotic face and its effectiveness for gaze reading by humans. In Proceedings of the 5th ACM/IEEE International Conference on Human-Robot Interaction (HRI’10). IEEE Press, Piscataway, NJ, 39--44. Retrieved from http://dl.acm.org/citation.cfm?id=1734454.1734471.
[25]
Anca D. Dragan, Shira Bauman, Jodi Forlizzi, and Siddhartha S. Srinivasa. 2015. Effects of robot motion on human-robot collaboration. In Proceedings of the 10th Annual ACM/IEEE International Conference on Human-Robot Interaction (HRI’15). ACM, New York, NY, 51--58.
[26]
Anca D. Dragan, Kenton C. T. Lee, and Siddhartha S. Srinivasa. 2013. Legibility and predictability of robot motion. In Proceedings of the 8th ACM/IEEE International Conference on Human-Robot Interaction (HRI’13). IEEE Press, Piscataway, NJ, 301--308. Retrieved from http://dl.acm.org/citation.cfm?id=2447556.2447672.
[27]
Anca Dragan and Siddhartha Srinivasa. 2014. Familiarization to robot motion. In Proceedings of the ACM/IEEE International Conference on Human-Robot Interaction (HRI’14). ACM, New York, NY, 366--373.
[28]
Brittany A. Duncan and Robin R. Murphy. 2017. Effects of speed, cyclicity, and dimensionality on distancing, time, and preference in human-aerial vehicle interactions. ACM Trans. Interact. Intell. Syst. 7, 3 (2017), 13:1--13:27.
[29]
Paul Ekman. 1999. Basic emotions. In Handbook of Cognition and Emotion. John Wiley & Sons, Sussex, UK, 45--60.
[30]
Jodi Forlizzi and Carl DiSalvo. 2006. Service robots in the domestic environment: A study of the Roomba vacuum in the home. In Proceedings of the 1st ACM SIGCHI/SIGART Conference on Human-Robot Interaction (HRI’06). ACM, New York, NY, 258--265.
[31]
Michael J. Gielniak and Andrea L. Thomaz. 2012. Enhancing interaction through exaggerated motion synthesis. In Proceedings of the 7th Annual ACM/IEEE International Conference on Human-Robot Interaction (HRI’12). ACM, New York, NY, 375--382.
[32]
John Harris and Ehud Sharlin. 2010. Exploring emotive actuation and its role in human-robot interaction. In Proceedings of the 5th ACM/IEEE International Conference on Human-Robot Interaction (HRI’10). IEEE Press, Piscataway, NJ, 95--96. Retrieved from http://dl.acm.org/citation.cfm?id=1734454.1734489.
[33]
Fritz Heider and Marianne Simmel. 1944. An experimental study of apparent behavior. Amer. J. Psychol. 57, 2 (1944), 243--259.
[34]
Guy Hoffman, Jodi Forlizzi, Shahar Ayal, Aaron Steinfeld, John Antanitis, Guy Hochman, Eric Hochendoner, and Justin Finkenaur. 2015. Robot presence and human honesty: Experimental evidence. In Proceedings of the 10th Annual ACM/IEEE International Conference on Human-Robot Interaction (HRI’15). ACM, New York, NY, 181--188.
[35]
Guy Hoffman and Wendy Ju. 2014. Designing robots with movement in mind. J. Hum.-Robot Interact. 3, 1 (2014), 89--122.
[36]
Guy Hoffman, Rony Kubat, and Cynthia Breazeal. 2008. A hybrid control system for puppeteering a live robotic stage actor. In Proceedings of the 17th IEEE International Symposium on Robot and Human Interactive Communication (RO-MAN’08). 354--359.
[37]
Guy Hoffman and Gil Weinberg. 2011. Interactive improvisation with a robotic marimba player. Auton. Robots 31, 2--3 (2011), 133--153.
[38]
Aike C. Horstmann, Nikolai Bock, Eva Linhuber, Jessica M. Szczuka, Carolin Straßmann, and Nicole C. Krämer. 2018. Do a robot’s social skills and its objection discourage interactants from switching the robot off? PLoS ONE 13, 7 (2018), e0201581.
[39]
Scott E. Hudson and John T. Stasko. 1993. Animation support in a user interface toolkit: Flexible, robust, and reusable abstractions. In Proceedings of the 6th Annual ACM Symposium on User Interface Software and Technology (UIST’93). ACM, New York, NY, 57--67.
[40]
Institute of Electrical and Electronics Engineers. 2018. IEEE Xplore Digital Library. Retrieved from https://ieeexplore.ieee.org/Xplore/home.jsp.
[41]
Takamune Izui and Gentiane Venture. 2017. Impression’s predictive models for animated robot. In Proceedings of the 26th IEEE International Symposium on Robot and Human Interactive Communication (RO-MAN’17). 621--626.
[42]
Malte Jung and Pamela Hinds. 2018. Robots in the wild: A time for more robust theories of human-robot interaction. ACM Trans. Hum.-Robot Interact. 7, 1 (2018), 2:1--2:5.
[43]
Hiroko Kamide, Yasushi Mae, Koji Kawabe, Satoshi Shigemi, Masato Hirose, and Tatsuo Arai. 2012. New measurement of psychological safety for humanoid. In Proceedings of the 7th Annual ACM/IEEE International Conference on Human-Robot Interaction (HRI’12). ACM, New York, NY, 49--56.
[44]
Heather Knight and Reid Simmons. 2014. Expressive motion with x, y, and θ: Laban effort features for mobile robots. In Proceedings of the 23rd IEEE International Symposium on Robot and Human Interactive Communication (RO-MAN’14). 267--273.
[45]
Heather Knight and Reid Simmons. 2016. Laban head-motions convey robot state: A call for robot body language. In Proceedings of the IEEE International Conference on Robotics and Automation (ICRA’16). 2881--2888.
[46]
Heather Knight and Reid Simmons. 2015. Layering Laban effort features on robot task motions. In Proceedings of the 10th Annual ACM/IEEE International Conference on Human-Robot Interaction Extended Abstracts (HRI’15). ACM, New York, NY, 135--136.
[47]
Heather Knight, Manuela Veloso, and Reid Simmons. 2015. Taking candy from a robot: Speed features and candy accessibility predict human response. In Proceedings of the 24th IEEE International Symposium on Robot and Human Interactive Communication (RO-MAN’15). 355--362.
[48]
Rudolf Laban. 1948. Modern Educational Dance. MacDonald & Evans, London.
[49]
John Lasseter. 1987. Principles of traditional animation applied to 3D computer animation. In Proceedings of the 14th Annual Conference on Computer Graphics and Interactive Techniques (SIGGRAPH’87). ACM, New York, NY, 35--44.
[50]
Amy LaViers and Magnus Egerstedt. 2012. Style-based robotic motion. In Proceedings of the American Control Conference (ACC’12). 4327--4332.
[51]
Amy LaViers, Lori Teague, and Magnus Egerstedt. 2014. Style-based robotic motion in contemporary dance performance. In Controls and Art. Springer, Cham, 205--229.
[52]
Jamy Li, René Kizilcec, Jeremy Bailenson, and Wendy Ju. 2016. Social robots and virtual agents as lecturers for video instruction. Comput. Hum. Behav. 55 (2016), 1222--1230.
[53]
Ricardo Loureiro, Andre Lopes, Carlos Carona, Daniel Almeida, Fernanda Faria, Luis Garrote, Cristiano Premebida, and Urbano Nunes. 2017. ISR-RobotHead: Robotic head with LCD-based emotional expressiveness. In Proceedings of the IEEE 5th Portuguese Meeting on Bioengineering (ENBENG’17). IEEE, 4.
[54]
Michal Luria, Guy Hoffman, Benny Megidish, Oren Zuckerman, and Sung Park. 2016. Designing Vyo, a robotic smart home assistant: Bridging the gap between device and social agent. In Proceedings of the 25th IEEE International Symposium on Robot and Human Interactive Communication (RO-MAN’16). 1019--1025.
[55]
Matthieu Courgeon and Dominique Duhaut. 2015. Artificial companions as personal coach for children: The interactive drums teacher. In Proceedings of the 12th International Conference on Advances in Computer Entertainment Technology (ACE’15). ACM, New York, NY, Article 16, 16:1--16:4.
[56]
Gail F. Melson, Peter H. Kahn, Jr., Alan M. Beck, Batya Friedman, Trace Roberts, and Erik Garrett. 2005. Robots as dogs?: Children’s interactions with the robotic dog AIBO and a live Australian shepherd. In Proceedings of the Conference on Human Factors in Computing Systems—Extended Abstracts (CHI EA’05). ACM, 1649--1652.
[57]
Albert Michotte. 1963. The Perception of Causality. Basic Books.
[58]
Nicole Mirnig, Susanne Stadler, Gerald Stollnberger, Manuel Giuliani, and Manfred Tscheligi. 2016. Robot humor: How self-irony and schadenfreude influence people’s rating of robot likability. In Proceedings of the 25th IEEE International Symposium on Robot and Human Interactive Communication (RO-MAN’16). 166--171.
[59]
Masahiro Mori, Karl F. MacDorman, and Norri Kageki. 2012. The uncanny valley [from the field]. IEEE Robot. Automat. Mag. 19, 2 (2012), 98--100.
[60]
V. Nitsch and T. Glassen. 2015. Investigating the effects of robot behavior and attitude toward technology on social human-robot interactions. In Proceedings of the 24th IEEE International Symposium on Robot and Human Interactive Communication (RO-MAN’15). IEEE, 535--540.
[61]
Sandra Y. Okita, Daniel L. Schwartz, Takanori Shibata, and Hideyuki Tokuda. 2005. Exploring young children’s attributions through entertainment robots. In Proceedings of the IEEE International Workshop on Robot and Human Interactive Communication (RO-MAN’05). 390--395.
[62]
Jeong Woo Park, Hui Sung Lee, and Myung Jin Chung. 2015. Generation of realistic robot facial expressions for human robot interaction. J. Intell. Robot. Syst. 78, 3--4 (2015), 443--462.
[63]
Jean Piaget. 1929. The Child’s Conception of the World. International Library of Psychology, Philosophy and Scientific Method. Kegan Paul, Trench, Trubner & Co., London.
[64]
Jonathan Posner, James A. Russell, and Bradley S. Peterson. 2005. The circumplex model of affect: An integrative approach to affective neuroscience, cognitive development, and psychopathology. Develop. Psychopathol. 17, 3 (2005), 715--734.
[65]
E. Pot, J. Monceaux, R. Gelin, and B. Maisonnier. 2009. Choregraphe: A graphical tool for humanoid robot programming. In Proceedings of the 18th IEEE International Symposium on Robot and Human Interactive Communication (RO-MAN’09). 46--51.
[66]
Daniel Rakita, Bilge Mutlu, and Michael Gleicher. 2017. A motion retargeting method for effective mimicry-based teleoperation of robot arms. In Proceedings of the 2017 ACM/IEEE International Conference on Human-Robot Interaction (HRI’17). ACM, New York, NY, 361--370.
[67]
Tiago Ribeiro and Ana Paiva. 2017. Animating the adelino robot with ERIK: The expressive robotics inverse kinematics. In Proceedings of the 19th ACM International Conference on Multimodal Interaction (ICMI’17). ACM, New York, NY, 388--396.
[68]
Tiago Ribeiro and Ana Paiva. 2012. The illusion of robotic life: Principles and practices of animation for robots. In Proceedings of the 7th Annual ACM/IEEE International Conference on Human-Robot Interaction (HRI’12). ACM, 383--390.
[69]
Tiago Ribeiro, Ana Paiva, and Doug Dooley. 2013. Nutty tracks: Symbolic animation pipeline for expressive robotics. In Proceedings of the ACM Special Interest Group on Computer Graphics and Interactive Techniques (SIGGRAPH’13). ACM, New York, NY, 8:1--8:1.
[70]
Laurel D. Riek. 2012. Wizard of oz studies in HRI: A systematic review and new reporting guidelines. J. Hum.-Robot Interact. 1, 1 (2012).
[71]
Ruthger Righart and Beatrice de Gelder. 2006. Context influences early perceptual analysis of faces—An electrophysiological study. Cereb. Cortex 16, 9 (2006), 1249--1257.
[72]
David Robert and Cynthia Breazeal. 2012. Blended reality characters. In Proceedings of the 7th Annual ACM/IEEE International Conference on Human-Robot Interaction (HRI’12). ACM, New York, NY, 359--366.
[73]
Martin Saerbeck and Christoph Bartneck. 2010. Perception of affect elicited by robot motion. In Proceedings of the 5th ACM/IEEE International Conference on Human-Robot Interaction (HRI’10). IEEE Press, Piscataway, NJ, 53--60. Retrieved from http://dl.acm.org/citation.cfm?id=1734454.1734473.
[74]
Jelle Saldien, Kristof Goris, Bram Vanderborght, Johan Vanderfaeillie, and Dirk Lefeber. 2010. Expressing emotions with the social robot probo. Int. J. Soc. Robot. 2, 4 (2010), 377--389.
[75]
Derek Carl Scherer. 2014. Movie magic makes better social robots: The overlap of special effects and character robot engineering. J. Hum.-Robot Interact. 3, 1 (2014), 123--141.
[76]
Brian J. Scholl and Patrice D. Tremoulet. 2000. Perceptual causality and animacy. Trends Cogn. Sci. 4, 8 (2000), 299--309.
[77]
Megha Sharma, Dale Hildebrandt, Gem Newman, James E. Young, and Rasit Eskicioglu. 2013. Communicating affect via flight path: Exploring use of the Laban effort system for designing affective locomotion paths. In Proceedings of the 8th ACM/IEEE International Conference on Human-Robot Interaction (HRI’13). IEEE Press, Piscataway, NJ, 293--300. Retrieved from http://dl.acm.org/citation.cfm?id=2447556.2447671.
[78]
Stefan Sosnowski, Kolja Kuhnlenz, and Martin Buss. 2006. EDDIE—An emotion-display with dynamic intuitive expressions. In Proceedings of the 15th IEEE International Symposium on Robot and Human Interactive Communication (RO-MAN’06). 569--574.
[79]
Ja-Young Sung, Lan Guo, Rebecca E. Grinter, and Henrik I. Christensen. 2007. “My Roomba is Rambo”: Intimate home appliances. In Proceedings of the 9th International Conference on Ubiquitous Computing (UbiComp’07). Springer-Verlag, Berlin, 145--162. Retrieved from http://dl.acm.org/citation.cfm?id=1771592.1771601.
[80]
Daniel Szafir, Bilge Mutlu, and Terrence Fong. 2014. Communication of intent in assistive free flyers. In Proceedings of the ACM/IEEE International Conference on Human-Robot Interaction (HRI’14). ACM, New York, NY, 358--365.
[81]
Daniel Szafir, Bilge Mutlu, and Terry Fong. 2015. Communicating directionality in flying robots. In Proceedings of the 10th Annual ACM/IEEE International Conference on Human-Robot Interaction (HRI’15). ACM, New York, NY, 19--26.
[82]
Leila Takayama, Doug Dooley, and Wendy Ju. 2011. Expressing thought: Improving robot readability with animation principles. In Proceedings of the 6th International Conference on Human-Robot Interaction (HRI’11). ACM, New York, NY, 69--76.
[83]
Takayuki Kanda and Hiroshi Ishiguro. 2012. Human-Robot Interaction in Social Robotics. Taylor and Francis, Hoboken, NJ. 367 pages.
[84]
Zane Thimmesch-Gill, Kathleen A. Harder, and Wilma Koutstaal. 2017. Perceiving emotions in robot body language: Acute stress heightens sensitivity to negativity while attenuating sensitivity to arousal. Comput. Hum. Behav. 76 (2017), 59--67.
[85]
Frank Thomas and Ollie Johnston. 1995. The Illusion of Life: Disney Animation (1st Hyperion ed.). Hyperion, New York. 575 pages.
[86]
J. Gregory Trafton, Magdalena D. Bugajska, Benjamin R. Fransen, and Raj M. Ratwani. 2008. Integrating vision and audition within a cognitive architecture to track conversations. In Proceedings of the 3rd ACM/IEEE International Conference on Human-Robot Interaction (HRI’08). 201--208.
[87]
Patrice D. Tremoulet and Jacob Feldman. 2000. Perception of animacy from the motion of a single object. Perception 29, 8 (2000), 943--951.
[88]
A. J. N. van Breemen. 2004. Animation engine for believable interactive user-interface robots. In Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS’04), Vol. 3. 2873--2878.
[89]
A. J. N. van Breemen. 2004. Bringing robots to life: Applying principles of animation to robots. In Proceedings of the Shaping Human-Robot Interaction Workshop Held at the Conference on Human Factors in Computing Systems (CHI’04).
[90]
A. J. N. van Breemen and Yan Xue. 2006. Advanced animation engine for user-interface robots. In Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems. 1824--1830.
[91]
Albert van Breemen, Xue Yan, and Bernt Meerbeek. 2005. iCat: An animated user-interface robot with personality. In Proceedings of the 4th International Joint Conference on Autonomous Agents and Multiagent Systems (AAMAS’05). ACM, New York, NY, 143--144.
[92]
Astrid Weiss and Christoph Bartneck. 2015. Meta analysis of the usage of the godspeed questionnaire series. In Proceedings of the 24th IEEE International Symposium on Robot and Human Interactive Communication (RO-MAN’15). IEEE, 381--388.
[93]
Ryan Wistort and Cynthia Breazeal. 2011. TofuDraw: A mixed-reality choreography tool for authoring robot character performance. In Proceedings of the 10th International Conference on Interaction Design and Children (IDC’11). ACM, Ann Arbor, MI, 213--216.
[94]
Fumitaka Yamaoka, Takayuki Kanda, Hiroshi Ishiguro, and Norihiro Hagita. 2005. “Lifelike” behavior of communication robots based on developmental psychology findings. In Proceedings of the 5th IEEE-RAS International Conference on Humanoid Robots. IEEE, 406--411.
[95]
Fumitaka Yamaoka, Takayuki Kanda, Hiroshi Ishiguro, and Norihiro Hagita. 2006. The relationship between contingency and complexity in a lifelike humanoid robot. In Proceedings of the 6th IEEE-RAS International Conference on Humanoid Robots. 382--389.
[96]
Selma Yilmazyildiz, Werner Verhelst, and Hichem Sahli. 2015. Gibberish speech as a tool for the study of affective expressiveness for robotic agents. Multimedia Tools Appl. 74, 22 (2015), 9959--9982.
[97]
Steve Yohanan and Karon E. MacLean. 2011. Design and assessment of the haptic creature’s affect display. In Proceedings of the 6th International Conference on Human-Robot Interaction (HRI’11). ACM, New York, NY, 473--480.
[98]
James E. Young, Takeo Igarashi, Ehud Sharlin, Daisuke Sakamoto, and Jeffrey Allen. 2014. Design and evaluation techniques for authoring interactive and stylistic behaviors. ACM Trans. Interact. Intell. Syst. 3, 4 (2014), 23:1--23:36.
[99]
James E. Young, Min Xin, and Ehud Sharlin. 2007. Robot expressionism through cartooning. In Proceedings of the 2nd ACM/IEEE International Conference on Human-Robot Interaction (HRI’07). IEEE, 309--316.
[100]
Allan Zhou, Dylan Hadfield-Menell, Anusha Nagabandi, and Anca D. Dragan. 2017. Expressive robot motion timing. In Proceedings of the ACM/IEEE International Conference on Human-Robot Interaction (HRI’17). ACM, New York, NY, 22--31.


Published In

ACM Transactions on Human-Robot Interaction, Volume 8, Issue 2
June 2019
136 pages
EISSN: 2573-9522
DOI: 10.1145/3339062
Permission to make digital or hard copies of part or all of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for third-party components of this work must be honored. For all other uses, contact the Owner/Author.

Publisher

Association for Computing Machinery

New York, NY, United States

Publication History

Published: 03 June 2019
Accepted: 01 February 2019
Revised: 01 January 2019
Received: 01 October 2017
Published in THRI Volume 8, Issue 2

Author Tags

  1. Robot
  2. animation
  3. human-robot interaction
  4. literature review
  5. motion

Qualifiers

  • Research-article
  • Research
  • Refereed

Funding Sources

  • Norges Forskningsråd

Cited By

  • (2024) Artificial Intelligence-Assisted Simulation Research on Intelligent Behavior of Film and Television 3D Animation Characters. Applied Mathematics and Nonlinear Sciences 9, 1. DOI: 10.2478/amns-2024-3291. Online publication date: 18-Nov-2024.
  • (2024) Integration effect of artificial intelligence and traditional animation creation technology. Journal of Intelligent Systems 33, 1. DOI: 10.1515/jisys-2023-0305. Online publication date: 31-May-2024.
  • (2024) Enacting Human–Robot Encounters with Theater Professionals on a Mixed Reality Stage. ACM Transactions on Human-Robot Interaction 14, 1, 1--25. DOI: 10.1145/3678186. Online publication date: 28-Nov-2024.
  • (2024) From Cage to Stage: Mapping Human Dance Movements Onto Industrial Robotic Arm Motion. In Proceedings of the 9th International Conference on Movement and Computing, 1--6. DOI: 10.1145/3658852.3659090. Online publication date: 30-May-2024.
  • (2024) Tangible Scenography as a Holistic Design Method for Human-Robot Interaction. In Proceedings of the 2024 ACM Designing Interactive Systems Conference, 459--475. DOI: 10.1145/3643834.3661530. Online publication date: 1-Jul-2024.
  • (2024) Programming Robot Animation Through Human Body Movement. In 2024 IEEE International Conference on Advanced Robotics and Its Social Impacts (ARSO), 273--279. DOI: 10.1109/ARSO60199.2024.10557907. Online publication date: 20-May-2024.
  • (2024) Multi-Point Mapping of Dancer Aesthetic Movements Onto a Robotic Arm. IEEE Access 12, 177723--177734. DOI: 10.1109/ACCESS.2024.3504954. Online publication date: 2024.
  • (2024) Education in cinematography and VR technologies: The impact of animation on the film perception. Interactive Learning Environments, 1--15. DOI: 10.1080/10494820.2024.2372831. Online publication date: 10-Jul-2024.
  • (2024) A general approach for generating artificial human-like motions from functional components of human upper limb movements. Control Engineering Practice 148, 105968. DOI: 10.1016/j.conengprac.2024.105968. Online publication date: Jul-2024.
  • (2024) Design and Evaluation of a Mobile Robotic Assistant for Emotional Learning in Individuals with ASD: Expert Evaluation Stage. International Journal of Social Robotics 16, 8, 1765--1781. DOI: 10.1007/s12369-024-01145-x. Online publication date: 3-Jun-2024.
