Research Article | Open Access

Designing robots with movement in mind

Published: 28 February 2014

Abstract

This paper makes the case for designing interactive robots with their expressive movement in mind. Because people are highly sensitive to physical movement and spatiotemporal affordances, well-designed robot motion can communicate, engage, and offer dynamic possibilities beyond a machine's surface appearance or pragmatic motion paths. We present techniques for movement-centric design, including character animation sketches, video prototyping, interactive movement explorations, Wizard of Oz studies, and skeletal prototypes. To illustrate our design approach, we discuss four case studies: a social head for a robotic musician, a robotic speaker-dock listening companion, a desktop telepresence robot, and a service robot performing assistive and communicative tasks. We then relate our approach to the design of non-anthropomorphic robots and robotic objects, a design strategy that could make real-world human-robot interaction more feasible.



Published In

Journal of Human-Robot Interaction, Volume 3, Issue 1
Special Issue on Design in HRI: Past, Present, and Future
February 2014
139 pages

Publisher

Journal of Human-Robot Interaction Steering Committee

Publication History

Published: 28 February 2014

Author Tags

  1. case studies
  2. design
  3. expressive movement
  4. gestures
  5. human-robot interaction
  6. movement
  7. non-anthropomorphic
  8. non-humanoid

Qualifiers

  • Research-article

Funding Sources

  • Hasso Plattner Design Thinking Research Fund


Cited By

  • (2024) Crafting for Emotion Appropriateness in Affective Robotics: Examining the Practicality of the OCC Model. Proceedings of the ACM on Human-Computer Interaction, 8(MHCI), 1-19. doi:10.1145/3676493
  • (2024) Tangible Scenography as a Holistic Design Method for Human-Robot Interaction. Proceedings of the 2024 ACM Designing Interactive Systems Conference, 459-475. doi:10.1145/3643834.3661530
  • (2024) [e]Motion: Designing Expressive Movement in Robots and Actuated Tangible User Interfaces. Proceedings of the Eighteenth International Conference on Tangible, Embedded, and Embodied Interaction, 1-3. doi:10.1145/3623509.3634741
  • (2024) Virtual Reality-based Human-Robot Interaction for Remote Pick-and-Place Tasks. Companion of the 2024 ACM/IEEE International Conference on Human-Robot Interaction, 1148-1152. doi:10.1145/3610978.3640748
  • (2024) How Robot Comes into Our Life in Urban Public Space: A Participatory Study. Companion of the 2024 ACM/IEEE International Conference on Human-Robot Interaction, 292-296. doi:10.1145/3610978.3640638
  • (2024) Design Exploration of Robotic In-car Accessories for Semi-autonomous Vehicles. Companion of the 2024 ACM/IEEE International Conference on Human-Robot Interaction, 603-607. doi:10.1145/3610978.3640596
  • (2024) Generative Expressive Robot Behaviors using Large Language Models. Proceedings of the 2024 ACM/IEEE International Conference on Human-Robot Interaction, 482-491. doi:10.1145/3610977.3634999
  • (2024) A Social Approach for Autonomous Vehicles: A Robotic Object to Enhance Passengers' Sense of Safety and Trust. Proceedings of the 2024 ACM/IEEE International Conference on Human-Robot Interaction, 86-95. doi:10.1145/3610977.3634998
  • (2024) "Sorry to Keep You Waiting": Recovering from Negative Consequences Resulting from Service Robot Unintended Rejection. Proceedings of the 2024 ACM/IEEE International Conference on Human-Robot Interaction, 96-105. doi:10.1145/3610977.3634959
  • (2024) Exploring Empathetic Interactions: The Impact of Sound and Reactions in Human-Robot Relations Among University Students. Human-Computer Interaction, 19-28. doi:10.1007/978-3-031-60412-6_2
