Abstract
MuMMER (MultiModal Mall Entertainment Robot) is a four-year, EU-funded project with the overall goal of developing a humanoid robot (SoftBank Robotics’ Pepper robot being the primary platform) with the social intelligence to interact autonomously and naturally in the dynamic environments of a public shopping mall, providing an engaging and entertaining experience to the general public. Using co-design methods, we will work together with stakeholders, including customers, retailers, and business managers, to develop truly engaging robot behaviours. Crucially, our robot will exhibit behaviour that is socially appropriate and engaging by combining speech-based interaction with non-verbal communication and human-aware navigation. To support this behaviour, we will develop and integrate new methods from audiovisual scene processing, social-signal processing, high-level action selection, and human-aware robot navigation. Throughout the project, the robot will be regularly deployed in Ideapark, a large public shopping mall in Finland. This position paper describes the MuMMER project: the needs it addresses, its objectives, the R&D challenges, and our approach. It will serve as a reference for the robotics community and stakeholders on this ambitious project, demonstrating how a co-design approach can address some of the barriers to real-world deployment and help in building follow-up projects.
Acknowledgements
This research has been partially funded by the European Union’s Horizon 2020 research and innovation programme under grant agreement no. 688147 (MuMMER, mummer-project.eu).