Abstract
The receptionist job, which consists of providing useful directions to visitors in a public office, is one possible employment for social robots. The design and behaviour of robots intended to be integrated into human societies are crucial issues, and they depend on the culture and society in which the robot is to be deployed. We study the factors that could inform the design of a receptionist robot in Brazil, a country with a mix of races and considerable gaps in economic and educational levels. This inequality results in the presence of functionally illiterate people, who are unable to apply reading, writing and numeracy skills. We invited Brazilian participants, including a group of functionally illiterate subjects, to interact with two types of receptionists differing in physical appearance (agent vs. mechanical robot) and in voice (human-like vs. mechanical). Results gathered during the interactions indicate a preference for the agent and for the human-like voice, as well as a more intense reaction to stimuli among the functionally illiterate participants. These results provide useful indications to be considered when designing a receptionist robot, as well as insights into the effect of illiteracy on the interaction.
© 2017 Gabriele Trovato et al.
This work is licensed under the Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 License.