Humanizing Chatbots for Interactive Marketing

Chapter in The Palgrave Handbook of Interactive Marketing

Abstract

Chatbots, also known as conversational agents, are automated computer programs powered by natural language processing and designed to engage consumers in interactive, one-on-one, personalized text- or voice-based conversations. Focusing on text-based, anthropomorphic social chatbots that can be easily implemented on various digital platforms, this chapter synthesizes the literature on computer-mediated communication and human–computer interaction to provide a comprehensive review of the pivotal factors that enhance chatbots’ perceived humanness and the key considerations in consumer–chatbot interactions and relationships. Specifically, the chapter first discusses research findings on the persuasiveness of computer-controlled chatbots relative to human-controlled avatars. It then delves into the various anthropomorphic cues used in chatbot design and messaging, including human identity cues, verbal cues, and non-verbal cues. Strategies and examples for chatbots to communicate humility, thereby mitigating consumers’ frustration when chatbots fail to meet their expectations, are also provided. The chapter next addresses two of the most widely studied mediators of chatbot anthropomorphism (social presence and parasocial interaction) and the under-researched role of emotion in consumer–chatbot interactions. It concludes with a discussion of the “uncanny valley” effect, that is, people’s feelings of eeriness toward highly human-like chatbots.



Author information

Correspondence to Wan-Hsiu Sunny Tsai.


Copyright information

© 2023 The Author(s), under exclusive license to Springer Nature Switzerland AG

About this chapter


Cite this chapter

Tsai, WH.S., Chuan, CH. (2023). Humanizing Chatbots for Interactive Marketing. In: Wang, C.L. (eds) The Palgrave Handbook of Interactive Marketing. Palgrave Macmillan, Cham. https://doi.org/10.1007/978-3-031-14961-0_12
