Abstract
The authors have proposed a coordinated motor display system called ARM-COMS (ARm-supported eMbodied COMmunication Monitor System), which detects the orientation of a subject's head by face tracking through image-processing technology and physically moves a monitor to mimic that head motion. The idea is that the monitor acts as the remote subject's avatar during video communication and interacts with the local subject. In addition, ARM-COMS responds appropriately to voice even when the remote subject's head movements cannot be detected, and it remains highly responsive by also reacting to the local subject's voice. This paper introduces the basic concept of ARM-COMS, describes its development, and explains how the basic procedures were implemented in a prototype system. The paper then presents the results of teleconferencing experiments using ARM-COMS and discusses the findings, including the effect of physical interaction through ARM-COMS, the camera-shake problem, and the evocation of emotional projection in remote communication.
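The core loop the abstract describes — estimating head orientation and driving the monitor actuator to mimic it — can be sketched in outline. The following is an illustrative sketch only, not the authors' implementation: the class name, angle limits, and smoothing factor are hypothetical. The exponential moving average shows one simple way jitter in the pose estimate could be damped, which relates to the camera-shake issue the paper discusses.

```python
def clamp(value, lo, hi):
    """Restrict a command to the actuator's safe range of motion."""
    return max(lo, min(hi, value))

class MonitorPoseController:
    """Hypothetical controller mapping head-pose estimates to pan/tilt commands."""

    def __init__(self, pan_limit=45.0, tilt_limit=30.0, smoothing=0.2):
        self.pan_limit = pan_limit    # degrees; assumed mechanical limit
        self.tilt_limit = tilt_limit  # degrees; assumed mechanical limit
        self.smoothing = smoothing    # EMA factor in (0, 1]; higher = more reactive
        self.pan = 0.0
        self.tilt = 0.0

    def update(self, head_yaw_deg, head_pitch_deg):
        """Blend a new head-pose estimate into the current motor command."""
        target_pan = clamp(head_yaw_deg, -self.pan_limit, self.pan_limit)
        target_tilt = clamp(head_pitch_deg, -self.tilt_limit, self.tilt_limit)
        a = self.smoothing
        # Exponential moving average damps frame-to-frame jitter in the estimate.
        self.pan = (1 - a) * self.pan + a * target_pan
        self.tilt = (1 - a) * self.tilt + a * target_tilt
        return self.pan, self.tilt
```

In a real pipeline the yaw/pitch inputs would come from a face-tracking toolkit and the outputs would be sent to the robotic-arm motors; both ends are omitted here to keep the sketch self-contained.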
Acknowledgement
This work was partly supported by JSPS KAKENHI Grant Number JP22K12131, the Science and Technology Award 2022 of the Okayama Foundation for Science and Technology, and the Original Research Grant 2022 of Okayama Prefectural University. The author would like to thank Dr. Takashi Oyama, Mr. Hiroki Kimachi, Mr. Shuto Misawa, Mr. Kengo Sadakane, and Mr. Tetsuo Kasahara for implementing the basic modules, and all members of the Kansei Information Engineering Labs at Okayama Prefectural University for their cooperation in conducting the experiments.
Copyright information
© 2023 The Author(s), under exclusive license to Springer Nature Switzerland AG
About this paper
Cite this paper
Ito, T., Watanabe, T. (2023). Coordinated Motor Display System of ARM-COMS for Evoking Emotional Projection in Remote Communication. In: Mori, H., Asahi, Y. (eds) Human Interface and the Management of Information. HCII 2023. Lecture Notes in Computer Science, vol 14015. Springer, Cham. https://doi.org/10.1007/978-3-031-35132-7_28
DOI: https://doi.org/10.1007/978-3-031-35132-7_28
Publisher Name: Springer, Cham
Print ISBN: 978-3-031-35131-0
Online ISBN: 978-3-031-35132-7
eBook Packages: Computer Science, Computer Science (R0)