Abstract
We developed a Virtual Reality (VR) based telepresence system that provides a novel immersive experience for remote conversation and collaboration. By wearing VR headsets, all participants are gathered into the same virtual space, where 3D cartoon avatars represent them. The avatars realistically emulate the head postures, facial expressions, and hand motions of the participants, enabling enjoyable group-to-group conversations among people who are spatially separated. Moreover, our VR telepresence system offers distinctly new modes of remote collaboration: users can view presentation slides or watch videos together, or cooperate on solving a math problem on a virtual blackboard, all of which can hardly be achieved with conventional video-based telepresence systems. Experiments show that our system provides an unprecedented immersive experience for tele-conversation and opens new possibilities for remote collaboration.
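The abstract describes avatars that mirror each participant's head pose, facial expression, and hand motion across the network. A minimal sketch of how such per-frame avatar state might be serialized for transmission is shown below; the field names and counts (quaternion head pose, eight blendshape weights, five finger-bend angles) are illustrative assumptions, not details taken from the paper.

```python
import struct

# Hypothetical per-frame avatar state: head pose as a quaternion,
# a few facial blendshape weights, and one bend angle per finger.
HEAD_FIELDS = 4          # quaternion (x, y, z, w)
BLENDSHAPES = 8          # e.g. brow, blink, jaw, smile, ...
HAND_JOINTS = 5          # one bend angle per finger

# Little-endian float32 for every field: a fixed-size packet.
FMT = "<" + "f" * (HEAD_FIELDS + BLENDSHAPES + HAND_JOINTS)

def pack_state(head, blendshapes, hand):
    """Serialize one avatar frame into a fixed-size byte packet."""
    assert len(head) == HEAD_FIELDS
    assert len(blendshapes) == BLENDSHAPES
    assert len(hand) == HAND_JOINTS
    return struct.pack(FMT, *head, *blendshapes, *hand)

def unpack_state(packet):
    """Inverse of pack_state: recover the three field groups."""
    vals = struct.unpack(FMT, packet)
    return (vals[:HEAD_FIELDS],
            vals[HEAD_FIELDS:HEAD_FIELDS + BLENDSHAPES],
            vals[HEAD_FIELDS + BLENDSHAPES:])

pkt = pack_state((0.0, 0.0, 0.0, 1.0), (0.0,) * 8, (0.2,) * 5)
print(len(pkt))   # 17 float32 fields -> 68-byte packet
```

A fixed binary layout like this keeps per-frame packets small and constant-size, which suits the low-latency, high-rate updates a real-time telepresence system would need; an actual implementation would add timestamps, participant IDs, and a transport such as UDP.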
Acknowledgements
This work was supported by Research Grant of Beijing Higher Institution Engineering Research Center and the People Programme (Marie Curie Actions) of the European Union’s Seventh Framework Programme (MC-IRSES, grant No. 612627).
Copyright information
© 2017 Springer International Publishing AG
Cite this paper
Tan, Z., Hu, Y., Xu, K. (2017). Virtual Reality Based Immersive Telepresence System for Remote Conversation and Collaboration. In: Chang, J., Zhang, J., Magnenat Thalmann, N., Hu, SM., Tong, R., Wang, W. (eds) Next Generation Computer Animation Techniques. AniNex 2017. Lecture Notes in Computer Science(), vol 10582. Springer, Cham. https://doi.org/10.1007/978-3-319-69487-0_17
Print ISBN: 978-3-319-69486-3
Online ISBN: 978-3-319-69487-0