Abstract
We have been developing a surgical support system for Robot-Assisted Laparoscopic Partial Nephrectomy (RAPN) using augmented reality (AR) technology since April 2014. In our system, three-dimensional computer graphics (3DCG) models of the kidney, arteries, veins, tumor, and urinary tract are generated preoperatively from tomographic (DICOM) images. The 3DCG models are superimposed on the endoscopic images and displayed on the operator's console and the operating room monitor. The position and orientation of the 3DCG models are controlled automatically in real time according to the movement of the endoscope camera, and an assistant can manually adjust their display position and transparency when necessary. We are now developing a VR system that allows the 3DCG models to be manipulated manually and intuitively. In this paper, we describe the details of this system.
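To illustrate the kind of manual, intuitive VR manipulation described above, the following is a minimal Unity C# sketch, not the authors' actual implementation. It assumes the Oculus Integration package (OVRInput), a right Touch controller, and an organ model with a transparency-capable material: holding the index trigger grabs the model so it follows the controller, and the grip trigger sets its opacity. Controller poses are read in tracking space, which is assumed here to coincide with the world origin.

```csharp
using UnityEngine;

// Hypothetical sketch: grab a 3DCG organ model with the right Touch controller
// and adjust its transparency with the grip trigger.
// Assumes the Oculus Integration package (OVRInput) and a material whose shader
// supports an adjustable alpha channel (e.g., a transparent Standard shader).
public class OrganModelManipulator : MonoBehaviour
{
    [SerializeField] private Renderer organRenderer;   // renderer of the 3DCG organ model
    [Range(0f, 1f)] public float transparency = 0.5f;  // 0 = invisible, 1 = opaque

    private bool grabbing;
    private Vector3 grabPositionOffset;
    private Quaternion grabRotationOffset;

    void Update()
    {
        // Controller pose in tracking space (assumed aligned with world space here).
        Vector3 handPos = OVRInput.GetLocalControllerPosition(OVRInput.Controller.RTouch);
        Quaternion handRot = OVRInput.GetLocalControllerRotation(OVRInput.Controller.RTouch);

        // Start grabbing when the index trigger is pressed.
        if (OVRInput.GetDown(OVRInput.Button.PrimaryIndexTrigger, OVRInput.Controller.RTouch))
        {
            grabbing = true;
            // Remember the model pose relative to the hand at grab time.
            grabRotationOffset = Quaternion.Inverse(handRot) * transform.rotation;
            grabPositionOffset = Quaternion.Inverse(handRot) * (transform.position - handPos);
        }
        if (OVRInput.GetUp(OVRInput.Button.PrimaryIndexTrigger, OVRInput.Controller.RTouch))
        {
            grabbing = false;
        }

        // While grabbed, the model follows the controller rigidly.
        if (grabbing)
        {
            transform.SetPositionAndRotation(
                handPos + handRot * grabPositionOffset,
                handRot * grabRotationOffset);
        }

        // Grip trigger (0..1) maps directly to model transparency.
        float grip = OVRInput.Get(OVRInput.Axis1D.PrimaryHandTrigger, OVRInput.Controller.RTouch);
        transparency = 1f - grip;
        Color c = organRenderer.material.color;
        c.a = transparency;
        organRenderer.material.color = c;
    }
}
```

Keeping the grab relative to the pose at grab time (rather than snapping the model to the controller) is one way to let an assistant nudge the overlay gently, which matches the goal of small manual corrections to the automatically tracked 3DCG models.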