Abstract
Purpose
A virtual reality (VR) system in which surgeons can practice procedures on virtual anatomy offers a scalable, cost-effective alternative to cadaveric training. Because such virtual surgeries are fully digitized, they can also be used to assess a surgeon's skills with measurements that are otherwise difficult to collect in reality. We therefore present the Fully Immersive Virtual Reality System (FIVRS) for skull-base surgery, which combines surgical simulation software with a high-fidelity hardware setup.
Methods
FIVRS allows surgeons to follow normal clinical workflows inside the VR environment. It uses advanced rendering designs and drilling algorithms for realistic bone ablation. A head-mounted display with ergonomics similar to those of a surgical microscope improves immersion. Extensive multi-modal data, including eye gaze, motion, force, and video of the surgery, are recorded for post-analysis. A user-friendly interface is also designed to ease the learning curve of using FIVRS.
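As an illustration only, the per-frame multi-modal logging described above might be organized as structured records serialized to a line-oriented format; the field names and schema below are assumptions for exposition, not the authors' actual data format.

```python
# Hypothetical sketch of a per-frame multi-modal record for a VR surgical
# simulator: timestamped eye gaze, tool pose, force, and ablation counts.
from dataclasses import dataclass, asdict
import json


@dataclass
class FrameRecord:
    t: float            # timestamp in seconds since trial start
    gaze: tuple         # (x, y) normalized eye-gaze point on the display
    tool_pose: tuple    # (x, y, z, qx, qy, qz, qw) drill pose
    force: tuple        # (fx, fy, fz) contact force in newtons
    voxels_removed: int  # bone voxels ablated during this frame


def serialize(records):
    """Serialize a trial's records to JSON lines for offline analysis."""
    return "\n".join(json.dumps(asdict(r)) for r in records)


rec = FrameRecord(t=0.016, gaze=(0.5, 0.5),
                  tool_pose=(0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1.0),
                  force=(0.0, 0.0, 0.3), voxels_removed=12)
print(serialize([rec]))
```

Logging one flat record per rendered frame keeps the streams time-aligned, which simplifies the kind of post-hoc skill analysis the abstract describes.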
Results
We present results from a user study involving surgeons with varying levels of expertise. The preliminary data recorded by FIVRS differentiate between participants of different expertise levels, suggesting promise for future research on automatic skill assessment. Furthermore, informal feedback from the study participants on the system's intuitiveness and immersiveness was positive.
Conclusion
We present FIVRS, a fully immersive VR system for skull-base surgery. FIVRS features a realistic software simulation coupled with modern hardware for improved realism. The system is completely open source and provides feature-rich data in an industry-standard format.
Acknowledgements
This work was supported by NSF OISE-1927354 and OISE-1927275, NIDCD K08 Grant DC019708, a grant from Galen Robotics, and an agreement between Johns Hopkins University and the Multi-Scale Medical Robotics Centre, Ltd.
Ethics declarations
Conflict of interest
Russell Taylor and JHU may be entitled to royalty payments related to technology discussed in this paper, and Dr. Taylor has received or may receive some portion of these royalties. Also, Dr. Taylor is a paid consultant to and owns equity in Galen Robotics, Inc. These arrangements have been reviewed and approved by JHU in accordance with its conflict of interest policy.
Supplementary Information
Below is the link to the electronic supplementary material.
Supplementary file 1 (MP4, 16,176 KB)
About this article
Cite this article
Munawar, A., Li, Z., Nagururu, N. et al. Fully immersive virtual reality for skull-base surgery: surgical training and beyond. Int J CARS 19, 51–59 (2024). https://doi.org/10.1007/s11548-023-02956-5