A Framework for Multimodal Medical Image Interaction

Published: 01 November 2024

Abstract

Medical doctors rely on images of the human anatomy, such as magnetic resonance imaging (MRI), to localize regions of interest in the patient during diagnosis and treatment. Despite advances in medical imaging technology, information is still conveyed through a single modality: a visual representation that fails to capture the complexity of the real, multisensory interaction with human tissue. Yet perceiving multimodal information about the patient's anatomy and disease in real time is critical for the success of medical procedures and for patient outcomes. We introduce the Multimodal Medical Image Interaction (MMII) framework, which gives medical experts a dynamic, audiovisual interaction with human tissue in three-dimensional space. In a virtual reality environment, the user receives physically informed audiovisual feedback that improves the spatial perception of anatomical structures. MMII uses a model-based sonification approach to generate sounds derived from the geometry and physical properties of tissue, thereby eliminating the need for hand-crafted sound design. Two user studies, involving 34 general users and nine clinical experts, evaluated the framework's learnability, usability, and accuracy. Our results showed excellent learnability of the audiovisual correspondence: the rate of correct associations improved significantly ($p < 0.001$) over the course of the study. MMII also yielded superior brain tumor localization accuracy ($p < 0.05$) compared to conventional medical image interaction. These findings substantiate the potential of the framework to enhance interaction with medical images, for example during surgical procedures where immediate and precise feedback is needed.
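To make the model-based sonification idea concrete, the sketch below illustrates one way physically informed sound can be generated without hand-crafted sound design: a small modal-synthesis routine in which the frequencies and decay rates of a few damped sinusoids are derived from a tissue's stiffness, density, and damping. This is a minimal illustration under stated assumptions, not the authors' implementation; the function name, the frequency-mapping constant, and the example tissue values are hypothetical choices for demonstration only.

    import numpy as np

    SAMPLE_RATE = 44100  # audio samples per second

    def tissue_impact_sound(stiffness_pa, density_kg_m3, damping,
                            n_modes=5, duration_s=0.5):
        """Render a short impact sound as a sum of exponentially
        decaying sinusoids (modal synthesis).

        The fundamental frequency is scaled from the tissue's wave
        speed sqrt(E / rho); the scale factor of 40.0 below is an
        arbitrary assumption chosen to place soft-tissue sounds in an
        audible range, not a value from the paper.
        """
        t = np.linspace(0.0, duration_s, int(SAMPLE_RATE * duration_s),
                        endpoint=False)
        wave_speed = np.sqrt(stiffness_pa / density_kg_m3)  # m/s
        f0 = 40.0 * wave_speed                              # illustrative Hz mapping
        signal = np.zeros_like(t)
        for k in range(1, n_modes + 1):
            freq = f0 * k          # harmonic mode frequencies
            amp = 1.0 / k          # higher modes contribute less energy
            decay = damping * k    # higher modes die out faster
            signal += amp * np.exp(-decay * t) * np.sin(2.0 * np.pi * freq * t)
        return signal / np.max(np.abs(signal))

    # Example: soft brain-like tissue vs. a stiffer tumor-like inclusion.
    # Magnitudes are rough and illustrative only.
    soft = tissue_impact_sound(stiffness_pa=3_000, density_kg_m3=1_040, damping=8.0)
    stiff = tissue_impact_sound(stiffness_pa=30_000, density_kg_m3=1_040, damping=20.0)

Hearing the stiffer inclusion as a higher-pitched, faster-decaying sound is the kind of audiovisual correspondence the user studies measured; the two arrays could be written to WAV files (e.g., with scipy.io.wavfile.write) and compared by ear.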


Published In

IEEE Transactions on Visualization and Computer Graphics, Volume 30, Issue 11
Nov. 2024
476 pages

Publisher

IEEE Educational Activities Department

United States

Qualifiers

  • Research-article
