Abstract
Purpose
The growing number of technical systems in the operating room has increased attention on developing touchless interaction methods for sterile conditions. However, touchless interaction paradigms lack the tactile feedback found in common input devices such as mice and keyboards. We propose a novel touchless eye-tracking interaction system with auditory display as a feedback method for completing typical operating room tasks. Auditory display provides feedback on the input selected via the eye-tracking system as well as confirmation of the system response.
Methods
An eye-tracking system with a novel auditory display using both earcons and parameter-mapping sonification was developed to allow touchless interaction for six typical scrub nurse tasks. An evaluation with novice participants compared auditory display with visual display with respect to reaction time and a series of subjective measures.
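The abstract does not detail how the parameter-mapping sonification was implemented (the full paper used Pure Data). As an illustrative sketch only, the following assumes one plausible mapping: normalized gaze-dwell progress on a target is mapped exponentially to the pitch of a short feedback tone, so the tone rises as a selection nears confirmation. All names here (`map_dwell_to_pitch`, `sine_tone`, the 220–880 Hz range) are hypothetical choices, not the authors' design.

```python
import math


def map_dwell_to_pitch(progress, f_min=220.0, f_max=880.0):
    """Map normalized dwell progress in [0, 1] to a pitch in Hz.

    An exponential mapping is used because pitch perception is roughly
    logarithmic in frequency, so equal progress steps sound similar.
    """
    progress = min(max(progress, 0.0), 1.0)  # clamp to [0, 1]
    return f_min * (f_max / f_min) ** progress


def sine_tone(freq, duration=0.1, sample_rate=44100):
    """Render a short sine tone as a list of float samples in [-1, 1]."""
    n = int(duration * sample_rate)
    return [math.sin(2 * math.pi * freq * i / sample_rate) for i in range(n)]


# As gaze dwell on a target progresses, the feedback tone rises in pitch:
# 0% dwell -> 220 Hz, 50% -> 440 Hz, 100% -> 880 Hz.
for p in (0.0, 0.5, 1.0):
    samples = sine_tone(map_dwell_to_pitch(p))
```

In a real system the samples would be streamed to an audio output, and discrete earcons would additionally confirm completed selections.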
Results
When using auditory display to substitute for the lost tactile feedback during eye-tracking interaction, participants exhibited reduced reaction time compared to using visual-only display. In addition, the auditory feedback led to lower subjective workload and higher usefulness and system acceptance ratings.
Conclusion
Due to the absence of tactile feedback for eye-tracking and other touchless interaction methods, auditory display is shown to be a useful and necessary addition to new interaction concepts for the sterile operating room, reducing reaction times while improving subjective measures, including usefulness, user satisfaction, and cognitive workload.
Acknowledgements
The study was partly supported by National Institutes of Health Grants P41 EB015902, P41 EB015898, R01EB014955, and U24CA180918.
Ethics declarations
Conflict of interest
The authors declare that they have no conflict of interest.
Ethical approval
For this type of study, formal ethics approval was not required by the institutional ethics committee.
Informed consent
Informed consent was obtained from all individual participants included in the study.
Cite this article
Black, D., Unger, M., Fischer, N. et al. Auditory display as feedback for a novel eye-tracking system for sterile operating room interaction. Int J CARS 13, 37–45 (2018). https://doi.org/10.1007/s11548-017-1677-3