
Auditory display as feedback for a novel eye-tracking system for sterile operating room interaction

  • Original Article
International Journal of Computer Assisted Radiology and Surgery

Abstract

Purpose

The growing number of technical systems in the operating room has drawn increased attention to the development of touchless interaction methods for sterile conditions. However, touchless interaction paradigms lack the tactile feedback found in common input devices such as mice and keyboards. We propose a novel touchless eye-tracking interaction system with auditory display as a feedback method for completing typical operating room tasks. The auditory display provides feedback on the input selected via the eye-tracking system as well as confirmation of the system response.
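As an illustration of the interaction concept described above (not code from the paper), the following Python sketch shows a dwell-based gaze selection loop in which one auditory cue marks that an input target has been registered and a second cue confirms the system response. All names, thresholds, and helpers such as GazeSample and play_earcon are hypothetical.

```python
# Hypothetical sketch of dwell-based gaze selection with auditory feedback hooks.
# Names (GazeSample, play_earcon, dwell_select) are illustrative, not from the paper.
from dataclasses import dataclass

@dataclass
class GazeSample:
    x: float          # normalized screen coordinate in [0, 1]
    y: float
    timestamp: float  # seconds

DWELL_TIME = 0.8      # assumed dwell duration required to trigger a selection

def play_earcon(name: str) -> None:
    """Placeholder for the auditory display: a short, structured sound that
    signals which input was registered and that the system has responded."""
    print(f"[earcon] {name}")

def dwell_select(samples, targets):
    """Return the first target selected by gaze dwell, giving auditory feedback
    when the gaze lands on a target and again when the selection is confirmed."""
    current, dwell_start = None, None
    for s in samples:
        hit = next((name for name, (x0, y0, x1, y1) in targets.items()
                    if x0 <= s.x <= x1 and y0 <= s.y <= y1), None)
        if hit != current:
            current, dwell_start = hit, s.timestamp
            if hit is not None:
                play_earcon(f"focus:{hit}")      # feedback: input registered
        elif hit is not None and s.timestamp - dwell_start >= DWELL_TIME:
            play_earcon(f"confirm:{hit}")        # feedback: system response
            return hit
    return None
```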

Methods

An eye-tracking system with a novel auditory display using both earcons and parameter-mapping sonification was developed to allow touchless interaction for six typical scrub nurse tasks. An evaluation with novice participants compared the auditory display with a visual-only display with respect to reaction time and a series of subjective measures.
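The following minimal Python sketch illustrates the two auditory display techniques named above, an earcon and parameter-mapping sonification, by synthesizing short sine-tone signals. The specific frequencies, durations, and the mapping of dwell progress to pitch are assumptions for illustration and are not taken from the study.

```python
# Minimal sketch of the two auditory display techniques named in the Methods:
# an earcon (short, fixed tone motif) and parameter-mapping sonification
# (a continuous value mapped to pitch). All constants are assumptions.
import numpy as np

SR = 44100  # audio sample rate in Hz

def tone(freq_hz: float, dur_s: float, amp: float = 0.3) -> np.ndarray:
    """Synthesize a plain sine tone as an array of audio samples."""
    t = np.linspace(0.0, dur_s, int(SR * dur_s), endpoint=False)
    return amp * np.sin(2 * np.pi * freq_hz * t)

def earcon_confirm() -> np.ndarray:
    """Earcon: a fixed two-note motif used as a discrete confirmation cue."""
    return np.concatenate([tone(660, 0.08), tone(880, 0.12)])

def sonify_progress(progress: float) -> np.ndarray:
    """Parameter mapping: map a value in [0, 1] (e.g. dwell progress toward a
    selection) onto pitch between 300 Hz and 900 Hz."""
    freq = 300.0 + 600.0 * float(np.clip(progress, 0.0, 1.0))
    return tone(freq, 0.05)
```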

Results

When auditory display was used to substitute for the missing tactile feedback during eye-tracking interaction, participants exhibited reduced reaction times compared to a visual-only display. In addition, the auditory feedback led to lower subjective workload and higher usefulness and system acceptance ratings.
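For readers unfamiliar with how such a within-subject comparison of reaction times is typically analyzed, the sketch below runs a paired t-test on the two feedback conditions. The data are randomly generated placeholders and do not reproduce the study's results.

```python
# Illustrative analysis sketch for a within-subject reaction-time comparison
# between auditory and visual-only feedback; the numbers are invented.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n_participants = 20
rt_visual = rng.normal(1.10, 0.15, n_participants)    # seconds, hypothetical
rt_auditory = rng.normal(0.95, 0.15, n_participants)  # seconds, hypothetical

# Paired comparison: each participant completed both conditions.
t, p = stats.ttest_rel(rt_auditory, rt_visual)
print(f"mean auditory = {rt_auditory.mean():.3f} s, "
      f"mean visual = {rt_visual.mean():.3f} s, t = {t:.2f}, p = {p:.4f}")
```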

Conclusion

Given the absence of tactile feedback in eye-tracking and other touchless interaction methods, auditory display is shown to be a useful and necessary addition to new interaction concepts for the sterile operating room, reducing reaction times, improving ratings of usefulness and user satisfaction, and lowering cognitive workload.



Acknowledgements

The study was partly supported by National Institutes of Health Grants P41 EB015902, P41 EB015898, R01EB014955, and U24CA180918.

Author information

Corresponding author

Correspondence to David Black.

Ethics declarations

Conflict of interest

The authors state that they have no conflict of interest.

Ethical approval

For this kind of study, formal ethics approval was not required by the institutional ethics committee.

Informed consent

Informed consent was obtained from all individual participants included in the study.


About this article


Cite this article

Black, D., Unger, M., Fischer, N. et al. Auditory display as feedback for a novel eye-tracking system for sterile operating room interaction. Int J CARS 13, 37–45 (2018). https://doi.org/10.1007/s11548-017-1677-3
