
Time-of-flight-assisted Kinect camera-based people detection for intuitive human robot cooperation in the surgical operating room

  • Original Article
  • Published:
International Journal of Computer Assisted Radiology and Surgery

Abstract

Background

Scene supervision is a major tool for making medical robots safer and more intuitive. This paper presents an approach that efficiently uses 3D cameras within the surgical operating room to enable safe human–robot interaction and action perception. Additionally, the presented approach aims to make 3D camera-based scene supervision more reliable and accurate.

Methods

A camera system composed of multiple Kinect and time-of-flight cameras was designed, implemented, and calibrated. Methods for calibration, object detection, and people tracking were designed and evaluated.
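The paper itself gives no code; purely as an illustration, the extrinsic calibration of such a multi-camera setup can be reduced to finding the rigid transform between corresponding 3D points seen by two cameras. The sketch below (not the authors' implementation) uses the standard SVD-based least-squares solution; the function name and the synthetic test data are assumptions.

```python
import numpy as np

def rigid_transform(src, dst):
    """Least-squares rigid transform (R, t) mapping src onto dst.

    src, dst: (N, 3) arrays of corresponding 3D points.
    Returns rotation R (3x3) and translation t (3,).
    """
    c_src = src.mean(axis=0)
    c_dst = dst.mean(axis=0)
    H = (src - c_src).T @ (dst - c_dst)        # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))     # guard against a reflection
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = c_dst - R @ c_src
    return R, t

# Synthetic check: recover a known camera-to-camera transform.
rng = np.random.default_rng(0)
pts_a = rng.uniform(-1, 1, size=(100, 3))      # points in camera A frame
angle = np.deg2rad(30)
R_true = np.array([[np.cos(angle), -np.sin(angle), 0.0],
                   [np.sin(angle),  np.cos(angle), 0.0],
                   [0.0,            0.0,           1.0]])
t_true = np.array([0.5, -0.2, 1.0])
pts_b = pts_a @ R_true.T + t_true              # same points in camera B frame
R, t = rigid_transform(pts_a, pts_b)
residual = np.linalg.norm(pts_a @ R.T + t - pts_b, axis=1).max()
```

With noise-free correspondences the residual is at numerical precision; with real depth data the fit error would be on the order of the sensor noise, which is what the registration accuracy reported below quantifies.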

Results

The camera system achieves a registration accuracy of 0.05 m. People tracking is reliable and accurate and was evaluated in an experimental setup with subjects wearing operating room clothing. Robot detection shows an error of approximately 0.04 m.

Conclusions

The robustness and accuracy of the approach allow for integration into the modern operating room. The data output can be used directly for situation and workflow detection as well as for collision avoidance.
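To make the collision-avoidance use concrete: with registered point clouds of the robot and of tracked people, the core query is the minimum distance between the two sets, compared against a safety margin. The following is a minimal brute-force sketch; the margin value and function names are hypothetical, not taken from the paper.

```python
import numpy as np

SAFETY_MARGIN_M = 0.30   # hypothetical stop distance, not from the paper

def min_distance(robot_pts, human_pts):
    """Minimum Euclidean distance between two point clouds (brute force)."""
    # Pairwise differences via broadcasting: (N, 1, 3) - (1, M, 3) -> (N, M, 3)
    diff = robot_pts[:, None, :] - human_pts[None, :, :]
    return np.sqrt((diff ** 2).sum(axis=2)).min()

def should_stop(robot_pts, human_pts, margin=SAFETY_MARGIN_M):
    """True if any human point is closer to the robot than the margin."""
    return min_distance(robot_pts, human_pts) < margin

# Example: a hand 0.2 m from the robot flange falls inside the margin.
robot = np.array([[0.0, 0.0, 1.0], [0.0, 0.0, 1.2]])
human = np.array([[0.2, 0.0, 1.2], [0.8, 0.5, 1.5]])
```

A real system would use a spatial index (e.g. a k-d tree) instead of the O(N·M) broadcast, and would trigger a slowdown zone before the hard stop, but the decision logic is the same.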




Acknowledgments

This research was funded by the European Commission's Seventh Framework Programme within the projects Patient Safety in Robotic Surgery (SAFROS) under Grant No. 248960 and Active Constraints Technologies for Ill-defined or Volatile Environments (ACTIVE) under Grant No. 270460. The authors thank the EU for its financial support. The authors thank NVIDIA (USA) for providing two NVIDIA GeForce GTX Titan graphics adapters for the research shown in this paper.

Author information


Corresponding author

Correspondence to Tim Beyl.

Ethics declarations

Conflict of interest

The authors declare that they have no conflict of interest.

Rights and permissions


About this article


Cite this article

Beyl, T., Nicolai, P., Comparetti, M.D. et al. Time-of-flight-assisted Kinect camera-based people detection for intuitive human robot cooperation in the surgical operating room. Int J CARS 11, 1329–1345 (2016). https://doi.org/10.1007/s11548-015-1318-7

