
Hybrid Tracking for Improved Registration of Laparoscopic Ultrasound and Laparoscopic Video for Augmented Reality

  • Conference paper
Computer Assisted and Robotic Endoscopy and Clinical Image-Based Procedures (CARE 2017, CLIP 2017)

Abstract

Laparoscopic augmented reality (AR) improves the surgeon's use of multimodal visual data during a procedure by fusing medical image data (e.g., ultrasound images) onto live laparoscopic video. The majority of AR studies are based on either computer vision-based or hardware-based (e.g., optical and electromagnetic tracking) approaches. However, both approaches introduce registration errors because of variable operating conditions. To alleviate this problem, we propose a novel hybrid tracking approach that combines hardware-based and computer vision-based tracking. It consists of the registration of an ultrasound image with a time-matched video frame using electromagnetic tracking, followed by a computer vision-based refinement of the registration and subsequent fusion. Experimental results demonstrate not only the feasibility of the proposed concept but also the improved tracking accuracy it provides and its potential for integration into a future clinical AR system.
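
The abstract describes a two-step pipeline: an electromagnetic (EM) tracking-based registration of the ultrasound image to a time-matched video frame, followed by a computer vision-based refinement before fusion. The sketch below is an illustrative outline of that idea only, not the authors' implementation; the camera intrinsics, the EM-derived transform, the ultrasound plane dimensions, and the translation-only refinement model are assumptions introduced here for demonstration.

```python
"""Minimal sketch of a hybrid EM + vision overlay refinement.

Illustrative only: all matrices, dimensions, and the refinement model
are hypothetical and are not taken from the paper.
"""
import numpy as np


def project_points(points_cam, K):
    """Pinhole projection of 3-D camera-frame points with intrinsics K."""
    uvw = (K @ points_cam.T).T            # (N, 3) homogeneous pixel coords
    return uvw[:, :2] / uvw[:, 2:3]       # (N, 2) pixel coordinates


def us_corners_in_camera(T_cam_us, width_mm, depth_mm):
    """Map the corners of the ultrasound image plane (in mm) into the
    camera frame using the EM-derived rigid transform T_cam_us (4x4)."""
    corners_us = np.array([
        [0.0,      0.0,      0.0, 1.0],
        [width_mm, 0.0,      0.0, 1.0],
        [width_mm, depth_mm, 0.0, 1.0],
        [0.0,      depth_mm, 0.0, 1.0],
    ])
    return (T_cam_us @ corners_us.T).T[:, :3]


def refine_overlay(pred_px, observed_px):
    """Vision-based refinement step (assumed model): estimate the 2-D
    translation that best aligns EM-predicted landmark positions with
    landmarks detected in the video frame."""
    return (observed_px - pred_px).mean(axis=0)


if __name__ == "__main__":
    # Hypothetical camera intrinsics and EM-derived US-to-camera transform.
    K = np.array([[800.0, 0.0, 320.0],
                  [0.0, 800.0, 240.0],
                  [0.0,   0.0,   1.0]])
    T_cam_us = np.eye(4)
    T_cam_us[:3, 3] = [10.0, -5.0, 120.0]   # US plane ~120 mm from camera

    # Step 1: EM-based registration -> predicted overlay corners in pixels.
    corners_cam = us_corners_in_camera(T_cam_us, width_mm=38.0, depth_mm=60.0)
    pred_px = project_points(corners_cam, K)

    # Step 2: vision-based refinement against (simulated) detected landmarks.
    observed_px = pred_px + np.array([4.0, -2.0])   # stand-in for a detector
    correction = refine_overlay(pred_px, observed_px)

    print("EM-predicted corners (px):\n", np.round(pred_px, 1))
    print("Vision-based correction (px):", correction)
```

In practice the correction would come from detecting the ultrasound probe or image features in the laparoscopic frame with a visual tracker rather than from the simulated landmarks used here, and the refinement could estimate more than an in-plane translation.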



Acknowledgement

This work was supported by the National Institutes of Health/National Cancer Institute under Grant CA192504.

Author information

Corresponding author

Correspondence to William Plishker.



Copyright information

© 2017 Springer International Publishing AG

About this paper

Cite this paper

Plishker, W., Liu, X., Shekhar, R. (2017). Hybrid Tracking for Improved Registration of Laparoscopic Ultrasound and Laparoscopic Video for Augmented Reality. In: Cardoso, M., et al. Computer Assisted and Robotic Endoscopy and Clinical Image-Based Procedures. CARE CLIP 2017. Lecture Notes in Computer Science, vol 10550. Springer, Cham. https://doi.org/10.1007/978-3-319-67543-5_17

  • DOI: https://doi.org/10.1007/978-3-319-67543-5_17

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-319-67542-8

  • Online ISBN: 978-3-319-67543-5

  • eBook Packages: Computer Science, Computer Science (R0)
