Volume 15 Supplement 1

Proceedings of the International Cancer Imaging Society (ICIS) 15th Annual Teaching Course

Open Access

Augmented reality: 3D image-guided surgery

  • Archie Hughes-Hallett1,
  • Philip Pratt2,
  • James Dilley1,
  • Justin Vale1,
  • Ara Darzi1, 2 and
  • Erik Mayer1 (corresponding author)
Cancer Imaging 2015, 15(Suppl 1):O8

https://doi.org/10.1186/1470-7330-15-S1-O8

Published: 2 October 2015

Background

Over the last three decades, surgical practice has undergone a significant change, with a move towards minimally invasive surgery (MIS) as the standard of care [1]. Although MIS has brought significant benefits, it has also introduced new problems. Perhaps its most substantial limitation is the loss of haptic feedback; this deficit is at its most extreme in robot-assisted surgery, where at present such feedback is lost entirely [2].

The image-enhanced operating environment aims to mitigate the loss of haptic feedback by providing the surgeon with visual cues to the subsurface anatomy. Intraoperative image guidance can be divided into guidance for operative planning, for example the rapid identification of critical anatomical structures, and guidance for task execution, for example tumour resection [2]. These two steps have very different requirements: the first needs a large amount of anatomical information to be displayed, without the need to account for tissue deformation or to achieve accurate registration, while the second requires less information to be displayed, but with much greater spatial accuracy.

Methods

The solution proposed herein, the image-enhanced operating environment, utilises two different imaging modalities, exploiting their respective strengths to meet the differing needs of the two outlined steps of planning and execution. The platform has been built around the index procedure of robot-assisted partial nephrectomy, although its potential application extends well beyond this scope.

The first step, operative planning, utilises 3D reconstructions of preoperative cross-sectional imaging manipulated via a tablet-based interface [3]. This information is displayed to the surgeon both on the tablet and within the da Vinci console using the stereo TilePro™ function (Intuitive Surgical, Sunnyvale, CA).
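
To make this reconstruction step concrete, the sketch below extracts a surface mesh from a segmented cross-sectional volume using the marching cubes algorithm, via scikit-image in Python. It is a minimal illustration under assumed inputs (the synthetic volume, iso-surface threshold and voxel spacing are placeholders), not the authors' pipeline, which is detailed in [3].

    # Illustrative sketch only: extracting a surface mesh from a segmented
    # cross-sectional volume (e.g. CT) with marching cubes. The synthetic
    # volume, threshold and voxel spacing are placeholder assumptions; the
    # authors' actual reconstruction and display pipeline is described in [3].
    import numpy as np
    from skimage import measure

    def reconstruct_surface(volume, level, spacing=(1.0, 1.0, 1.0)):
        """Return (vertices, faces, normals) for the iso-surface at 'level'.
        'spacing' is the voxel size in mm, so vertices come out in millimetres."""
        verts, faces, normals, _ = measure.marching_cubes(
            volume, level=level, spacing=spacing)
        return verts, faces, normals

    # Synthetic sphere standing in for a segmented kidney volume.
    z, y, x = np.mgrid[-32:32, -32:32, -32:32]
    volume = (np.sqrt(x**2 + y**2 + z**2) < 20).astype(np.float32)
    verts, faces, normals = reconstruct_surface(volume, level=0.5,
                                                spacing=(1.5, 0.8, 0.8))
    print(len(verts), "vertices,", len(faces), "triangles")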

The second step, execution, utilises optically registered intraoperative ultrasound. Using a live imaging modality mitigates the problems of tissue deformation often faced when preoperative imaging is used for high-precision guidance. The ultrasound data are used to create freehand 3D reconstructions, which are overlaid onto the operative view [4].
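
For readers unfamiliar with tracked freehand ultrasound, the sketch below illustrates the generic transform chain that maps a point on the ultrasound image through the probe and tracker coordinate frames into the endoscopic camera view for overlay. Every matrix and the camera intrinsics are hypothetical placeholders; the authors' actual registration and reconstruction method is described in [4].

    # Illustrative sketch only: the transform chain typically used to overlay
    # an optically tracked ultrasound image onto the endoscopic camera view.
    # All matrices and intrinsics below are hypothetical placeholders.
    import numpy as np

    def project(K, p_cam):
        """Project a 3D point in the camera frame with pinhole intrinsics K."""
        uvw = K @ p_cam[:3]
        return uvw[:2] / uvw[2]

    # Assumed 4x4 rigid transforms (rotation + translation):
    T_cam_from_tracker = np.eye(4)    # tracker frame -> endoscope camera frame
    T_cam_from_tracker[:3, 3] = [0.0, 0.0, 100.0]  # tracker 100 mm ahead of camera
    T_tracker_from_probe = np.eye(4)  # live probe pose reported by the tracker
    T_probe_from_image = np.eye(4)    # B-scan plane -> probe marker (calibration)

    K = np.array([[800.0, 0.0, 320.0],  # hypothetical camera intrinsics
                  [0.0, 800.0, 240.0],
                  [0.0, 0.0, 1.0]])

    # A point on the ultrasound B-scan, in millimetres in the image plane.
    p_image = np.array([12.5, 30.0, 0.0, 1.0])

    # Chain: image -> probe -> tracker -> camera, then project to pixels.
    p_cam = T_cam_from_tracker @ T_tracker_from_probe @ T_probe_from_image @ p_image
    u, v = project(K, p_cam)
    print("overlay pixel: (%.1f, %.1f)" % (u, v))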

Results

To date, over 60 cases have been undertaken using the tablet-based planning component of the image-enhanced operating environment. Over the course of this series, a subjective benefit has been demonstrated through the analysis of prospectively collected questionnaire results. In addition, the platform has demonstrated objective safety, with no detrimental effects observed on outcome parameters. The use of registered ultrasound has been demonstrated in vivo [5]; the results of an ex vivo study of its potential efficacy are awaited.

Conclusions

Replacing haptic feedback with visual cues to subsurface anatomy offers a number of potential direct benefits to the patient, including improved resection quality and a reduction in positive surgical margins. Beyond these direct benefits, an image-enhanced operating environment could also influence case selection, with surgeons prepared to take on anatomically more challenging cases via a minimally invasive approach because of the improved understanding the image guidance platform gives them.

Declarations

Acknowledgements

The authors are grateful for support from the NIHR Biomedical Research Centre funding scheme.

Authors’ Affiliations

(1) Department of Surgery and Cancer, Imperial College London
(2) Hamlyn Centre for Robotic Surgery, Imperial College London

References

  1. Nicolau S, Soler L, Mutter D, Marescaux J: Augmented reality in laparoscopic surgical oncology. Surg Oncol. 2011, 20: 189-201. doi:10.1016/j.suronc.2011.07.002
  2. Hughes-Hallett A, Pratt P, Mayer E, Martin S, Darzi A, Vale J, Marcus H, Cundy T: Augmented reality partial nephrectomy: examining the current status and future perspectives. Urology. 2014, 83: 266-273. doi:10.1016/j.urology.2013.08.049
  3. Hughes-Hallett A, Pratt P, Mayer E, Martin S, Darzi A, Vale J: Image guidance for all - TilePro™ display of three-dimensionally reconstructed images in robotic partial nephrectomy. Urology. 2014, 84: 237-242. doi:10.1016/j.urology.2014.02.051
  4. Pratt P, Hughes-Hallett A, Di Marco A, Cundy T, Mayer E, Vale J, Darzi A, Yang G-Z: Multimodal reconstruction for image-guided interventions. Proceedings of the Hamlyn Symposium. 2013, 59-60.
  5. Hughes-Hallett A, Pratt P, Mayer E, Di Marco A, Yang G-Z, Vale J, Darzi A: Intraoperative ultrasound overlay in robot-assisted partial nephrectomy: first clinical experience. Eur Urol. 2013, 65: 671-672.

Copyright

© Hughes-Hallett et al. 2015

This article is published under license to BioMed Central Ltd. This is an Open Access article distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/4.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated.
