An integrated augmented reality surgical navigation platform using multi-modality imaging for guidance.

Bibliographic Details
Main Authors: Harley H L Chan, Stephan K Haerle, Michael J Daly, Jinzi Zheng, Lauren Philp, Marco Ferrari, Catriona M Douglas, Jonathan C Irish
Format: Article
Language: English
Published: Public Library of Science (PLoS) 2021-01-01
Series: PLoS ONE
Online Access: https://journals.plos.org/plosone/article/file?id=10.1371/journal.pone.0250558&type=printable
author Harley H L Chan
Stephan K Haerle
Michael J Daly
Jinzi Zheng
Lauren Philp
Marco Ferrari
Catriona M Douglas
Jonathan C Irish
author_sort Harley H L Chan
collection DOAJ
description An integrated augmented reality (AR) surgical navigation system has the potential to improve intra-operative visualization of concealed anatomical structures. Integrating real-time tracking technology with a laser pico-projector allows the surgical surface to be augmented with projected virtual images of lesions and critical structures derived from multi-modality imaging. We aimed to quantitatively and qualitatively evaluate the performance of a prototype interactive AR surgical navigation system through a series of pre-clinical studies. Four pre-clinical animal studies using xenograft mouse models were conducted to investigate system performance. A combination of CT, PET, SPECT, and MRI images was used to augment the mouse body during image-guided procedures to assess feasibility. A phantom with machined features was used to quantitatively estimate system accuracy. All image-guided procedures were performed successfully. The tracked pico-projector correctly and reliably projected virtual images onto the animal body, highlighting the locations of the tumour and surrounding anatomical structures. The phantom study demonstrated a system accuracy of 0.55 ± 0.33 mm. This paper presents a prototype real-time-tracked AR surgical navigation system that improves visualization of underlying critical structures by overlaying virtual images onto the surgical site. This proof-of-concept pre-clinical study demonstrated both the clinical applicability and the sub-millimetre (< 1 mm) accuracy of the system.
format Article
id doaj-art-90aa6992f2f14bb28757d644fbd00e04
institution OA Journals
issn 1932-6203
language English
publishDate 2021-01-01
publisher Public Library of Science (PLoS)
record_format Article
series PLoS ONE
spelling PLoS ONE, Vol 16, Iss 4, Article e0250558 (2021-01-01). Public Library of Science (PLoS). ISSN 1932-6203. doi:10.1371/journal.pone.0250558. Record doaj-art-90aa6992f2f14bb28757d644fbd00e04 (2025-08-20T02:17:48Z). https://journals.plos.org/plosone/article/file?id=10.1371/journal.pone.0250558&type=printable
title An integrated augmented reality surgical navigation platform using multi-modality imaging for guidance.
title_sort integrated augmented reality surgical navigation platform using multi modality imaging for guidance
url https://journals.plos.org/plosone/article/file?id=10.1371/journal.pone.0250558&type=printable