Co-localized augmented human and X-ray observers in collaborative surgical ecosystem

Javad Fotouhi, Mathias Unberath, Tianyu Song, Jonas Hajek, Sing Chun Lee, Bastian Bier, Andreas Maier, Greg Osgood, Mehran Armand, Nassir Navab

Research output: Contribution to journal › Article

Abstract

Purpose: Image-guided percutaneous interventions are safer alternatives to conventional orthopedic and trauma surgeries. To advance surgical tools in complex bony structures during these procedures with confidence, a large number of images is acquired. While image-guidance is the de facto standard to guarantee acceptable outcome, when these images are presented on monitors far from the surgical site the information content cannot be associated easily with the 3D patient anatomy.

Methods: In this article, we propose a collaborative augmented reality (AR) surgical ecosystem to jointly co-localize the C-arm X-ray and surgeon viewer. The technical contributions of this work include (1) joint calibration of a visual tracker on a C-arm scanner and its X-ray source via a hand-eye calibration strategy, and (2) inside-out co-localization of human and X-ray observers in shared tracking and augmentation environments using vision-based simultaneous localization and mapping.

Results: We present a thorough evaluation of the hand-eye calibration procedure. Results suggest convergence when using 50 pose pairs or more. The mean translation and rotation errors at convergence are 5.7 mm and 0.26°, respectively. Further, user-in-the-loop studies were conducted to estimate the end-to-end target augmentation error. The mean distance between landmarks in real and virtual environment was 10.8 mm.

Conclusions: The proposed AR solution provides a shared augmented experience between the human and X-ray viewer. The collaborative surgical AR system has the potential to simplify hand-eye coordination for surgeons or intuitively inform C-arm technologists for prospective X-ray view-point planning.
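The hand-eye calibration named in Methods is conventionally posed as solving AX = XB, where A and B are relative motions of the two rigidly coupled devices (here, the C-arm's X-ray source and its visual tracker) and X is the sought fixed transform between them. The paper's own solver is not reproduced here; the following is a minimal sketch of the classic Tsai-Lenz-style two-step solution (rotation from paired rotation axes via orthogonal Procrustes, then translation by stacked least squares) on synthetic motion pairs. The function names and the synthetic-data setup are illustrative assumptions, not the authors' implementation; OpenCV's `cv2.calibrateHandEye` offers a production-grade alternative.

```python
import numpy as np

def rodrigues(axis, angle):
    """Rotation matrix from an axis-angle pair via the Rodrigues formula."""
    axis = axis / np.linalg.norm(axis)
    K = np.array([[0.0, -axis[2], axis[1]],
                  [axis[2], 0.0, -axis[0]],
                  [-axis[1], axis[0], 0.0]])
    return np.eye(3) + np.sin(angle) * K + (1.0 - np.cos(angle)) * (K @ K)

def rotation_axis(R):
    """Unit rotation axis from the skew-symmetric part of R
    (valid away from 0 and 180 degree rotations)."""
    v = np.array([R[2, 1] - R[1, 2], R[0, 2] - R[2, 0], R[1, 0] - R[0, 1]])
    return v / np.linalg.norm(v)

def hand_eye_tsai(motions_a, motions_b):
    """Solve A_i X = X B_i for X = (R_x, t_x).

    motions_a / motions_b: paired lists of (R, t) relative motions of the
    two rigidly coupled sensors (e.g. X-ray source and visual tracker).
    """
    # Rotation step: conjugation R_a = R_x R_b R_x^T maps the rotation
    # axis of each B_i onto that of A_i, so a_i = R_x b_i. R_x is then
    # the orthogonal Procrustes solution over the paired axes.
    M = sum(np.outer(rotation_axis(Ra), rotation_axis(Rb))
            for (Ra, _), (Rb, _) in zip(motions_a, motions_b))
    U, _, Vt = np.linalg.svd(M)
    R_x = U @ np.diag([1.0, 1.0, np.linalg.det(U @ Vt)]) @ Vt
    # Translation step: (R_a - I) t_x = R_x t_b - t_a, stacked over all
    # motion pairs and solved in the least-squares sense.
    A = np.vstack([Ra - np.eye(3) for Ra, _ in motions_a])
    b = np.concatenate([R_x @ tb - ta
                        for (_, ta), (_, tb) in zip(motions_a, motions_b)])
    t_x, *_ = np.linalg.lstsq(A, b, rcond=None)
    return R_x, t_x
```

With noisy real pose pairs the same least-squares structure simply accumulates more rows per pair, which is consistent with the abstract's observation that accuracy stabilizes once roughly 50 pose pairs are used.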

Keywords

  • Augmented reality
  • C-arm
  • Calibration
  • Surgery
  • X-ray

ASJC Scopus subject areas

  • Surgery
  • Biomedical Engineering
  • Radiology, Nuclear Medicine and Imaging
  • Computer Vision and Pattern Recognition
  • Health Informatics
  • Computer Science Applications
  • Computer Graphics and Computer-Aided Design

Cite this

Co-localized augmented human and X-ray observers in collaborative surgical ecosystem. / Fotouhi, Javad; Unberath, Mathias; Song, Tianyu; Hajek, Jonas; Lee, Sing Chun; Bier, Bastian; Maier, Andreas; Osgood, Greg; Armand, Mehran; Navab, Nassir.

In: International Journal of Computer Assisted Radiology and Surgery, 01.01.2019.

Research output: Contribution to journal › Article

Fotouhi, Javad ; Unberath, Mathias ; Song, Tianyu ; Hajek, Jonas ; Lee, Sing Chun ; Bier, Bastian ; Maier, Andreas ; Osgood, Greg ; Armand, Mehran ; Navab, Nassir. / Co-localized augmented human and X-ray observers in collaborative surgical ecosystem. In: International Journal of Computer Assisted Radiology and Surgery. 2019.
@article{16f0b685b58d4b56bb44b0b08711f155,
title = "Co-localized augmented human and X-ray observers in collaborative surgical ecosystem",
abstract = "Purpose: Image-guided percutaneous interventions are safer alternatives to conventional orthopedic and trauma surgeries. To advance surgical tools in complex bony structures during these procedures with confidence, a large number of images is acquired. While image-guidance is the de facto standard to guarantee acceptable outcome, when these images are presented on monitors far from the surgical site the information content cannot be associated easily with the 3D patient anatomy. Methods: In this article, we propose a collaborative augmented reality (AR) surgical ecosystem to jointly co-localize the C-arm X-ray and surgeon viewer. The technical contributions of this work include (1) joint calibration of a visual tracker on a C-arm scanner and its X-ray source via a hand-eye calibration strategy, and (2) inside-out co-localization of human and X-ray observers in shared tracking and augmentation environments using vision-based simultaneous localization and mapping. Results: We present a thorough evaluation of the hand-eye calibration procedure. Results suggest convergence when using 50 pose pairs or more. The mean translation and rotation errors at convergence are 5.7 mm and 0.26∘, respectively. Further, user-in-the-loop studies were conducted to estimate the end-to-end target augmentation error. The mean distance between landmarks in real and virtual environment was 10.8 mm. Conclusions: The proposed AR solution provides a shared augmented experience between the human and X-ray viewer. The collaborative surgical AR system has the potential to simplify hand-eye coordination for surgeons or intuitively inform C-arm technologists for prospective X-ray view-point planning.",
keywords = "Augmented reality, C-arm, Calibration, Surgery, X-ray",
author = "Javad Fotouhi and Mathias Unberath and Tianyu Song and Jonas Hajek and Lee, {Sing Chun} and Bastian Bier and Andreas Maier and Greg Osgood and Mehran Armand and Nassir Navab",
year = "2019",
month = "1",
day = "1",
doi = "10.1007/s11548-019-02035-8",
language = "English (US)",
journal = "International Journal of Computer Assisted Radiology and Surgery",
issn = "1861-6410",
publisher = "Springer Verlag",

}

TY - JOUR

T1 - Co-localized augmented human and X-ray observers in collaborative surgical ecosystem

AU - Fotouhi, Javad

AU - Unberath, Mathias

AU - Song, Tianyu

AU - Hajek, Jonas

AU - Lee, Sing Chun

AU - Bier, Bastian

AU - Maier, Andreas

AU - Osgood, Greg

AU - Armand, Mehran

AU - Navab, Nassir

PY - 2019/1/1

Y1 - 2019/1/1

N2 - Purpose: Image-guided percutaneous interventions are safer alternatives to conventional orthopedic and trauma surgeries. To advance surgical tools in complex bony structures during these procedures with confidence, a large number of images is acquired. While image-guidance is the de facto standard to guarantee acceptable outcome, when these images are presented on monitors far from the surgical site the information content cannot be associated easily with the 3D patient anatomy. Methods: In this article, we propose a collaborative augmented reality (AR) surgical ecosystem to jointly co-localize the C-arm X-ray and surgeon viewer. The technical contributions of this work include (1) joint calibration of a visual tracker on a C-arm scanner and its X-ray source via a hand-eye calibration strategy, and (2) inside-out co-localization of human and X-ray observers in shared tracking and augmentation environments using vision-based simultaneous localization and mapping. Results: We present a thorough evaluation of the hand-eye calibration procedure. Results suggest convergence when using 50 pose pairs or more. The mean translation and rotation errors at convergence are 5.7 mm and 0.26∘, respectively. Further, user-in-the-loop studies were conducted to estimate the end-to-end target augmentation error. The mean distance between landmarks in real and virtual environment was 10.8 mm. Conclusions: The proposed AR solution provides a shared augmented experience between the human and X-ray viewer. The collaborative surgical AR system has the potential to simplify hand-eye coordination for surgeons or intuitively inform C-arm technologists for prospective X-ray view-point planning.

AB - Purpose: Image-guided percutaneous interventions are safer alternatives to conventional orthopedic and trauma surgeries. To advance surgical tools in complex bony structures during these procedures with confidence, a large number of images is acquired. While image-guidance is the de facto standard to guarantee acceptable outcome, when these images are presented on monitors far from the surgical site the information content cannot be associated easily with the 3D patient anatomy. Methods: In this article, we propose a collaborative augmented reality (AR) surgical ecosystem to jointly co-localize the C-arm X-ray and surgeon viewer. The technical contributions of this work include (1) joint calibration of a visual tracker on a C-arm scanner and its X-ray source via a hand-eye calibration strategy, and (2) inside-out co-localization of human and X-ray observers in shared tracking and augmentation environments using vision-based simultaneous localization and mapping. Results: We present a thorough evaluation of the hand-eye calibration procedure. Results suggest convergence when using 50 pose pairs or more. The mean translation and rotation errors at convergence are 5.7 mm and 0.26∘, respectively. Further, user-in-the-loop studies were conducted to estimate the end-to-end target augmentation error. The mean distance between landmarks in real and virtual environment was 10.8 mm. Conclusions: The proposed AR solution provides a shared augmented experience between the human and X-ray viewer. The collaborative surgical AR system has the potential to simplify hand-eye coordination for surgeons or intuitively inform C-arm technologists for prospective X-ray view-point planning.

KW - Augmented reality

KW - C-arm

KW - Calibration

KW - Surgery

KW - X-ray

UR - http://www.scopus.com/inward/record.url?scp=85069870079&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=85069870079&partnerID=8YFLogxK

U2 - 10.1007/s11548-019-02035-8

DO - 10.1007/s11548-019-02035-8

M3 - Article

C2 - 31350704

AN - SCOPUS:85069870079

JO - International Journal of Computer Assisted Radiology and Surgery

JF - International Journal of Computer Assisted Radiology and Surgery

SN - 1861-6410

ER -