Interactive Flying Frustums (IFFs): spatially aware surgical data visualization

Javad Fotouhi, Mathias Unberath, Tianyu Song, Wenhao Gu, Alex Johnson, Greg Osgood, Mehran Armand, Nassir Navab

Research output: Contribution to journal › Article

Abstract

Purpose: As the trend toward minimally invasive and percutaneous interventions continues, the importance of appropriate surgical data visualization becomes more evident. Ineffective interventional data display techniques yield poor ergonomics that hinder hand–eye coordination and promote frustration, which can compromise on-task performance up to adverse outcomes. A very common example of ineffective visualization is monitors attached to the base of mobile C-arm X-ray systems. Methods: We present a spatially and imaging-geometry-aware paradigm for the visualization of fluoroscopic images using Interactive Flying Frustums (IFFs) in a mixed reality environment. We exploit the fact that the C-arm imaging geometry can be modeled as a pinhole camera, giving rise to an 11-degree-of-freedom view frustum on which the X-ray image can be translated while remaining valid. Visualizing IFFs to the surgeon in an augmented reality environment intuitively unites the virtual 2D X-ray image plane and the real 3D patient anatomy. To achieve this visualization, the surgeon and the C-arm are tracked relative to the same coordinate frame using image-based localization and mapping, with the augmented reality environment delivered to the surgeon via a state-of-the-art optical see-through head-mounted display. Results: The root-mean-square error of C-arm source tracking after hand–eye calibration was 0.43° ± 0.34° in rotation and 4.6 ± 2.7 mm in translation. Finally, we demonstrated the application of spatially aware data visualization for internal fixation of pelvic fractures and percutaneous vertebroplasty. Conclusion: Our spatially aware approach to transmission image visualization effectively unites patient anatomy with X-ray images by enabling spatial image manipulation that abides by image formation.
Our proof-of-principle findings indicate potential applications for surgical tasks that rely mostly on orientational information, such as placing the acetabular component in total hip arthroplasty, making us confident that the proposed augmented reality concept can pave the way for improving surgical performance and visuo-motor coordination in fluoroscopy-guided surgery.
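The core geometric idea of the abstract, that a pinhole model lets the X-ray image slide along the view frustum while every pixel stays on its original viewing ray, can be sketched in a few lines. This is an illustrative reconstruction, not the authors' implementation; the intrinsic parameters (`fx`, `fy`, `cx`, `cy`), detector size, and depth values below are hypothetical placeholders.

```python
# Sketch: translating an X-ray image along a C-arm's view frustum.
# The C-arm is modeled as a pinhole camera: a pixel (u, v) placed at
# depth d lies at d * K^-1 @ [u, v, 1] in the source (camera) frame,
# so the image quad remains geometrically valid at any depth between
# the X-ray source and the detector.

def backproject(u, v, d, fx, fy, cx, cy):
    """Map pixel (u, v) to the 3D point at depth d along its viewing ray."""
    return ((u - cx) / fx * d, (v - cy) / fy * d, d)

def frustum_quad(width, height, d, fx, fy, cx, cy):
    """Corners of the image plane when slid to depth d along the frustum."""
    corners = [(0, 0), (width, 0), (width, height), (0, height)]
    return [backproject(u, v, d, fx, fy, cx, cy) for u, v in corners]

# Hypothetical C-arm parameters: a 1536 x 1536 detector with the
# principal point at the image center.
quad_near = frustum_quad(1536, 1536, 500.0, 5000.0, 5000.0, 768.0, 768.0)
quad_far = frustum_quad(1536, 1536, 1000.0, 5000.0, 5000.0, 768.0, 768.0)
# Halving the depth halves the quad's extent: the image shrinks toward
# the source while each pixel stays on its original X-ray path.
```

The scaling behavior is exactly what makes the IFF interaction intuitive: dragging the virtual image toward the patient enlarges it in proportion to depth, keeping it consistent with the image-formation geometry.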


Keywords

  • Augmented reality
  • Fluoroscopy
  • Frustum
  • Surgical data visualization

ASJC Scopus subject areas

  • Surgery
  • Biomedical Engineering
  • Radiology, Nuclear Medicine and Imaging
  • Computer Vision and Pattern Recognition
  • Health Informatics
  • Computer Science Applications
  • Computer Graphics and Computer-Aided Design

Cite this

Interactive Flying Frustums (IFFs): spatially aware surgical data visualization. / Fotouhi, Javad; Unberath, Mathias; Song, Tianyu; Gu, Wenhao; Johnson, Alex; Osgood, Greg; Armand, Mehran; Navab, Nassir.

In: International Journal of Computer Assisted Radiology and Surgery, 01.01.2019.


@article{5addb0ce19944dd0a0f5220cdad5d7c7,
title = "Interactive Flying Frustums (IFFs): spatially aware surgical data visualization",
keywords = "Augmented reality, Fluoroscopy, Frustum, Surgical data visualization",
author = "Javad Fotouhi and Mathias Unberath and Tianyu Song and Wenhao Gu and Alex Johnson and Greg Osgood and Mehran Armand and Nassir Navab",
year = "2019",
month = "1",
day = "1",
doi = "10.1007/s11548-019-01943-z",
language = "English (US)",
journal = "International Journal of Computer Assisted Radiology and Surgery",
issn = "1861-6410",
publisher = "Springer Verlag",

}

TY - JOUR

T1 - Interactive Flying Frustums (IFFs)

T2 - spatially aware surgical data visualization

AU - Fotouhi, Javad

AU - Unberath, Mathias

AU - Song, Tianyu

AU - Gu, Wenhao

AU - Johnson, Alex

AU - Osgood, Greg

AU - Armand, Mehran

AU - Navab, Nassir

PY - 2019/1/1

Y1 - 2019/1/1



KW - Augmented reality

KW - Fluoroscopy

KW - Frustum

KW - Surgical data visualization

UR - http://www.scopus.com/inward/record.url?scp=85063001414&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=85063001414&partnerID=8YFLogxK

U2 - 10.1007/s11548-019-01943-z

DO - 10.1007/s11548-019-01943-z

M3 - Article

C2 - 30863981

AN - SCOPUS:85063001414

JO - International Journal of Computer Assisted Radiology and Surgery

JF - International Journal of Computer Assisted Radiology and Surgery

SN - 1861-6410

ER -