TY - JOUR
T1 - Development and Pre-Clinical Analysis of Spatiotemporal-Aware Augmented Reality in Orthopedic Interventions
AU - Fotouhi, Javad
AU - Mehrfard, Arian
AU - Song, Tianyu
AU - Johnson, Alex
AU - Osgood, Greg
AU - Unberath, Mathias
AU - Armand, Mehran
AU - Navab, Nassir
N1 - Funding Information:
Manuscript received July 8, 2020; revised September 7, 2020; accepted November 1, 2020. Date of publication November 9, 2020; date of current version February 2, 2021. This work was supported in part by NIH under Award R01EB023939 and Award R01EB016703 and in part by Johns Hopkins University. (Javad Fotouhi, Arian Mehrfard, and Tianyu Song are co-first authors.) (Corresponding author: Javad Fotouhi.) Javad Fotouhi, Tianyu Song, and Mathias Unberath are with the Laboratory for Computational Sensing and Robotics, Johns Hopkins University, Baltimore, MD 21218 USA (e-mail: javad.fotouhi@jhu.edu; tsong11@jhu.edu; unberath@jhu.edu).
Publisher Copyright:
© 1982-2012 IEEE.
PY - 2021/2
Y1 - 2021/2
N2 - Suboptimal interaction with patient data and the challenge of mastering 3D anatomy from ill-posed 2D interventional images are central concerns in image-guided therapies. Augmented reality (AR) has been introduced into operating rooms over the last decade; however, in image-guided interventions it has often been considered merely a visualization device layered onto traditional workflows. As a consequence, the technology has not yet gained the maturity it requires to redefine procedures, user interfaces, and interactions. The main contribution of this paper is to show how exemplary workflows can be redefined by taking full advantage of head-mounted displays that remain fully co-registered with the imaging system at all times. The system's awareness of the geometric and physical characteristics of X-ray imaging enables the exploration of different human-machine interfaces. Our system achieved an error of 4.76 ± 2.91 mm for K-wire placement in a fracture management procedure, and yielded errors of 1.57 ± 1.16° and 1.46 ± 1.00° in the abduction and anteversion angles, respectively, for total hip arthroplasty (THA). We compared these results with outcomes from baseline standard operative and non-immersive AR procedures, which yielded errors of [4.61 mm, 4.76°, 4.77°] and [5.13 mm, 1.78°, 1.43°], respectively, for wire placement and for abduction and anteversion during THA. We hope that our holistic approach toward improving the interface of surgery not only augments the surgeon's capabilities but also enhances the surgical team's ability to carry out an effective intervention with reduced complications, and provides novel approaches to documenting procedures for training purposes.
AB - Suboptimal interaction with patient data and the challenge of mastering 3D anatomy from ill-posed 2D interventional images are central concerns in image-guided therapies. Augmented reality (AR) has been introduced into operating rooms over the last decade; however, in image-guided interventions it has often been considered merely a visualization device layered onto traditional workflows. As a consequence, the technology has not yet gained the maturity it requires to redefine procedures, user interfaces, and interactions. The main contribution of this paper is to show how exemplary workflows can be redefined by taking full advantage of head-mounted displays that remain fully co-registered with the imaging system at all times. The system's awareness of the geometric and physical characteristics of X-ray imaging enables the exploration of different human-machine interfaces. Our system achieved an error of 4.76 ± 2.91 mm for K-wire placement in a fracture management procedure, and yielded errors of 1.57 ± 1.16° and 1.46 ± 1.00° in the abduction and anteversion angles, respectively, for total hip arthroplasty (THA). We compared these results with outcomes from baseline standard operative and non-immersive AR procedures, which yielded errors of [4.61 mm, 4.76°, 4.77°] and [5.13 mm, 1.78°, 1.43°], respectively, for wire placement and for abduction and anteversion during THA. We hope that our holistic approach toward improving the interface of surgery not only augments the surgeon's capabilities but also enhances the surgical team's ability to carry out an effective intervention with reduced complications, and provides novel approaches to documenting procedures for training purposes.
KW - Augmented reality
KW - X-ray
KW - frustum
KW - interaction
KW - surgery
KW - visualization
UR - http://www.scopus.com/inward/record.url?scp=85096823556&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85096823556&partnerID=8YFLogxK
U2 - 10.1109/TMI.2020.3037013
DO - 10.1109/TMI.2020.3037013
M3 - Article
C2 - 33166252
AN - SCOPUS:85096823556
SN - 0278-0062
VL - 40
SP - 765
EP - 778
JO - IEEE Transactions on Medical Imaging
JF - IEEE Transactions on Medical Imaging
IS - 2
M1 - 9252943
ER -