TY - JOUR
T1 - Mixed reality interfaces for achieving desired views with robotic X-ray systems
AU - Killeen, Benjamin D.
AU - Winter, Jonas
AU - Gu, Wenhao
AU - Martin-Gomez, Alejandro
AU - Taylor, Russell H.
AU - Osgood, Greg
AU - Unberath, Mathias
N1 - Publisher Copyright:
© 2022 Informa UK Limited, trading as Taylor & Francis Group.
PY - 2022
Y1 - 2022
AB - Robotic X-ray C-arm imaging systems can precisely achieve any position and orientation relative to the patient. However, informing the system which pose corresponds to a desired view is challenging. Currently, these systems are operated by the surgeon using joysticks, but this interaction is not necessarily effective because users may be unable to efficiently actuate more than a single axis of the system simultaneously. Moreover, novel robotic imaging systems, such as the Brainlab Loop-X, allow for independent source and detector movements, adding complexity. To address this challenge, we consider complementary interfaces that allow the surgeon to command robotic X-ray systems effectively. Specifically, we consider three interaction paradigms: (1) the use of a pointer to specify the principal ray of the desired view, (2) the same pointer, combined with a mixed reality environment that synchronously renders digitally reconstructed radiographs from the tool’s pose, and (3) the same mixed reality environment but with a virtual X-ray source instead of the pointer. Human-in-the-loop evaluation with an attending trauma surgeon indicates that mixed reality interfaces for robotic X-ray system control are promising and may substantially reduce the number of images acquired solely during ‘fluoro hunting’ for the desired view or standard plane.
KW - C-arm positioning
KW - mixed reality
KW - X-ray
UR - http://www.scopus.com/inward/record.url?scp=85144234683&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85144234683&partnerID=8YFLogxK
U2 - 10.1080/21681163.2022.2154272
DO - 10.1080/21681163.2022.2154272
M3 - Article
AN - SCOPUS:85144234683
SN - 2168-1163
JO - Computer Methods in Biomechanics and Biomedical Engineering: Imaging and Visualization
JF - Computer Methods in Biomechanics and Biomedical Engineering: Imaging and Visualization
ER -