TY - GEN
T1 - Data-driven detection and registration of spine surgery instrumentation in intraoperative images
AU - Doerr, S. A.
AU - Uneri, Ali
AU - Huang, Y.
AU - Jones, C. K.
AU - Zhang, X.
AU - Ketcha, M. D.
AU - Helm, P. A.
AU - Siewerdsen, J. H.
N1 - Funding Information:
This research was supported by NIH grant R01-EB-017226 and a research collaboration with Medtronic (Littleton, MA). The authors extend their thanks to Ned Lipes, Dr. Seunghoon Nam, Dr. Shuanghe Shi, and Dr. Andre Souza (Medtronic) for many helpful discussions and provision of the device design files used in this work. The system and algorithms shown in this work were for research purposes only and are not available for sale or commercial use.
Publisher Copyright:
© 2020 SPIE.
PY - 2020
Y1 - 2020
N2 - Purpose. Conventional model-based 3D-2D registration algorithms can be challenged by limited capture range, model validity, and stringent intraoperative runtime requirements. In this work, a deep convolutional neural network was used to provide robust initialization of a registration algorithm (known-component registration, KC-Reg) for 3D localization of spine surgery implants, combining the speed and global support of data-driven approaches with the previously demonstrated accuracy of model-based registration. Methods. The approach uses a Faster R-CNN architecture to detect and localize a broad variety and orientation of spinal pedicle screws in clinical images. Training data were generated using projections from 17 clinical cone-beam CT scans and a library of screw models to simulate implants. Network output was processed to provide screw count and 2D poses. The network was tested on two test datasets of 2,000 images, each depicting real anatomy and realistic spine surgery instrumentation - one dataset involving the same patient data as in the training set (but with different screws, poses, image noise, and affine transformations) and one dataset with five patients unseen in the training data. Device detection was quantified in terms of accuracy and specificity, and localization accuracy was evaluated in terms of intersection-over-union (IoU) and the distance between true and predicted bounding box coordinates. Results. The overall accuracy of pedicle screw detection was ∼86.6% (85.3% for the same-patient dataset and 87.8% for the many-patient dataset), suggesting that the screw detection network performed reasonably well irrespective of disparate, complex anatomical backgrounds. The precision of screw detection was ∼92.6% (95.0% and 90.2% for the same-patient and many-patient datasets, respectively). The accuracy of screw localization was within 1.5 mm (median difference of bounding box coordinates), and median IoU exceeded 0.85. For purposes of initializing a 3D-2D registration algorithm, this accuracy is well within the typical capture range of KC-Reg.1 Conclusions. Initial evaluation of network performance indicates sufficient accuracy to integrate with algorithms for implant registration, guidance, and verification in spine surgery. Such capability is of potential use in surgical navigation, robotic assistance, and data-intensive analysis of implant placement in large retrospective datasets. Future work includes correspondence of multiple views, 3D localization, screw classification, and expansion of the training dataset to a broader variety of anatomical sites, numbers of screws, and types of implants.
KW - Deep learning
KW - Image registration
KW - Image-guided surgery
KW - Intraoperative imaging
KW - Spine surgery
UR - http://www.scopus.com/inward/record.url?scp=85085246739&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85085246739&partnerID=8YFLogxK
U2 - 10.1117/12.2550052
DO - 10.1117/12.2550052
M3 - Conference contribution
AN - SCOPUS:85085246739
T3 - Proceedings of SPIE - The International Society for Optical Engineering
BT - Medical Imaging 2020
A2 - Fei, Baowei
A2 - Linte, Cristian A.
PB - SPIE
T2 - Medical Imaging 2020: Image-Guided Procedures, Robotic Interventions, and Modeling
Y2 - 16 February 2020 through 19 February 2020
ER -