TY - GEN
T1 - C-arm pose estimation in prostate brachytherapy by registration to ultrasound
AU - Fallavollita, Pascal
AU - Burdette, Clif
AU - Song, Danny
AU - Abolmaesumi, Purang
AU - Fichtinger, Gabor
PY - 2010
Y1 - 2010
N2 - In prostate brachytherapy, transrectal ultrasound (TRUS) is used to visualize the anatomy, while implanted seeds can be seen in C-arm fluoroscopy. Intra-operative dosimetry optimization requires reconstruction of the implanted seeds from multiple C-arm fluoroscopy images, which in turn requires estimation of the C-arm poses. We estimate the pose of the C-arm by two-stage registration of the 2D fluoroscopy images to a 3D TRUS volume. As single-view 2D/3D registration tends to yield depth error, we first estimate the depth from multiple 2D fluoro images and input this to a single-view 2D/3D registration. A commercial phantom was implanted with seeds and imaged with TRUS and CT. Ground-truth registration was established between the two by radiographic fiducials. Synthetic ground-truth fluoro images were created from the CT volume and registered to the 3D TRUS. The average rotation and translation errors were 1.0° (STD=2.3°) and 0.7 mm (STD=1.9 mm), respectively. In data from a human patient, the average rotation and lateral translation errors were 0.6° (STD=3.0°) and 1.5 mm (STD=2.8 mm), respectively, relative to the ground-truth established by a radiographic fiducial. Fully automated image-based C-arm pose estimation was demonstrated in prostate brachytherapy. Accuracy and robustness were excellent on the phantom. Early results on human patient data appear clinically adequate.
AB - In prostate brachytherapy, transrectal ultrasound (TRUS) is used to visualize the anatomy, while implanted seeds can be seen in C-arm fluoroscopy. Intra-operative dosimetry optimization requires reconstruction of the implanted seeds from multiple C-arm fluoroscopy images, which in turn requires estimation of the C-arm poses. We estimate the pose of the C-arm by two-stage registration of the 2D fluoroscopy images to a 3D TRUS volume. As single-view 2D/3D registration tends to yield depth error, we first estimate the depth from multiple 2D fluoro images and input this to a single-view 2D/3D registration. A commercial phantom was implanted with seeds and imaged with TRUS and CT. Ground-truth registration was established between the two by radiographic fiducials. Synthetic ground-truth fluoro images were created from the CT volume and registered to the 3D TRUS. The average rotation and translation errors were 1.0° (STD=2.3°) and 0.7 mm (STD=1.9 mm), respectively. In data from a human patient, the average rotation and lateral translation errors were 0.6° (STD=3.0°) and 1.5 mm (STD=2.8 mm), respectively, relative to the ground-truth established by a radiographic fiducial. Fully automated image-based C-arm pose estimation was demonstrated in prostate brachytherapy. Accuracy and robustness were excellent on the phantom. Early results on human patient data appear clinically adequate.
UR - http://www.scopus.com/inward/record.url?scp=84883832720&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=84883832720&partnerID=8YFLogxK
U2 - 10.1007/978-3-642-15711-0_39
DO - 10.1007/978-3-642-15711-0_39
M3 - Conference contribution
C2 - 20879414
AN - SCOPUS:84883832720
SN - 3642157106
SN - 9783642157103
T3 - Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
SP - 311
EP - 318
BT - Medical Image Computing and Computer-Assisted Intervention, MICCAI 2010 - 13th International Conference, Proceedings
T2 - 13th International Conference on Medical Image Computing and Computer-Assisted Intervention, MICCAI 2010
Y2 - 20 September 2010 through 24 September 2010
ER -