C-arm pose estimation in prostate brachytherapy by registration to ultrasound

Pascal Fallavollita, Clif Burdette, Danny Song, Purang Abolmaesumi, Gabor Fichtinger

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

In prostate brachytherapy, transrectal ultrasound (TRUS) is used to visualize the anatomy, while the implanted seeds can be seen in C-arm fluoroscopy. Intra-operative dosimetry optimization requires reconstruction of the implanted seeds from multiple C-arm fluoroscopy images, which in turn requires estimation of the C-arm poses. We estimate the pose of the C-arm by a two-stage registration of the 2D fluoroscopy images to a 3D TRUS volume. Because single-view 2D/3D registration tends to yield depth error, we first estimate the depth from multiple 2D fluoro images and feed this estimate into a single-view 2D/3D registration. A commercial phantom was implanted with seeds and imaged with TRUS and CT, and ground-truth registration between the two was established by radiographic fiducials. Synthetic ground-truth fluoro images were created from the CT volume and registered to the 3D TRUS. The average rotation and translation errors were 1.0° (STD = 2.3°) and 0.7 mm (STD = 1.9 mm), respectively. In data from a human patient, the average rotation and lateral translation errors were 0.6° (STD = 3.0°) and 1.5 mm (STD = 2.8 mm), respectively, relative to ground truth established by a radiographic fiducial. Fully automated image-based C-arm pose estimation was demonstrated in prostate brachytherapy. Accuracy and robustness were excellent on the phantom, and early results on human patient data appear clinically adequate.
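The two-stage idea in the abstract (estimate depth first, then resolve the remaining pose by matching projections of the implanted seeds) can be illustrated with a deliberately simplified sketch. Everything below is hypothetical: the paper registers fluoroscopy images to a TRUS volume with image-based similarity measures, whereas this toy uses known 3D seed coordinates, a pinhole C-arm model with one rotation axis, and a coarse grid search over depth and angle.

```python
# Toy sketch of two-stage C-arm pose recovery (simplified, hypothetical
# geometry): seeds known in 3D are projected with a pinhole model, and the
# pose is the (angle, depth) pair whose projection best matches the 2D view.
import numpy as np

def project(points, angle_deg, depth, focal=1000.0):
    """Rotate about the y axis, translate along the beam (z) axis by
    `depth` (source-to-object distance, mm), then pinhole-project to 2D."""
    a = np.deg2rad(angle_deg)
    R = np.array([[np.cos(a), 0.0, np.sin(a)],
                  [0.0,       1.0, 0.0],
                  [-np.sin(a), 0.0, np.cos(a)]])
    p = points @ R.T
    z = p[:, 2] + depth
    return focal * p[:, :2] / z[:, None]

def estimate_pose(seeds3d, fluoro2d, depth_grid, angle_grid):
    """Stage 1: search depth (which single-view registration resolves
    poorly). Stage 2: search the remaining rotation at each depth.
    Returns the (angle, depth) minimizing mean 2D seed-match error."""
    best = (None, None, np.inf)
    for d in depth_grid:
        for a in angle_grid:
            err = np.mean(np.linalg.norm(
                project(seeds3d, a, d) - fluoro2d, axis=1))
            if err < best[2]:
                best = (a, d, err)
    return best

rng = np.random.default_rng(0)
seeds = rng.uniform(-20, 20, size=(60, 3))        # mock implanted seeds (mm)
obs = project(seeds, angle_deg=12.0, depth=600.0)  # synthetic fluoro view
angle, depth, err = estimate_pose(
    seeds, obs,
    depth_grid=np.arange(500, 701, 10),
    angle_grid=np.arange(0.0, 25.0, 0.5))
print(angle, depth, err)  # recovers 12.0 deg, 600.0 mm on this noise-free toy
```

On noise-free synthetic data the grid search recovers the true pose exactly; the real method instead optimizes an image similarity metric between the fluoro images and the TRUS volume, which is what makes it fully automatic.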

Original language: English (US)
Title of host publication: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
Pages: 311-318
Number of pages: 8
Volume: 6363 LNCS
Edition: PART 3
DOI: 10.1007/978-3-642-15711-0_39
State: Published - 2010
Externally published: Yes
Event: 13th International Conference on Medical Image Computing and Computer-Assisted Intervention, MICCAI 2010 - Beijing, China
Duration: Sep 20, 2010 - Sep 24, 2010

Publication series

Name: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
Number: PART 3
Volume: 6363 LNCS
ISSN (Print): 0302-9743
ISSN (Electronic): 1611-3349

ASJC Scopus subject areas

  • Computer Science (all)
  • Theoretical Computer Science

Cite this

Fallavollita, P., Burdette, C., Song, D., Abolmaesumi, P., & Fichtinger, G. (2010). C-arm pose estimation in prostate brachytherapy by registration to ultrasound. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (PART 3 ed., Vol. 6363 LNCS, pp. 311-318). (Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics); Vol. 6363 LNCS, No. PART 3). https://doi.org/10.1007/978-3-642-15711-0_39

@inproceedings{1054ec38f4194f5e8eade5f9457a5088,
title = "C-arm pose estimation in prostate brachytherapy by registration to ultrasound",
author = "Pascal Fallavollita and Clif Burdette and Danny Song and Purang Abolmaesumi and Gabor Fichtinger",
year = "2010",
doi = "10.1007/978-3-642-15711-0_39",
language = "English (US)",
isbn = "3642157106",
volume = "6363 LNCS",
series = "Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)",
number = "PART 3",
pages = "311--318",
booktitle = "Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)",
edition = "PART 3",

}
