Data processing methods for a high throughput brain imaging PET research center

Judson P. Jones, Arman Rahmim, Merence Sibomana, Andrew Crabb, Ziad Burbar, Charles B. Cavanaugh, Christian Michel, Dean Foster Wong

Research output: Chapter in Book/Report/Conference proceeding (Conference contribution)

Abstract

We describe a computer system designed to meet the data processing needs of a high-volume brain PET research center based on the High Resolution Research Tomograph (HRRT). List-mode data are collected by an acquisition computer and stored on a high-speed disk. A workflow management program transfers the data over a gigabit network, rebins events into sinograms, and calculates correction factors. Reconstruction jobs are performed on a 64-processor cluster. We developed methods for dynamically allocating subclusters from the pool of available nodes and for reconstructing multiple images on multiple subclusters simultaneously. We also studied the overall workflow: in our initial design, scatter and randoms calculation unexpectedly became a bottleneck, so we adjusted the workflow to perform scatter estimation initially at low resolution and expand it to high resolution later.
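
The dynamic subcluster allocation described above can be pictured as carving groups of nodes out of a shared pool, running one reconstruction per group, and returning the nodes when the job finishes so that several reconstructions proceed concurrently. The Python sketch below illustrates that idea only; the names (NodePool, ReconJob, run_reconstruction), the pool size, and the job parameters are hypothetical and are not taken from the authors' implementation or the HRRT software.

```python
# Minimal sketch of dynamic subcluster allocation for concurrent reconstructions.
# All identifiers here are illustrative placeholders, not the paper's actual code.
import threading
import queue
import time
from dataclasses import dataclass

@dataclass
class ReconJob:
    study_id: str
    frames: int
    nodes_wanted: int   # requested subcluster size

class NodePool:
    """Pool of compute nodes from which subclusters are carved on demand."""
    def __init__(self, node_names):
        self._free = set(node_names)
        self._lock = threading.Condition()

    def allocate(self, n):
        """Block until n nodes are free, then hand them out as a subcluster."""
        with self._lock:
            while len(self._free) < n:
                self._lock.wait()
            return [self._free.pop() for _ in range(n)]

    def release(self, subcluster):
        """Return a subcluster's nodes to the pool and wake waiting jobs."""
        with self._lock:
            self._free.update(subcluster)
            self._lock.notify_all()

def run_reconstruction(job, subcluster):
    # Placeholder for launching the parallel reconstruction on the allocated
    # subcluster (e.g. via MPI or remote shells); here we only simulate work.
    print(f"{job.study_id}: reconstructing {job.frames} frames on {sorted(subcluster)}")
    time.sleep(0.1)

def worker(pool, jobs):
    while True:
        try:
            job = jobs.get_nowait()
        except queue.Empty:
            return
        subcluster = pool.allocate(job.nodes_wanted)
        try:
            run_reconstruction(job, subcluster)
        finally:
            pool.release(subcluster)   # free the nodes for the next job

if __name__ == "__main__":
    pool = NodePool([f"node{i:02d}" for i in range(16)])   # 16-node pool (illustrative)
    jobs = queue.Queue()
    for k, size in enumerate([8, 4, 4, 8, 16]):
        jobs.put(ReconJob(study_id=f"study{k}", frames=30, nodes_wanted=size))
    threads = [threading.Thread(target=worker, args=(pool, jobs)) for _ in range(3)]
    for t in threads: t.start()
    for t in threads: t.join()
```

With this kind of scheme, small reconstructions can run side by side on separate subclusters while a large job simply waits until enough nodes are returned to the pool, which is the behavior the abstract attributes to the workflow manager.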

Original language: English (US)
Title of host publication: IEEE Nuclear Science Symposium Conference Record
Pages: 2224-2228
Number of pages: 5
Volume: 4
DOI: 10.1109/NSSMIC.2006.354356
ISBN: 1424405610, 9781424405619
State: Published - 2007
Event: 2006 IEEE Nuclear Science Symposium, Medical Imaging Conference and 15th International Workshop on Room-Temperature Semiconductor X- and Gamma-Ray Detectors, Special Focus Workshops, NSS/MIC/RTSD - San Diego, CA, United States
Duration: Oct 29, 2006 - Nov 4, 2006


Fingerprint

  • Brain
  • Throughput
  • Imaging techniques
  • Computer systems

ASJC Scopus subject areas

  • Computer Vision and Pattern Recognition
  • Industrial and Manufacturing Engineering

Cite this

Jones, J. P., Rahmim, A., Sibomana, M., Crabb, A., Burbar, Z., Cavanaugh, C. B., ... Wong, D. F. (2007). Data processing methods for a high throughput brain imaging PET research center. In IEEE Nuclear Science Symposium Conference Record (Vol. 4, pp. 2224-2228). [4179470] https://doi.org/10.1109/NSSMIC.2006.354356
