2D ultrasound imaging based intra-fraction respiratory motion tracking for abdominal radiation therapy using machine learning

Pu Huang, Lin Su, Shuyang Chen, Kunlin Cao, Qi Song, Peter Kazanzides, Iulian Iordachita, Muyinatu A. Lediju Bell, John W. Wong, Dengwang Li, Kai Ding

Research output: Contribution to journal › Article

Abstract

We have previously developed a robotic ultrasound imaging system for motion monitoring in abdominal radiation therapy. Owing to the slow speed of ultrasound image processing, our previous system could only track abdominal motion under breath-hold. To overcome this limitation, we propose a novel 2D image-processing method for tracking intra-fraction respiratory motion. Fifty-seven different anatomical features acquired from 27 sets of 2D ultrasound sequences were used in this study. Three of the sequences were acquired with the robotic ultrasound system from three healthy volunteers; the remaining datasets were provided by the 2015 MICCAI Challenge on Liver Ultrasound Tracking. All datasets were preprocessed to extract the feature points, and a patient-specific motion pattern was extracted by principal component analysis (PCA) and slow feature analysis (SFA). Tracking then finds the most similar training frame (the indexed frame) via a k-dimensional-tree-based nearest-neighbor search to estimate the tracked object's location. A template image is updated dynamically from the indexed frame to perform fast template matching (TM) within a learned, smaller search region on the incoming frame. The mean tracking error between manually annotated landmarks and the location extracted from the indexed training frame is 1.80 ± 1.42 mm; adding the fast TM step within the small search region reduces the mean tracking error to 1.14 ± 1.16 mm. The tracking time per frame is 15 ms, well below the frame acquisition time. Furthermore, anatomical reproducibility was measured by analyzing the anatomical landmark's location relative to the probe: the position-controlled probe has better reproducibility and yields a smaller mean error across all three volunteer cases than the force-controlled probe (2.69 versus 11.20 mm in the superior-inferior direction and 1.19 versus 8.21 mm in the anterior-posterior direction). Our method significantly reduces the processing time for tracking respiratory motion, which can reduce delivery uncertainty.
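The tracking pipeline described in the abstract (PCA compression of training frames, a k-d tree nearest-neighbor lookup of the indexed frame, then template matching within a small search region around the indexed location) can be sketched roughly as follows. This is an illustrative reconstruction on synthetic data, not the authors' implementation: the array shapes, number of principal components, template size, and search radius are hypothetical placeholders, and the SFA step and learned search region are omitted for brevity.

```python
import numpy as np
from scipy.spatial import cKDTree

rng = np.random.default_rng(0)

# Synthetic stand-ins for the training data (hypothetical shapes):
# 200 training frames, each a 64x64 ultrasound patch, with one known
# landmark location (row, col) per frame.
train_frames = rng.random((200, 64, 64)).astype(np.float32)
train_landmarks = rng.uniform(16, 48, size=(200, 2))

# 1) PCA: project each frame onto the top principal components
#    (10 components is an arbitrary choice here).
X = train_frames.reshape(200, -1)
mean = X.mean(axis=0)
Xc = X - mean
_, _, Vt = np.linalg.svd(Xc, full_matrices=False)
components = Vt[:10]                      # (10, 4096) basis
train_feats = Xc @ components.T           # (200, 10) features

# 2) k-d tree over the low-dimensional training features.
tree = cKDTree(train_feats)

def track(frame, template_size=9, search_radius=4):
    """Estimate the landmark in `frame`: k-d tree lookup of the most
    similar training frame (the indexed frame), then normalized
    cross-correlation template matching in a small region around the
    indexed frame's landmark."""
    feat = (frame.reshape(-1) - mean) @ components.T
    _, idx = tree.query(feat)             # index of most similar frame
    cy, cx = np.round(train_landmarks[idx]).astype(int)

    # Dynamic template cut from the indexed training frame.
    h = template_size // 2
    template = train_frames[idx, cy - h:cy + h + 1, cx - h:cx + h + 1]

    # Exhaustive NCC matching within the small search region.
    best, best_pos = -np.inf, (cy, cx)
    for dy in range(-search_radius, search_radius + 1):
        for dx in range(-search_radius, search_radius + 1):
            y, x = cy + dy, cx + dx
            patch = frame[y - h:y + h + 1, x - h:x + h + 1]
            if patch.shape != template.shape:
                continue
            ncc = np.corrcoef(patch.ravel(), template.ravel())[0, 1]
            if ncc > best:
                best, best_pos = ncc, (y, x)
    return best_pos, idx

# Query with a slightly noisy copy of training frame 42: the lookup
# should return frame 42 and TM should land on its landmark.
query = train_frames[42] + 0.01 * rng.standard_normal((64, 64)).astype(np.float32)
pos, idx = track(query)
```

Compressing frames before the nearest-neighbor search is what makes the lookup cheap enough for real-time use; restricting TM to a small window around the indexed location plays the same role for the matching step.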

Original language: English (US)
Article number: 185006
Journal: Physics in Medicine and Biology
Volume: 64
Issue number: 18
DOIs: 10.1088/1361-6560/ab33db
State: Published - Sep 11 2019

Keywords

  • principal component analysis
  • slow feature analysis
  • template matching
  • tracking
  • ultrasound

ASJC Scopus subject areas

  • Radiological and Ultrasound Technology
  • Radiology, Nuclear Medicine and Imaging
