Unified detection and tracking in retinal microsurgery.

Raphael Sznitman, Anasuya Basu, Rogerio Richa, James Handa, Peter Gehlbach, Russell H Taylor, Bruno Jedynak, Gregory Hager

Research output: Chapter in Book/Report/Conference proceeding › Chapter

Abstract

Traditionally, tool tracking involves two subtasks: (i) detecting the tool in the initial image in which it appears, and (ii) predicting and refining the configuration of the detected tool in subsequent images. With retinal microsurgery in mind, we propose a unified tool detection and tracking framework, removing the need for two separate systems. The basis of our approach is to treat both detection and tracking as a sequential entropy minimization problem, where the goal is to determine the parameters describing a surgical tool in each frame. The resulting framework is capable of both detecting and tracking in situations where the tool enters and leaves the field of view regularly. We demonstrate the benefits of this method in the context of retinal tool tracking. Through extensive experimentation on a phantom eye, we show that this method provides efficient and robust tool tracking and detection.
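To make the idea of treating detection and tracking as one sequential entropy-minimization problem concrete, here is a minimal, hedged 1-D sketch: a posterior is maintained over a discretized tool parameter (here, just a tip position), and at each step the binary image test whose answer minimizes the expected posterior entropy is applied, with a Bayes update. This is an illustration of the general technique, not the authors' published algorithm (which estimates a full tool configuration from real image measurements); all names and parameters (`localize`, `eps`, the threshold tests) are hypothetical.

```python
import math

def entropy(p):
    # Shannon entropy (bits) of a discrete distribution.
    return -sum(q * math.log2(q) for q in p if q > 0.0)

def bayes_update(p, t, yes, eps):
    # Posterior after the noisy binary test "true position < t?"
    # answered `yes`, under an assumed error rate eps (0 < eps < 0.5).
    lik = [(1 - eps) if ((x < t) == yes) else eps for x in range(len(p))]
    post = [l * q for l, q in zip(lik, p)]
    z = sum(post)
    return [q / z for q in post]

def best_test(p, eps):
    # Choose the threshold whose answer minimizes expected posterior entropy.
    best_t, best_h = 1, float("inf")
    for t in range(1, len(p)):
        p_yes = sum(((1 - eps) if x < t else eps) * q for x, q in enumerate(p))
        h = (p_yes * entropy(bayes_update(p, t, True, eps))
             + (1 - p_yes) * entropy(bayes_update(p, t, False, eps)))
        if h < best_h:
            best_t, best_h = t, h
    return best_t

def localize(true_x, n=16, eps=0.1, steps=30):
    # Start from a uniform prior and repeatedly apply the most informative
    # test; answers are simulated noiselessly here, while the update model
    # still budgets for an error rate of eps.
    p = [1.0 / n] * n
    for _ in range(steps):
        t = best_test(p, eps)
        p = bayes_update(p, t, true_x < t, eps)
    return max(range(n), key=lambda x: p[x])  # MAP estimate
```

Because each test is chosen to maximally reduce uncertainty about the parameter, the same loop covers "detection" (posterior spread over the whole image) and "tracking" (posterior concentrated near the previous estimate) without a hand-off between two systems, which is the unification the abstract describes.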

Original language: English (US)
Title of host publication: Medical image computing and computer-assisted intervention : MICCAI ... International Conference on Medical Image Computing and Computer-Assisted Intervention
Pages: 1-8
Number of pages: 8
Volume: 14
Edition: Pt 1
State: Published - 2011
Externally published: Yes

Fingerprint

  • Microsurgery
  • Entropy

ASJC Scopus subject areas

  • Medicine (all)

Cite this

Sznitman, R., Basu, A., Richa, R., Handa, J., Gehlbach, P., Taylor, R. H., ... Hager, G. (2011). Unified detection and tracking in retinal microsurgery. In Medical image computing and computer-assisted intervention : MICCAI ... International Conference on Medical Image Computing and Computer-Assisted Intervention (Pt 1 ed., Vol. 14, pp. 1-8).


PubMed ID: 22003593
Scopus ID: 82255183495
Scopus record: http://www.scopus.com/inward/record.url?scp=82255183495&partnerID=8YFLogxK
Scopus cited-by: http://www.scopus.com/inward/citedby.url?scp=82255183495&partnerID=8YFLogxK