TY - GEN
T1 - Toward real-time 3D ultrasound registration-based visual servoing for interventional navigation
AU - Zettinig, Oliver
AU - Fuerst, Bernhard
AU - Kojcev, Risto
AU - Esposito, Marco
AU - Salehi, Mehrdad
AU - Wein, Wolfgang
AU - Rackerseder, Julia
AU - Sinibaldi, Edoardo
AU - Frisch, Benjamin
AU - Navab, Nassir
N1 - Publisher Copyright:
© 2016 IEEE.
PY - 2016/6/8
Y1 - 2016/6/8
N2 - While intraoperative imaging is commonly used to guide surgical interventions, automatic robotic support for image-guided navigation has not yet been established in clinical routine. In this paper, we propose a novel visual servoing framework that combines, for the first time, full image-based 3D ultrasound registration with a real-time servo-control scheme. Paired with multi-modal fusion to a pre-interventional plan such as an annotated needle insertion path, it thus allows tracking a target anatomy, continuously updating the plan as the target moves, and keeping a needle guide aligned for accurate manual insertion. The presented system includes a motorized 3D ultrasound transducer mounted on a force-controlled robot and a GPU-based image processing toolkit. The tracking accuracy of our framework is validated on a geometric agar/gelatin phantom using a second robot, achieving average positioning errors of 0.42–0.44 mm. With total compounding and registration runtimes of around 550 ms, real-time performance comes within reach. We also present initial results on a spine phantom, demonstrating the feasibility of our system for lumbar spine injections.
AB - While intraoperative imaging is commonly used to guide surgical interventions, automatic robotic support for image-guided navigation has not yet been established in clinical routine. In this paper, we propose a novel visual servoing framework that combines, for the first time, full image-based 3D ultrasound registration with a real-time servo-control scheme. Paired with multi-modal fusion to a pre-interventional plan such as an annotated needle insertion path, it thus allows tracking a target anatomy, continuously updating the plan as the target moves, and keeping a needle guide aligned for accurate manual insertion. The presented system includes a motorized 3D ultrasound transducer mounted on a force-controlled robot and a GPU-based image processing toolkit. The tracking accuracy of our framework is validated on a geometric agar/gelatin phantom using a second robot, achieving average positioning errors of 0.42–0.44 mm. With total compounding and registration runtimes of around 550 ms, real-time performance comes within reach. We also present initial results on a spine phantom, demonstrating the feasibility of our system for lumbar spine injections.
UR - http://www.scopus.com/inward/record.url?scp=84977598967&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=84977598967&partnerID=8YFLogxK
U2 - 10.1109/ICRA.2016.7487226
DO - 10.1109/ICRA.2016.7487226
M3 - Conference contribution
AN - SCOPUS:84977598967
T3 - Proceedings - IEEE International Conference on Robotics and Automation
SP - 945
EP - 950
BT - 2016 IEEE International Conference on Robotics and Automation, ICRA 2016
PB - Institute of Electrical and Electronics Engineers Inc.
T2 - 2016 IEEE International Conference on Robotics and Automation, ICRA 2016
Y2 - 16 May 2016 through 21 May 2016
ER -