TY - GEN
T1 - Unsupervised surgical data alignment with application to automatic activity annotation
AU - Gao, Yixin
AU - Vedula, S. Swaroop
AU - Lee, Gyusung
AU - Lee, Mija R.
AU - Khudanpur, Sanjeev
AU - Hager, Gregory D.
N1 - Publisher Copyright:
© 2016 IEEE.
PY - 2016/6/8
Y1 - 2016/6/8
N2 - Robotic surgery and other minimally-invasive surgical techniques are an integral part of patient care, and readily yield large amounts of data. Surgical tool motion (kinematic data) contains information that is useful for assessment and education. Typically, assessment and education tools that rely upon the kinematic data require substantial manual processing such as activity annotations. The goal of this paper was to develop an automated method to align surgical recordings and assign activity annotations. We developed an approach based on unsupervised alignment to efficiently annotate kinematic data for its constituent activity segments. Our method includes extracting non-linear features from the kinematic data using a stacked de-noising autoencoder, and using modified dynamic time warping to align the kinematic data from different trials of the study task. We combined alignment between a test trial and one or a small set of template trials (with prior manual annotations) with voting based on kernel density estimation to transfer labels from the template to the test trial. Our experiments evaluating this method on two datasets captured in the training laboratory demonstrate an accuracy of 72% to 94% for annotating activity segments within a surgical training task. Our method is robust to data captured from several surgeons, and to deviations in activity from a canonical activity sequence.
AB - Robotic surgery and other minimally-invasive surgical techniques are an integral part of patient care, and readily yield large amounts of data. Surgical tool motion (kinematic data) contains information that is useful for assessment and education. Typically, assessment and education tools that rely upon the kinematic data require substantial manual processing such as activity annotations. The goal of this paper was to develop an automated method to align surgical recordings and assign activity annotations. We developed an approach based on unsupervised alignment to efficiently annotate kinematic data for its constituent activity segments. Our method includes extracting non-linear features from the kinematic data using a stacked de-noising autoencoder, and using modified dynamic time warping to align the kinematic data from different trials of the study task. We combined alignment between a test trial and one or a small set of template trials (with prior manual annotations) with voting based on kernel density estimation to transfer labels from the template to the test trial. Our experiments evaluating this method on two datasets captured in the training laboratory demonstrate an accuracy of 72% to 94% for annotating activity segments within a surgical training task. Our method is robust to data captured from several surgeons, and to deviations in activity from a canonical activity sequence.
UR - http://www.scopus.com/inward/record.url?scp=84977550696&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=84977550696&partnerID=8YFLogxK
U2 - 10.1109/ICRA.2016.7487608
DO - 10.1109/ICRA.2016.7487608
M3 - Conference contribution
AN - SCOPUS:84977550696
T3 - Proceedings - IEEE International Conference on Robotics and Automation
SP - 4158
EP - 4163
BT - 2016 IEEE International Conference on Robotics and Automation, ICRA 2016
PB - Institute of Electrical and Electronics Engineers Inc.
T2 - 2016 IEEE International Conference on Robotics and Automation, ICRA 2016
Y2 - 16 May 2016 through 21 May 2016
ER -