Towards automatic skill evaluation: Detection and segmentation of robot-assisted surgical motions

Henry C. Lin, Izhak Shafran, David Yuh, Gregory D. Hager

Research output: Contribution to journal › Article

Abstract

This paper reports our progress in developing techniques for "parsing" raw motion data from a simple surgical task into a labeled sequence of surgical gestures. The ability to automatically detect and segment surgical motion can be useful in evaluating surgical skill, providing surgical training feedback, or documenting essential aspects of a procedure. If processed online, the information can be used to provide context-specific information or motion enhancements to the surgeon. However, in every case, the key step is to relate recorded motion data to a model of the procedure being performed. Robotic surgical systems such as the da Vinci system from Intuitive Surgical provide a rich source of motion and video data from surgical procedures. The application programming interface (API) of the da Vinci outputs 192 kinematics values at 10 Hz. Through a series of feature-processing steps, tailored to this task, the highly redundant features are projected to a compact and discriminative space. The resulting classifier is simple and effective. Cross-validation experiments show that the proposed approach can achieve accuracies higher than 90% when segmenting gestures in a 4-throw suturing task, for both expert and intermediate surgeons. These preliminary results suggest that gesture-specific features can be extracted to provide highly accurate surgical skill evaluation.
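The abstract describes projecting 192 highly redundant kinematic values (sampled at 10 Hz from the da Vinci API) into a compact, discriminative space and then labeling each frame with a gesture class. As a rough illustration of that idea, the sketch below applies a Fisher-style linear discriminant projection followed by a nearest-class-mean classifier to synthetic data. This is not the paper's exact pipeline; the data, function names, and parameters are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def fit_lda(X, y, n_components=2):
    """Fisher-style linear discriminant projection (illustrative sketch).

    Finds directions maximizing between-class scatter relative to
    within-class scatter, a common way to compress redundant features
    into a discriminative subspace.
    """
    classes = np.unique(y)
    mean = X.mean(axis=0)
    Sw = np.zeros((X.shape[1], X.shape[1]))  # within-class scatter
    Sb = np.zeros_like(Sw)                   # between-class scatter
    for c in classes:
        Xc = X[y == c]
        mc = Xc.mean(axis=0)
        Sw += (Xc - mc).T @ (Xc - mc)
        d = (mc - mean)[:, None]
        Sb += len(Xc) * (d @ d.T)
    # Solve Sw^-1 Sb (regularized for numerical stability) and keep the
    # top eigenvectors as the projection matrix.
    evals, evecs = np.linalg.eig(
        np.linalg.solve(Sw + 1e-6 * np.eye(Sw.shape[0]), Sb))
    order = np.argsort(-evals.real)
    return evecs.real[:, order[:n_components]]

def nearest_mean_predict(Z_train, y_train, Z_test):
    """Assign each projected frame to the class with the closest mean."""
    classes = np.unique(y_train)
    means = np.stack([Z_train[y_train == c].mean(axis=0) for c in classes])
    dists = np.linalg.norm(Z_test[:, None, :] - means[None, :, :], axis=2)
    return classes[np.argmin(dists, axis=1)]

# Synthetic stand-in for 192-dimensional kinematic frames spanning
# three hypothetical gesture classes (not real da Vinci data).
n_per, dim = 100, 192
X = np.vstack([rng.normal(loc=c, scale=1.0, size=(n_per, dim))
               for c in range(3)])
y = np.repeat(np.arange(3), n_per)

W = fit_lda(X, y, n_components=2)   # 192-dim -> 2-dim projection
Z = X @ W
pred = nearest_mean_predict(Z, y, Z)
accuracy = (pred == y).mean()
print(f"frame-level accuracy on synthetic data: {accuracy:.2f}")
```

On well-separated synthetic classes like these, the projection plus a simple classifier labels nearly every frame correctly, which matches the abstract's point that once the features are projected into a discriminative space, the resulting classifier can be simple and effective.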

Original language: English (US)
Pages (from-to): 220-230
Number of pages: 11
Journal: Computer Aided Surgery
Volume: 11
Issue number: 5
DOIs
State: Published - Sep 1 2006

Keywords

  • Robotic surgery
  • Surgical modeling
  • Surgical skill evaluation
  • Surgical training

ASJC Scopus subject areas

  • Surgery
  • Radiology, Nuclear Medicine and Imaging

