System events: readily accessible features for surgical phase detection

Anand Malpani, Colin Lea, Chi Chiung Grace Chen, Gregory Hager

Research output: Contribution to journal › Article


Purpose: Surgical phase recognition from sensor data is challenging due to high variation in patient anatomy and surgeon-specific operating styles. Segmenting surgical procedures into their constituent phases is of significant utility for resident training, education, self-review, and context-aware operating room technologies. Phase annotation is a highly labor-intensive task and would benefit greatly from automated solutions.

Methods: We propose a novel approach using system events, for example activation of cautery tools, that are easily captured in most surgical procedures. Our method extracts event-based features over 90-s intervals and assigns a phase label to each interval. We explore three classification techniques: support vector machines, random forests, and temporal convolutional neural networks. Each of these models independently predicts a label for each time interval. We also examine segmental inference using an approach based on the semi-Markov conditional random field, which jointly performs phase segmentation and classification. Our method is evaluated on a data set of 24 robot-assisted hysterectomy procedures.

Results: Our framework detects surgical phases with an accuracy of 74% using event-based features over a set of five phases: ligation, dissection, colpotomy, cuff closure, and background. Precision and recall for the cuff closure (precision: 83%, recall: 98%) and dissection (precision: 75%, recall: 88%) classes were higher than for the other classes. The normalized Levenshtein distance between the predicted and ground-truth phase sequences was 25%.

Conclusions: Our findings demonstrate that system-event features are useful for automatically detecting surgical phases. Events contain phase information that cannot be obtained from motion data and that would require advanced computer vision algorithms to extract from video. Many of these events are not specific to robotic surgery and can easily be recorded in non-robotic surgical modalities. In future work, we plan to combine information from system events, tool motion, and video to automate phase detection in surgical procedures.
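The Methods section describes extracting event-based features over 90-s intervals, with each interval receiving a phase label. A minimal sketch of that binning step, assuming events arrive as (timestamp, event-type) pairs and features are per-interval event counts (both the input representation and the specific event names are our assumptions, not details from the paper):

```python
from collections import Counter

WINDOW_S = 90  # interval length used in the paper


def event_features(events, event_types, duration_s, window_s=WINDOW_S):
    """Count each event type within consecutive fixed-length windows.

    events: iterable of (timestamp_s, event_type) pairs.
    Returns one count vector (ordered by event_types) per window;
    a classifier (SVM, random forest, ...) would then label each vector.
    """
    n_windows = (duration_s + window_s - 1) // window_s  # ceiling division
    counts = [Counter() for _ in range(n_windows)]
    for t, ev in events:
        idx = min(int(t // window_s), n_windows - 1)  # clamp events at the end
        counts[idx][ev] += 1
    return [[c[ev] for ev in event_types] for c in counts]


# Hypothetical event stream for a 3-minute segment
events = [(5.0, "cautery_on"), (80.0, "camera_move"), (100.0, "cautery_on")]
feats = event_features(events, ["cautery_on", "camera_move"], duration_s=180)
# feats -> [[1, 1], [1, 0]]: two 90-s intervals, one count vector each
```

Each row of `feats` is then an independent training or test example for the interval-level classifiers named in the abstract.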
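The Results section reports a normalized Levenshtein distance of 25% between the predicted and ground-truth phase sequences. A plain-Python sketch of that metric (normalizing by the longer sequence length is our assumption; the paper may normalize differently):

```python
def normalized_levenshtein(pred, truth):
    """Edit distance between two phase-label sequences,
    normalized by the length of the longer sequence."""
    m, n = len(pred), len(truth)
    if max(m, n) == 0:
        return 0.0
    prev = list(range(n + 1))  # distances for the empty prefix of pred
    for i in range(1, m + 1):
        cur = [i] + [0] * n
        for j in range(1, n + 1):
            cost = 0 if pred[i - 1] == truth[j - 1] else 1
            cur[j] = min(prev[j] + 1,        # deletion
                         cur[j - 1] + 1,     # insertion
                         prev[j - 1] + cost)  # substitution / match
        prev = cur
    return prev[n] / max(m, n)


# Hypothetical phase sequences (one label per segment)
score = normalized_levenshtein(
    ["ligation", "dissection", "colpotomy"],
    ["ligation", "dissection", "cuff_closure"],
)
# score -> 1/3: one substitution over three segments
```

A lower score indicates that the predicted phase ordering more closely matches the ground truth.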

Original language: English (US)
Pages (from-to): 1-9
Number of pages: 9
Journal: International Journal of Computer Assisted Radiology and Surgery
State: Accepted/In press - May 13 2016


Keywords

  • Robot-assisted surgery
  • Sensor data
  • Surgical phase detection
  • Surgical process modeling
  • Surgical task flow
  • Surgical workflow analysis
  • System events

ASJC Scopus subject areas

  • Radiology, Nuclear Medicine and Imaging
  • Health Informatics
  • Surgery
