Feasibility of a touch-free user interface for ultrasound snapshot-guided nephrostomy

Simon Kotwicz Herniczek, Andras Lasso, Tamas Ungi, Gabor Fichtinger

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

4 Scopus citations

Abstract

Purpose: Clinicians are often required to interact with visualization software during image-guided medical interventions, but sterility requirements forbid the use of traditional keyboard and mouse devices. In this study we attempt to determine the feasibility of using a touch-free interface in a real-time procedure by creating a full gesture-based guidance module for ultrasound snapshot-guided percutaneous nephrostomy. Methods: The workflow for this procedure required a gesture to select between two options, a "back" and "next" gesture, a "reset" gesture, and a way to mark a point on an image. Using an orientation sensor mounted on the hand as the input device, gesture recognition software was developed based on hand orientation changes. Five operators were recruited to train the developed gesture recognition software. The participants performed each gesture ten times and placed three points on predefined target positions. They also performed tasks unrelated to the sought-after gestures to evaluate the specificity of the gesture recognition. The orientation sensor measurements and the positions of the marked points were recorded. The recorded data sets were used to establish threshold values and optimize the gesture recognition algorithm. Results: For the "back", "reset", and "select option" gestures, 100% recognition accuracy was achieved. For the "next" gesture, 92% recognition accuracy was obtained. With the optimized gesture recognition software, no misclassified gestures were observed when testing the individual gestures or when performing actions unrelated to the sought-after gestures. The mean point placement error was 0.55 mm with a standard deviation of 0.30 mm. The mean placement time was 4.8 seconds. Conclusion: The developed system is promising and demonstrates the potential of touch-free interfaces in the operating room.
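The abstract describes a threshold-based classifier operating on changes in hand orientation reported by a wearable sensor. The sketch below illustrates how such a classifier might be structured, assuming the sensor reports roll/pitch/yaw angles; the gesture names follow the workflow in the abstract, but the axes, thresholds, and window handling are illustrative placeholders, not the values established in the study.

```python
# Minimal sketch of threshold-based gesture recognition from hand-orientation
# samples. Gesture names ("next", "back", "reset", "select") follow the paper's
# workflow; the axes and degree thresholds below are hypothetical placeholders.

from dataclasses import dataclass
from typing import List, Optional


@dataclass
class OrientationSample:
    roll: float   # degrees
    pitch: float  # degrees
    yaw: float    # degrees


# Hypothetical per-gesture thresholds: minimum orientation change (degrees)
# on one axis within the observation window required to trigger the gesture.
GESTURE_THRESHOLDS = {
    "next":   ("yaw",   +45.0),   # rotate hand right
    "back":   ("yaw",   -45.0),   # rotate hand left
    "reset":  ("roll",  +60.0),   # roll the hand over
    "select": ("pitch", -40.0),   # tilt the hand down
}


def classify_gesture(window: List[OrientationSample]) -> Optional[str]:
    """Return the first gesture whose axis change exceeds its threshold,
    or None if no gesture is recognized (unrelated motion is ignored)."""
    if len(window) < 2:
        return None
    start, end = window[0], window[-1]
    deltas = {
        "roll":  end.roll - start.roll,
        "pitch": end.pitch - start.pitch,
        "yaw":   end.yaw - start.yaw,
    }
    for gesture, (axis, threshold) in GESTURE_THRESHOLDS.items():
        change = deltas[axis]
        # A positive threshold requires rotation in the positive direction,
        # a negative threshold in the negative direction.
        if threshold > 0 and change >= threshold:
            return gesture
        if threshold < 0 and change <= threshold:
            return gesture
    return None


if __name__ == "__main__":
    # Example: the hand rotates right by about 50 degrees over the window.
    window = [OrientationSample(0, 0, 0),
              OrientationSample(2, 1, 25),
              OrientationSample(3, 1, 50)]
    print(classify_gesture(window))  # -> "next"
```

In the study, the recorded training data from the five operators would be used to tune such thresholds so that unrelated hand motions do not cross them, which is what the reported specificity result measures.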

Original language: English (US)
Title of host publication: Progress in Biomedical Optics and Imaging - Proceedings of SPIE
Publisher: SPIE
Volume: 9036
ISBN (Print): 9780819498298
DOIs
State: Published - 2014
Externally published: Yes
Event: Medical Imaging 2014: Image-Guided Procedures, Robotic Interventions, and Modeling - San Diego, CA, United States
Duration: Feb 18 2014 - Feb 20 2014

Other

Other: Medical Imaging 2014: Image-Guided Procedures, Robotic Interventions, and Modeling
Country/Territory: United States
City: San Diego, CA
Period: 2/18/14 - 2/20/14

Keywords

  • gesture recognition
  • image-guided procedure
  • nephrostomy
  • operating room interface
  • touch-free interface
  • ultrasound-guided procedure

ASJC Scopus subject areas

  • Atomic and Molecular Physics, and Optics
  • Electronic, Optical and Magnetic Materials
  • Biomaterials
  • Radiology, Nuclear Medicine and Imaging
