Feasibility of a touch-free user interface for ultrasound snapshot-guided nephrostomy

Simon Kotwicz Herniczek, Andras Lasso, Tamas Ungi, Gabor Fichtinger

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

Purpose: Clinicians are often required to interact with visualization software during image-guided medical interventions, but sterility requirements forbid the use of traditional keyboard and mouse devices. In this study we assess the feasibility of using a touch-free interface in a real-time procedure by creating a full gesture-based guidance module for ultrasound snapshot-guided percutaneous nephrostomy.

Methods: The workflow for this procedure required a gesture to select between two options, a "back" and a "next" gesture, a "reset" gesture, and a way to mark a point on an image. Using an orientation sensor mounted on the hand as the input device, gesture recognition software was developed based on hand orientation changes. Five operators were recruited to train the gesture recognition software. Each participant performed each gesture ten times and placed three points on predefined target positions. They also performed tasks unrelated to the sought-after gestures to evaluate the specificity of the recognition. The orientation sensor measurements and the positions of the marked points were recorded, and the recorded data sets were used to establish threshold values and optimize the gesture recognition algorithm.

Results: For the "back", "reset", and "select option" gestures, 100% recognition accuracy was achieved; for the "next" gesture, 92%. With the optimized gesture recognition software, no misclassified gestures were observed when testing the individual gestures or when performing actions unrelated to the sought-after gestures. The mean point placement error was 0.55 mm (standard deviation 0.30 mm), and the mean placement time was 4.8 seconds.

Conclusion: The developed system is promising and demonstrates the potential of touch-free interfaces in the operating room.
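The abstract describes a threshold-based recognizer driven by hand-orientation changes. As a rough illustration of how such a recognizer could be structured, the Python sketch below detects three of the gestures from a sliding window of (roll, pitch, yaw) readings. The axis-to-gesture mappings, threshold values, window length, and all function names are illustrative assumptions; the paper derives its actual thresholds from the recorded training data, and its exact gesture definitions are not reproduced here.

# Minimal sketch of threshold-based gesture recognition from hand-orientation
# readings. All gesture-to-axis mappings and threshold values are illustrative
# assumptions, not the authors' published parameters.

from collections import deque

ROLL_THRESHOLD = 45.0   # degrees of roll change read as "next"/"back" (assumed)
PITCH_THRESHOLD = 60.0  # degrees of pitch change read as "reset" (assumed)
WINDOW = 15             # samples per decision, e.g. ~0.5 s at 30 Hz (assumed)

def classify(samples):
    """Classify a window of (roll, pitch, yaw) tuples in degrees.

    Returns 'next', 'back', 'reset', or None when no gesture is detected,
    so that unrelated hand motion does not trigger an action.
    """
    droll = samples[-1][0] - samples[0][0]
    dpitch = samples[-1][1] - samples[0][1]
    # Check the pitch excursion first so a single decisive motion wins.
    if abs(dpitch) > PITCH_THRESHOLD:
        return "reset"
    if droll > ROLL_THRESHOLD:
        return "next"
    if droll < -ROLL_THRESHOLD:
        return "back"
    return None

# Streaming usage: feed each new sensor reading into a sliding window.
window = deque(maxlen=WINDOW)

def on_sensor_reading(roll, pitch, yaw):
    window.append((roll, pitch, yaw))
    if len(window) == WINDOW:
        gesture = classify(list(window))
        if gesture is not None:
            window.clear()  # debounce: emit at most one event per motion
            return gesture
    return None

Clearing the window after a detection is one simple way to avoid firing repeated events during a single motion, in line with the specificity result reported above; the "select option" gesture and the point-marking interaction would need analogous rules and a stabilized on-screen cursor.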

Original language: English (US)
Title of host publication: Progress in Biomedical Optics and Imaging - Proceedings of SPIE
Publisher: SPIE
Volume: 9036
ISBN (Print): 9780819498298
DOI: 10.1117/12.2043564
State: Published - 2014
Externally published: Yes
Event: Medical Imaging 2014: Image-Guided Procedures, Robotic Interventions, and Modeling - San Diego, CA, United States
Duration: Feb 18, 2014 - Feb 20, 2014

Other

Event: Medical Imaging 2014: Image-Guided Procedures, Robotic Interventions, and Modeling
Country: United States
City: San Diego, CA
Period: 2/18/14 - 2/20/14

Keywords

  • gesture recognition
  • image-guided procedure
  • nephrostomy
  • operating room interface
  • touch-free interface
  • ultrasound-guided procedure

ASJC Scopus subject areas

  • Atomic and Molecular Physics, and Optics
  • Electronic, Optical and Magnetic Materials
  • Biomaterials
  • Radiology, Nuclear Medicine and Imaging

Cite this

Herniczek, S. K., Lasso, A., Ungi, T., & Fichtinger, G. (2014). Feasibility of a touch-free user interface for ultrasound snapshot-guided nephrostomy. In Progress in Biomedical Optics and Imaging - Proceedings of SPIE (Vol. 9036). [90362F] SPIE. https://doi.org/10.1117/12.2043564
