Demonstration of a semi-autonomous hybrid brain-machine interface using human intracranial EEG, eye tracking, and computer vision to control a robotic upper limb prosthetic

David P. McMullen, Guy Hotson, Kapil D. Katyal, Brock A. Wester, Matthew S. Fifer, Timothy G. McGee, Andrew Harris, Matthew S. Johannes, R. Jacob Vogelstein, Alan D. Ravitz, William S. Anderson, Nitish V. Thakor, Nathan E. Crone

Research output: Contribution to journal › Article

Abstract

To increase the ability of brain-machine interfaces (BMIs) to control advanced prostheses such as the modular prosthetic limb (MPL), we are developing a novel system: the Hybrid Augmented Reality Multimodal Operation Neural Integration Environment (HARMONIE). This system utilizes hybrid input, supervisory control, and intelligent robotics to allow users to identify an object (via eye tracking and computer vision) and initiate (via brain control) a semi-autonomous reach-grasp-and-drop of the object by the MPL. Sequential iterations of HARMONIE were tested in two pilot subjects implanted with electrocorticographic (ECoG) and depth electrodes within motor areas. The subjects performed the complex task in 71.4% (20/28) and 67.7% (21/31) of trials after minimal training. Balanced accuracy for detecting movements was 91.1% and 92.9%, significantly greater than chance accuracies (p < 0.05). After BMI-based initiation, the MPL completed the entire task 100% (one object) and 70% (three objects) of the time. The MPL took approximately 12.2 s for task completion after system improvements implemented for the second subject. Our hybrid-BMI design prevented all but one baseline false positive from initiating the system. The novel approach demonstrated in this proof-of-principle study, using hybrid input, supervisory control, and intelligent robotics, addresses limitations of current BMIs.
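The abstract describes a supervisory-control pipeline (gaze plus computer vision selects a target, a detected neural "go" signal initiates an autonomous robotic routine) and reports balanced accuracy as the detection metric. The sketch below is a hypothetical illustration of that control flow, not the authors' implementation: every name in it (`Target`, `detect_movement_intent`, `reach_grasp_drop`, `supervisory_loop`, the threshold feature) is an illustrative assumption. The `balanced_accuracy` helper shows the standard definition of the reported metric, the mean of sensitivity and specificity.

```python
# Hypothetical sketch of a HARMONIE-style supervisory control loop.
# All classes, functions, and parameters here are illustrative assumptions.

from dataclasses import dataclass


@dataclass
class Target:
    """An object located via eye tracking and segmented by computer vision."""
    name: str
    x: float
    y: float
    z: float


def detect_movement_intent(neural_sample: list[float], threshold: float) -> bool:
    """Stand-in for the ECoG/depth-electrode movement detector.

    The real system classifies motor-area activity; here we simply
    threshold the mean of a feature vector.
    """
    return sum(neural_sample) / len(neural_sample) > threshold


def reach_grasp_drop(target: Target) -> bool:
    """Stand-in for the prosthetic limb's autonomous reach-grasp-and-drop routine."""
    print(f"MPL executing reach-grasp-and-drop on {target.name}")
    return True


def supervisory_loop(gaze_target: Target, neural_sample: list[float]) -> bool:
    """Hybrid input: the user selects a target by gaze; a detected neural
    'go' signal initiates the autonomous robotic action. Baseline activity
    below threshold does not trigger the system (no false-positive start)."""
    if detect_movement_intent(neural_sample, threshold=0.5):
        return reach_grasp_drop(gaze_target)
    return False


def balanced_accuracy(tp: int, fn: int, tn: int, fp: int) -> float:
    """Balanced accuracy = (sensitivity + specificity) / 2."""
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    return (sensitivity + specificity) / 2
```

For example, a classifier with 9 true positives, 1 false negative, 9 true negatives, and 1 false positive yields `balanced_accuracy(9, 1, 9, 1) == 0.9`, the kind of figure the abstract reports (91.1% and 92.9%).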

Original language: English (US)
Article number: 6683036
Pages (from-to): 784-796
Number of pages: 13
Journal: IEEE Transactions on Neural Systems and Rehabilitation Engineering
ISSN: 1534-4320
Volume: 22
Issue number: 4
DOI: 10.1109/TNSRE.2013.2294685
Publisher: Institute of Electrical and Electronics Engineers Inc.
State: Published - 2014

Keywords

  • Brain-computer interface (BCI)
  • brain-machine interface (BMI)
  • electrocorticography (ECoG)
  • hybrid BCI
  • intelligent robotics
  • intracranial EEG (iEEG)

ASJC Scopus subject areas

  • Neuroscience (all)
  • Computer Science Applications
  • Biomedical Engineering
  • Medicine (all)

Cite this

Demonstration of a semi-autonomous hybrid brain-machine interface using human intracranial EEG, eye tracking, and computer vision to control a robotic upper limb prosthetic. / McMullen, David P.; Hotson, Guy; Katyal, Kapil D.; Wester, Brock A.; Fifer, Matthew S.; McGee, Timothy G.; Harris, Andrew; Johannes, Matthew S.; Vogelstein, R. Jacob; Ravitz, Alan D.; Anderson, William S; Thakor, Nitish V; Crone, Nathan E.

In: IEEE Transactions on Neural Systems and Rehabilitation Engineering, Vol. 22, No. 4, 6683036, 2014, p. 784-796.

@article{74545a81fdda4520b0c76b215fbd78cb,
title = "Demonstration of a semi-autonomous hybrid brain-machine interface using human intracranial EEG, eye tracking, and computer vision to control a robotic upper limb prosthetic",
abstract = "To increase the ability of brain-machine interfaces (BMIs) to control advanced prostheses such as the modular prosthetic limb (MPL), we are developing a novel system: the Hybrid Augmented Reality Multimodal Operation Neural Integration Environment (HARMONIE). This system utilizes hybrid input, supervisory control, and intelligent robotics to allow users to identify an object (via eye tracking and computer vision) and initiate (via brain-control) a semi-autonomous reach-grasp-and-drop of the object by the MPL. Sequential iterations of HARMONIE were tested in two pilot subjects implanted with electrocorticographic (ECoG) and depth electrodes within motor areas. The subjects performed the complex task in 71.4{\%} (20/28) and 67.7{\%} (21/31) of trials after minimal training. Balanced accuracy for detecting movements was 91.1{\%} and 92.9{\%}, significantly greater than chance accuracies (p <0.05). After BMI-based initiation, the MPL completed the entire task 100{\%} (one object) and 70{\%} (three objects) of the time. The MPL took approximately 12.2 s for task completion after system improvements implemented for the second subject. Our hybrid-BMI design prevented all but one baseline false positive from initiating the system. The novel approach demonstrated in this proof-of-principle study, using hybrid input, supervisory control, and intelligent robotics, addresses limitations of current BMIs.",
keywords = "Brain-computer interface (BCI), brain-machine interface (BMI), electrocorticography (ECoG), hybrid BCI, intelligent robotics, intracranial EEG (iEEG)",
author = "McMullen, {David P.} and Guy Hotson and Katyal, {Kapil D.} and Wester, {Brock A.} and Fifer, {Matthew S.} and McGee, {Timothy G.} and Andrew Harris and Johannes, {Matthew S.} and Vogelstein, {R. Jacob} and Ravitz, {Alan D.} and Anderson, {William S} and Thakor, {Nitish V} and Crone, {Nathan E}",
year = "2014",
doi = "10.1109/TNSRE.2013.2294685",
language = "English (US)",
volume = "22",
pages = "784--796",
journal = "IEEE Transactions on Neural Systems and Rehabilitation Engineering",
issn = "1534-4320",
publisher = "Institute of Electrical and Electronics Engineers Inc.",
number = "4",

}
