TY - GEN
T1 - HARMONIE
T2 - 2013 6th International IEEE EMBS Conference on Neural Engineering, NER 2013
AU - Katyal, Kapil D.
AU - Johannes, Matthew S.
AU - McGee, Timothy G.
AU - Harris, Andrew J.
AU - Armiger, Robert S.
AU - Firpi, Alex H.
AU - McMullen, David
AU - Hotson, Guy
AU - Fifer, Matthew S.
AU - Crone, Nathan E.
AU - Vogelstein, R. Jacob
AU - Wester, Brock A.
PY - 2013/12/1
Y1 - 2013/12/1
N2 - Effective user control of highly dexterous robotic assistive devices requires intuitive and natural modalities. Although surgically implanted brain-computer interfaces (BCIs) strive to achieve this, a number of non-invasive engineering solutions may provide a quicker path to patient use by eliminating surgical implantation. We present the development of a semi-autonomous control system that utilizes computer vision, prosthesis feedback, effector-centric device control, smooth movement trajectories, and appropriate hand conformations to interact with objects of interest. Users can direct a prosthetic limb through an intuitive graphical user interface to complete multi-stage tasks using patient-appropriate combinations of control inputs such as eye tracking, conventional prosthetic controls/joysticks, surface electromyography (sEMG) signals, and neural interfaces (ECoG, EEG). Aligned with activities of daily living (ADL), these tasks include directing the prosthetic to specific locations or objects, grasping objects by modulating hand conformation, and acting upon grasped objects, such as self-feeding. This Hybrid Augmented Reality Multimodal Operation Neural Integration Environment (HARMONIE) semi-autonomous control system lowers the user's cognitive load, leaving the bulk of command and control of the device to the computer. This flexible and intuitive control system could serve patient populations ranging from wheelchair-bound quadriplegics to upper-limb amputees.
AB - Effective user control of highly dexterous robotic assistive devices requires intuitive and natural modalities. Although surgically implanted brain-computer interfaces (BCIs) strive to achieve this, a number of non-invasive engineering solutions may provide a quicker path to patient use by eliminating surgical implantation. We present the development of a semi-autonomous control system that utilizes computer vision, prosthesis feedback, effector-centric device control, smooth movement trajectories, and appropriate hand conformations to interact with objects of interest. Users can direct a prosthetic limb through an intuitive graphical user interface to complete multi-stage tasks using patient-appropriate combinations of control inputs such as eye tracking, conventional prosthetic controls/joysticks, surface electromyography (sEMG) signals, and neural interfaces (ECoG, EEG). Aligned with activities of daily living (ADL), these tasks include directing the prosthetic to specific locations or objects, grasping objects by modulating hand conformation, and acting upon grasped objects, such as self-feeding. This Hybrid Augmented Reality Multimodal Operation Neural Integration Environment (HARMONIE) semi-autonomous control system lowers the user's cognitive load, leaving the bulk of command and control of the device to the computer. This flexible and intuitive control system could serve patient populations ranging from wheelchair-bound quadriplegics to upper-limb amputees.
KW - Assistive robotics
KW - Brain-computer interface
KW - Brain-machine interface
KW - Computer vision
KW - Hybrid BCI/BMI
KW - Intelligent robotics
KW - Modular prosthetic limb
KW - Neural prosthetic system
KW - Prosthetics
KW - Robotic limb
KW - Semi-autonomous
UR - http://www.scopus.com/inward/record.url?scp=84897704612&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=84897704612&partnerID=8YFLogxK
U2 - 10.1109/NER.2013.6696173
DO - 10.1109/NER.2013.6696173
M3 - Conference contribution
AN - SCOPUS:84897704612
SN - 9781467319690
T3 - International IEEE/EMBS Conference on Neural Engineering, NER
SP - 1274
EP - 1278
BT - 2013 6th International IEEE EMBS Conference on Neural Engineering, NER 2013
Y2 - 6 November 2013 through 8 November 2013
ER -