TY - GEN
T1 - A Mixed-Reality Training Environment for Upper Limb Prosthesis Control
AU - Sharma, Avinash
AU - Hunt, Christopher L.
AU - Maheshwari, Asheesh
AU - Osborn, Luke
AU - Levay, Gyorgy
AU - Kaliki, Rahul R.
AU - Soares, Alcimar B.
AU - Thakor, Nitish
N1 - Publisher Copyright:
© 2018 IEEE.
PY - 2018/12/20
Y1 - 2018/12/20
N2 - Adjusting to an amputation can often be difficult for the body. Post-surgery, amputees not only incur expensive rehabilitation treatment costs but also may wait up to several months before receiving a properly fitted prosthesis. We developed a mixed-reality training environment in which amputees can train at their own time and convenience, interacting with holographic objects while receiving tactile and proprioceptive feedback. We incorporate positional information through inertial sensors and convey touch and proprioception through vibrational feedback, all integrated into an augmented-reality (AR) environment viewed through the Microsoft HoloLens™. Training tasks were designed to account for limb rotation and object relocation in three-dimensional space, with correct palm orientation essential for intuitive grasp and release of objects. Our results showed improved performance in training time, overshoot, and completion rate with vibratory feedback (of both touch and proprioception) compared to no feedback. Furthermore, EMG activity was analyzed to estimate muscular effort during each task.
AB - Adjusting to an amputation can often be difficult for the body. Post-surgery, amputees not only incur expensive rehabilitation treatment costs but also may wait up to several months before receiving a properly fitted prosthesis. We developed a mixed-reality training environment in which amputees can train at their own time and convenience, interacting with holographic objects while receiving tactile and proprioceptive feedback. We incorporate positional information through inertial sensors and convey touch and proprioception through vibrational feedback, all integrated into an augmented-reality (AR) environment viewed through the Microsoft HoloLens™. Training tasks were designed to account for limb rotation and object relocation in three-dimensional space, with correct palm orientation essential for intuitive grasp and release of objects. Our results showed improved performance in training time, overshoot, and completion rate with vibratory feedback (of both touch and proprioception) compared to no feedback. Furthermore, EMG activity was analyzed to estimate muscular effort during each task.
UR - http://www.scopus.com/inward/record.url?scp=85060896152&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85060896152&partnerID=8YFLogxK
U2 - 10.1109/BIOCAS.2018.8584739
DO - 10.1109/BIOCAS.2018.8584739
M3 - Conference contribution
AN - SCOPUS:85060896152
T3 - 2018 IEEE Biomedical Circuits and Systems Conference, BioCAS 2018 - Proceedings
BT - 2018 IEEE Biomedical Circuits and Systems Conference, BioCAS 2018 - Proceedings
PB - Institute of Electrical and Electronics Engineers Inc.
T2 - 2018 IEEE Biomedical Circuits and Systems Conference, BioCAS 2018
Y2 - 17 October 2018 through 19 October 2018
ER -