TY - GEN
T1 - Predictive trajectory estimation during rehabilitative tasks in augmented reality using inertial sensors
AU - Hunt, Christopher L.
AU - Sharma, Avinash
AU - Osborn, Luke E.
AU - Kaliki, Rahul R.
AU - Thakor, Nitish V.
N1 - Funding Information:
The authors would like to thank the human subjects who participated in this study; Infinite Biomedical Technologies; the Applied Physics Laboratory; the National Institutes of Health; and The Johns Hopkins University. This work was supported in part by the National Institutes of Health under Grants No. T32EB00338312 and R44HD072668.
Publisher Copyright:
© 2018 IEEE.
PY - 2018/12/20
Y1 - 2018/12/20
N2 - This paper presents a wireless kinematic tracking framework used for biomechanical analysis during rehabilitative tasks in augmented and virtual reality. The framework uses low-cost inertial measurement units and exploits the rigid connections of the human skeletal system to provide egocentric position estimates of joints to centimeter accuracy. On-board sensor fusion combines information from three-axis accelerometers, gyroscopes, and magnetometers to provide robust estimates in real time. Sensor precision and accuracy were validated using the root mean square error of estimated joint angles against ground-truth goniometer measurements. The sensor network produced a mean estimate accuracy of 2.81° with 1.06° precision, resulting in a maximum hand tracking error of 7.06 cm. As an application, the network is used to collect kinematic information from an unconstrained object manipulation task in augmented reality, from which dynamic movement primitives are extracted to characterize natural task completion in N = 3 able-bodied human subjects. These primitives are then leveraged for trajectory estimation in both a generalized and a subject-specific scheme, resulting in 0.187 cm and 0.161 cm regression accuracy, respectively. Our proposed kinematic tracking network is wireless, accurate, and especially useful for predicting voluntary actuation in virtual and augmented reality applications.
AB - This paper presents a wireless kinematic tracking framework used for biomechanical analysis during rehabilitative tasks in augmented and virtual reality. The framework uses low-cost inertial measurement units and exploits the rigid connections of the human skeletal system to provide egocentric position estimates of joints to centimeter accuracy. On-board sensor fusion combines information from three-axis accelerometers, gyroscopes, and magnetometers to provide robust estimates in real time. Sensor precision and accuracy were validated using the root mean square error of estimated joint angles against ground-truth goniometer measurements. The sensor network produced a mean estimate accuracy of 2.81° with 1.06° precision, resulting in a maximum hand tracking error of 7.06 cm. As an application, the network is used to collect kinematic information from an unconstrained object manipulation task in augmented reality, from which dynamic movement primitives are extracted to characterize natural task completion in N = 3 able-bodied human subjects. These primitives are then leveraged for trajectory estimation in both a generalized and a subject-specific scheme, resulting in 0.187 cm and 0.161 cm regression accuracy, respectively. Our proposed kinematic tracking network is wireless, accurate, and especially useful for predicting voluntary actuation in virtual and augmented reality applications.
UR - http://www.scopus.com/inward/record.url?scp=85060857799&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85060857799&partnerID=8YFLogxK
U2 - 10.1109/BIOCAS.2018.8584805
DO - 10.1109/BIOCAS.2018.8584805
M3 - Conference contribution
AN - SCOPUS:85060857799
T3 - 2018 IEEE Biomedical Circuits and Systems Conference, BioCAS 2018 - Proceedings
BT - 2018 IEEE Biomedical Circuits and Systems Conference, BioCAS 2018 - Proceedings
PB - Institute of Electrical and Electronics Engineers Inc.
T2 - 2018 IEEE Biomedical Circuits and Systems Conference, BioCAS 2018
Y2 - 17 October 2018 through 19 October 2018
ER -