This paper presents a wireless kinematic tracking framework for biomechanical analysis during rehabilitative tasks in augmented and virtual reality. The framework uses low-cost inertial measurement units and exploits the rigid connections of the human skeletal system to provide egocentric joint position estimates with centimeter accuracy. On-board sensor fusion combines measurements from three-axis accelerometers, gyroscopes, and magnetometers to produce robust estimates in real time. Sensor precision and accuracy were validated by computing the root mean square error of estimated joint angles against ground-truth goniometer measurements. The sensor network achieved a mean joint-angle accuracy of 2.81° with 1.06° precision, corresponding to a maximum hand tracking error of 7.06 cm. As an application, the network was used to collect kinematic data from an unconstrained object manipulation task in augmented reality, from which dynamic movement primitives were extracted to characterize natural task completion in N = 3 able-bodied human subjects. These primitives were then leveraged for trajectory estimation in both a generalized and a subject-specific scheme, yielding regression accuracies of 0.187 cm and 0.161 cm, respectively. The proposed kinematic tracking network is wireless, accurate, and especially well suited to predicting voluntary actuation in virtual and augmented reality applications.