Adaptive empirical pattern transformation (ADEPT) with application to walking stride segmentation

Marta Karas, Marcin Strączkiewicz, William Fadel, Jaroslaw Harezlak, Ciprian M. Crainiceanu, Jacek K. Urbanek

Research output: Contribution to journal › Article › peer-review

Abstract

Quantifying gait parameters and ambulatory monitoring of changes in these parameters have become increasingly important in epidemiological and clinical studies. Using high-density accelerometry measurements, we propose adaptive empirical pattern transformation (ADEPT), a fast, scalable, and accurate method for segmentation of individual walking strides. ADEPT computes the covariance between a scaled and translated pattern function and the data, an idea similar to the continuous wavelet transform. The difference is that ADEPT uses a data-based pattern function, allows multiple pattern functions, can use distances other than the covariance, and does not require the pattern function to satisfy the wavelet admissibility condition. Compared to many existing approaches, ADEPT is designed to work with data collected at various body locations and is invariant to the direction of the accelerometer axes relative to body orientation. The method is applied to and validated on accelerometry data collected during a 450-m outdoor walk of 32 study participants wearing accelerometers on the wrist, hip, and both ankles. Additionally, all scripts and data needed to reproduce the presented results are included in the supplementary material available at Biostatistics online.
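The core idea described in the abstract can be illustrated with a short sketch: an empirical stride template is rescaled to a grid of candidate stride durations, slid along the accelerometry signal, and a similarity score is recorded for every (start point, duration) pair; local maxima of that similarity surface then mark candidate strides. The Python code below is a minimal, hypothetical illustration of that computation under these assumptions, not the authors' software or the ADEPT package API; it uses Pearson correlation as one of the distances the abstract says ADEPT allows, and all names (adept_similarity, vm, template) are made up for the example.

```python
import numpy as np

def adept_similarity(signal, template, durations_in_samples):
    """Illustrative sketch of an ADEPT-style similarity matrix.

    For each candidate stride duration, the empirical template is
    linearly interpolated to that length and correlated with every
    window of the signal of the same length. Returns a matrix of
    shape (n_durations, len(signal)) holding the correlation where a
    full window fits and NaN elsewhere.
    """
    n = len(signal)
    sim = np.full((len(durations_in_samples), n), np.nan)
    template_grid = np.linspace(0.0, 1.0, len(template))
    for i, d in enumerate(durations_in_samples):
        # Rescale the empirical pattern to the candidate duration.
        scaled = np.interp(np.linspace(0.0, 1.0, d), template_grid, template)
        scaled = (scaled - scaled.mean()) / scaled.std()
        for start in range(n - d + 1):
            window = signal[start:start + d]
            sd = window.std()
            if sd > 0:
                window = (window - window.mean()) / sd
                # Pearson correlation between scaled template and window.
                sim[i, start] = np.mean(scaled * window)
    return sim

# Hypothetical usage: vm is the accelerometer vector magnitude sampled
# at 100 Hz, template is one empirically derived stride pattern, and
# candidate stride durations span roughly 0.5-1.8 s (50-180 samples).
# sim = adept_similarity(vm, template, range(50, 181, 5))
# Local maxima of sim across both dimensions give candidate stride
# start points and their estimated durations.
```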

Original language: English (US)
Pages (from-to): 331-347
Number of pages: 17
Journal: Biostatistics (Oxford, England)
Volume: 22
Issue number: 2
DOIs
State: Published - Apr 10 2021

Keywords

  • ADEPT
  • Gait
  • Pattern segmentation
  • Physical activity
  • Walking
  • Wearable accelerometers

ASJC Scopus subject areas

  • Statistics and Probability
  • Statistics, Probability and Uncertainty
