Multiresolutional distributed filtering: A novel technique that reduces the amount of data required in high resolution electrocardiography

Mihai Popescu, Paul Cristea, Anastasios Bezerianos

Research output: Contribution to journal › Article › peer-review

14 Scopus citations

Abstract

High resolution ECG analysis is widely accepted as the best non-invasive technique for assessing the risk of ventricular tachycardia in post-myocardial infarction patients. However, the standard analysis approaches involve an extensive averaging procedure that requires long data records, with the consequent burden of storage and transmission. This paper outlines an algorithm for multiresolutional distributed filtering that can significantly reduce the amount of data needed. The proposed filtering method comprises three basic steps: computation of the dyadic wavelet transform, shrinkage of the wavelet coefficients using adaptive Bayesian rules, and reconstruction of the denoised signal through the inverse wavelet transform. Performance evaluation in controlled simulation experiments showed that the technique can accelerate noise reduction while preserving the diagnostic value of the signals.
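The three-step pipeline the abstract describes (wavelet decomposition, coefficient shrinkage, inverse reconstruction) can be sketched as follows. This is a minimal illustration, not the authors' method: it assumes the PyWavelets library, substitutes a simple universal soft threshold for the paper's adaptive Bayesian shrinkage rules, and uses the decimated DWT in place of the dyadic wavelet transform; the wavelet family and decomposition level are arbitrary choices.

```python
# Illustrative sketch only: the paper's adaptive Bayesian shrinkage rules and
# dyadic (translation-invariant) wavelet transform are not reproduced here.
import numpy as np
import pywt

def wavelet_denoise(signal, wavelet="db4", level=4):
    """Denoise a 1-D signal via the abstract's three steps:
    forward wavelet transform, coefficient shrinkage, inverse transform."""
    # Step 1: multilevel wavelet decomposition (stand-in for the dyadic WT).
    coeffs = pywt.wavedec(signal, wavelet, level=level)

    # Step 2: shrink the detail coefficients. The paper uses adaptive
    # Bayesian rules; here we substitute the universal soft threshold, with
    # the noise level estimated from the finest-scale detail coefficients.
    sigma = np.median(np.abs(coeffs[-1])) / 0.6745
    thresh = sigma * np.sqrt(2.0 * np.log(len(signal)))
    denoised = [coeffs[0]] + [
        pywt.threshold(c, thresh, mode="soft") for c in coeffs[1:]
    ]

    # Step 3: reconstruct the denoised signal with the inverse transform.
    return pywt.waverec(denoised, wavelet)[: len(signal)]

# Usage on a synthetic, hypothetical beat-like waveform.
if __name__ == "__main__":
    t = np.linspace(0, 1, 1024)
    clean = np.exp(-((t - 0.5) ** 2) / 0.001)  # crude QRS-like spike
    noisy = clean + 0.05 * np.random.randn(t.size)
    print(np.std(noisy - clean), np.std(wavelet_denoise(noisy) - clean))
```

In this sketch a single denoising pass plays the role that extensive beat averaging plays in the standard approach, which is the sense in which wavelet-domain filtering can reduce the length of the required data records.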

Original language: English (US)
Pages (from-to): 195-209
Number of pages: 15
Journal: Future Generation Computer Systems
Volume: 15
Issue number: 2
DOIs
State: Published - Mar 11 1999
Externally published: Yes

ASJC Scopus subject areas

  • Software
  • Hardware and Architecture
  • Computer Networks and Communications
