Discounting older data

Kyle A. Caudle, Gary O. Fowler, Leah Ruth Jager, David M. Ruth

Research output: Contribution to journal › Article

Abstract

This article describes two general methods for discounting older data in the real-time analysis of a data stream. In the first method, the distribution of the data stream is estimated by a series of orthogonal basis functions, and the coefficients of this estimate are updated as new data arrive by combining windowing and exponential smoothing techniques. The second method involves sequential hypothesis testing: when new data arrive, the significance level of the test is adjusted by alpha-investing, which raises or lowers the significance level of subsequent hypothesis tests depending on whether the previous test rejects or fails to reject the null hypothesis. Both methods are nonparametric.
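The two methods can be sketched in code. The article does not spell out its update rules here, so everything below is an illustrative assumption: the cosine basis on [0, 1], the smoothing constant `theta`, and the particular wealth rule (written in the spirit of Foster and Stine's alpha-investing) stand in for whatever formulations the article actually uses.

```python
import math

# --- Method 1: streaming density estimate with exponentially smoothed
# coefficients. Assume a density on [0, 1] expanded in the orthonormal
# cosine basis, f(x) ~ 1 + sum_k c_k * phi_k(x). Each coefficient is
# c_k = E[phi_k(X)], so it can be updated recursively as data arrive,
# with older observations discounted geometrically.

def phi(k, x):
    """k-th orthonormal cosine basis function on [0, 1]."""
    return math.sqrt(2.0) * math.cos(k * math.pi * x)

def update_coeffs(coeffs, x_new, theta=0.05):
    """Exponential smoothing: each old coefficient's weight decays by (1 - theta)."""
    return [(1.0 - theta) * c + theta * phi(k, x_new)
            for k, c in enumerate(coeffs, start=1)]

def density(coeffs, x):
    """Evaluate the current density estimate at x."""
    return 1.0 + sum(c * phi(k, x) for k, c in enumerate(coeffs, start=1))

# --- Method 2: alpha-investing for a sequence of hypothesis tests.
# A rejection adds a payout to the "alpha-wealth"; a failure to reject
# spends alpha / (1 - alpha) of it, so later tests run at a higher or
# lower significance level depending on earlier outcomes.

def alpha_invest(wealth, p_value, payout=0.05, spend_frac=0.5):
    """Run one sequential test; return (reject, new_wealth, alpha_used)."""
    # Choosing alpha = s*W / (1 + s*W) gives alpha / (1 - alpha) = s*W <= W,
    # so the wealth can never go negative.
    alpha = spend_frac * wealth / (1.0 + spend_frac * wealth)
    if p_value <= alpha:
        return True, wealth + payout, alpha
    return False, wealth - alpha / (1.0 - alpha), alpha
```

Both routines share the discounting idea the abstract describes: in the first, old observations fade geometrically out of the coefficient estimates; in the second, old test outcomes fade out of the wealth that sets each new significance level.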

Original language: English (US)
Pages (from-to): 30-33
Number of pages: 4
Journal: Wiley Interdisciplinary Reviews: Computational Statistics
Volume: 3
Issue number: 1
DOI: 10.1002/wics.134
State: Published - Jan 2011
Externally published: Yes

ASJC Scopus subject areas

  • Statistics and Probability

Cite this

Discounting older data. / Caudle, Kyle A.; Fowler, Gary O.; Jager, Leah Ruth; Ruth, David M.

In: Wiley Interdisciplinary Reviews: Computational Statistics, Vol. 3, No. 1, 01.2011, p. 30-33.

@article{62b491f8b34e489b9a3e06dac549ea0d,
title = "Discounting older data",
abstract = "This article describes two general methods for discounting older data in the real-time analysis of a data stream. In the first method, the distribution of a data stream is estimated by a series of orthogonal basis functions, and the coefficients of this estimate are updated as new data arrive by combining windowing and exponential smoothing techniques. The second method involves sequential hypothesis testing. When new data arrive, test significance level is adjusted by alpha-investing, which raises or reduces the significance level of subsequent hypothesis tests on the basis of whether the previous hypothesis test rejects or fails to reject the null hypothesis. Both these methods are nonparametric in nature.",
author = "Caudle, {Kyle A.} and Fowler, {Gary O.} and Jager, {Leah Ruth} and Ruth, {David M.}",
year = "2011",
month = "1",
doi = "10.1002/wics.134",
language = "English (US)",
volume = "3",
pages = "30--33",
journal = "Wiley Interdisciplinary Reviews: Computational Statistics",
issn = "1939-5108",
publisher = "John Wiley and Sons Inc.",
number = "1",

}

TY  - JOUR
T1  - Discounting older data
AU  - Caudle, Kyle A.
AU  - Fowler, Gary O.
AU  - Jager, Leah Ruth
AU  - Ruth, David M.
PY  - 2011/1
Y1  - 2011/1
N2  - This article describes two general methods for discounting older data in the real-time analysis of a data stream. In the first method, the distribution of a data stream is estimated by a series of orthogonal basis functions, and the coefficients of this estimate are updated as new data arrive by combining windowing and exponential smoothing techniques. The second method involves sequential hypothesis testing. When new data arrive, test significance level is adjusted by alpha-investing, which raises or reduces the significance level of subsequent hypothesis tests on the basis of whether the previous hypothesis test rejects or fails to reject the null hypothesis. Both these methods are nonparametric in nature.
AB  - This article describes two general methods for discounting older data in the real-time analysis of a data stream. In the first method, the distribution of a data stream is estimated by a series of orthogonal basis functions, and the coefficients of this estimate are updated as new data arrive by combining windowing and exponential smoothing techniques. The second method involves sequential hypothesis testing. When new data arrive, test significance level is adjusted by alpha-investing, which raises or reduces the significance level of subsequent hypothesis tests on the basis of whether the previous hypothesis test rejects or fails to reject the null hypothesis. Both these methods are nonparametric in nature.
UR  - http://www.scopus.com/inward/record.url?scp=78651526428&partnerID=8YFLogxK
UR  - http://www.scopus.com/inward/citedby.url?scp=78651526428&partnerID=8YFLogxK
U2  - 10.1002/wics.134
DO  - 10.1002/wics.134
M3  - Article
AN  - SCOPUS:78651526428
VL  - 3
SP  - 30
EP  - 33
JO  - Wiley Interdisciplinary Reviews: Computational Statistics
JF  - Wiley Interdisciplinary Reviews: Computational Statistics
SN  - 1939-5108
IS  - 1
ER  - 