Delay time in sequential detection of change
We consider a sequential test procedure that detects possible changes in the mean of observations satisfying a weak invariance principle. Our test statistic is based on weighted CUSUMs of the underlying random variables. In this paper, we study the asymptotic behaviour of the delay time when a change has occurred in the sample after a training period of size m during which the observations stay in control. It turns out that, in this situation, the limiting distribution of the delay time as m → ∞ is normal under a suitable standardization, provided the change occurred sufficiently soon after m.
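The abstract describes closed-end monitoring with a weighted CUSUM detector after an in-control training period of size m. The following is a minimal illustrative sketch of such a scheme; the boundary function g(t) = (1 + t)(t/(1 + t))^γ, the threshold c, the tuning parameter γ, and all numerical values below are assumptions chosen for illustration, not the paper's exact specification.

```python
import numpy as np

def monitor(x_train, x_stream, c=2.0, gamma=0.25):
    """Sequential CUSUM monitoring sketch.

    x_train: in-control training sample of size m
    x_stream: observations arriving after time m
    Returns the delay k (first alarm time after m) or None if no alarm.
    """
    m = len(x_train)
    mu_hat = x_train.mean()               # in-control mean estimate
    sigma_hat = x_train.std(ddof=1)       # in-control scale estimate
    cusum = 0.0
    for k, x in enumerate(x_stream, start=1):
        cusum += x - mu_hat               # CUSUM of deviations from training mean
        t = k / m
        # illustrative weight/boundary function g(t) = (1 + t) * (t / (1 + t))**gamma
        g = (1.0 + t) * (t / (1.0 + t)) ** gamma
        if abs(cusum) > c * sigma_hat * np.sqrt(m) * g:
            return k                      # alarm: change detected with delay k
    return None

# Training period of size m = 200 in control, then a mean shift of size 1.0
rng = np.random.default_rng(0)
train = rng.normal(0.0, 1.0, size=200)
stream = rng.normal(1.0, 1.0, size=500)   # change occurs immediately after m
delay = monitor(train, stream)
```

With a shift of one standard deviation immediately after the training period, the detector crosses the √m-scaled boundary after a short delay; the paper studies precisely how this delay, suitably standardized, behaves as m → ∞.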
Year of publication: 2004
Authors: Aue, Alexander; Horváth, Lajos
Published in: Statistics & Probability Letters. - Elsevier, ISSN 0167-7152. - Vol. 67.2004, 3, p. 221-231
Publisher: Elsevier
Keywords: Change-point estimation; Sequential procedure; Wiener process; Partial sums; Invariance; CUSUM; Drift
Similar items by person
- ON DISTINGUISHING BETWEEN RANDOM WALK AND CHANGE IN THE MEAN ALTERNATIVES — Aue, Alexander, (2009)
- Limit Laws in Transaction-Level Asset Price Models — Aue, Alexander, (2014)
- Segmenting mean-nonstationary time series via trending regressions — Aue, Alexander, (2012)