On the errors-in-variables problem for time series
The usual assumption in the classical errors-in-variables problem of independent measurement errors cannot necessarily be maintained when the data are time series; errors may be strongly serially correlated, possibly containing seasonal effects and trends. When it is possible to identify frequency bands over which the signal-to-noise ratio is large, an approximate solution to the errors-in-variables problem is to omit the remaining frequencies from a time series regression. We draw attention to the danger of "leakage" from the omitted frequencies, and show that the consequent bias can be reduced by means of tapering.
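The idea in the abstract can be sketched numerically. The following is a minimal illustration, not Robinson's actual estimator: the function names (`cosine_taper`, `band_limited_slope`), the Tukey-style taper, and the simulated example are all assumptions chosen to demonstrate the general technique of restricting a regression to high signal-to-noise frequencies and tapering to damp leakage from the omitted band.

```python
import numpy as np

def cosine_taper(n, p=0.1):
    # Tukey-style cosine taper (illustrative choice): smooth ramps over
    # the first and last p*n points, flat in the middle. Tapering damps
    # spectral leakage from frequencies outside the retained band.
    h = np.ones(n)
    m = int(p * n)
    ramp = 0.5 * (1.0 - np.cos(np.pi * (np.arange(m) + 0.5) / m))
    h[:m] = ramp
    h[n - m:] = ramp[::-1]
    return h

def band_limited_slope(x, y, band, taper=True):
    # Slope estimate for y = beta * x + error using only DFT ordinates
    # whose frequencies (cycles/sample) fall in `band`, i.e. the band
    # where the signal-to-noise ratio is assumed to be large.
    n = len(x)
    h = cosine_taper(n) if taper else np.ones(n)
    X = np.fft.rfft(h * (x - x.mean()))
    Y = np.fft.rfft(h * (y - y.mean()))
    f = np.fft.rfftfreq(n)
    keep = (f >= band[0]) & (f <= band[1])
    # Band-restricted analogue of least squares: sum of the
    # cross-periodogram over the sum of the x-periodogram.
    num = np.sum(np.conj(X[keep]) * Y[keep]).real
    den = np.sum(np.abs(X[keep]) ** 2)
    return num / den

# Demo (simulated data): a low-frequency latent regressor observed with
# a seasonal measurement error concentrated at 0.25 cycles/sample.
rng = np.random.default_rng(0)
t = np.arange(512)
xi = np.cos(2 * np.pi * 0.02 * t) + 0.5 * np.sin(2 * np.pi * 0.05 * t)
x_obs = xi + 0.8 * np.cos(2 * np.pi * 0.25 * t)   # contaminated regressor
y = 2.0 * xi + 0.1 * rng.standard_normal(t.size)  # true slope is 2.0
beta_band = band_limited_slope(x_obs, y, (0.0, 0.1))
```

In this setup the band-limited estimate stays close to the true slope of 2, while ordinary least squares on the contaminated regressor is attenuated toward zero by the measurement-error variance; dropping the taper reintroduces leakage from the omitted seasonal frequency into the retained band.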
Year of publication: 1986
Authors: Robinson, P. M.
Published in: Journal of Multivariate Analysis. - Elsevier, ISSN 0047-259X. - Vol. 19.1986, 2, p. 240-250
Publisher: Elsevier
Keywords: errors-in-variables; frequency domain regression; tapers; seasonality; trend
Similar items by person
- Estimation and forecasting for time series containing censored or missing observations (Robinson, P. M., 1980)
- The estimation of linear differential equations with constant coefficients (Robinson, P. M., 1976)
- Fourier estimation of continuous time models (Robinson, P. M., 1976)