James-Stein estimators for time series regression models
The least squares (LS) estimator is the natural estimator of the coefficients of a Gaussian linear regression model. However, if the dimension of the coefficient vector is greater than 2 and the residuals are independent and identically distributed, this conventional estimator is not admissible. James and Stein [Estimation with quadratic loss, Proceedings of the Fourth Berkeley Symposium, vol. 1, 1961, pp. 361-379] proposed a shrinkage estimator (the James-Stein estimator) which improves upon the least squares estimator with respect to the mean squared error loss function. In this paper, we investigate the mean squared error of the James-Stein (JS) estimator for the regression coefficients when the residuals are generated from a Gaussian stationary process. We then give sufficient conditions under which the JS estimator improves upon the LS estimator; understanding how the dependence of the residuals influences the JS estimator is of particular interest. Numerical studies also illuminate some interesting features of the improvement. The results have potential applications in economics, engineering, and the natural sciences.
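As a rough illustration of the abstract's setting, and not taken from the paper itself, the following minimal Python sketch compares the empirical risk of the LS estimator with that of the classical positive-part James-Stein estimator (derived under i.i.d. residuals) when the residuals actually follow a Gaussian AR(1) process. All choices here (n, p, the AR coefficient phi, the true coefficients) are illustrative assumptions; the paper's own shrinkage conditions involve the regression spectrum and the residual spectral density matrix.

    import numpy as np

    rng = np.random.default_rng(0)
    n, p = 200, 5                       # sample size and coefficient dimension (p > 2)
    beta = np.full(p, 0.1)              # hypothetical true coefficients, close to zero
    X = rng.standard_normal((n, p))     # fixed design matrix for the experiment
    G = X.T @ X

    def ar1_noise(n, phi, rng):
        # Gaussian AR(1) residuals: a simple example of a Gaussian stationary process.
        u = np.empty(n)
        u[0] = rng.standard_normal() / np.sqrt(1.0 - phi**2)  # stationary start
        for t in range(1, n):
            u[t] = phi * u[t - 1] + rng.standard_normal()
        return u

    reps = 5000
    risk_ls = risk_js = 0.0
    c = (p - 2) / (n - p + 2)           # classical shrinkage constant from i.i.d. theory
    for _ in range(reps):
        y = X @ beta + ar1_noise(n, phi=0.5, rng=rng)
        b_ls = np.linalg.solve(G, X.T @ y)          # least squares estimate
        rss = np.sum((y - X @ b_ls) ** 2)           # residual sum of squares
        shrink = max(0.0, 1.0 - c * rss / (b_ls @ G @ b_ls))  # positive-part JS factor
        b_js = shrink * b_ls
        d_ls, d_js = b_ls - beta, b_js - beta
        risk_ls += d_ls @ G @ d_ls      # loss measured in the X'X metric
        risk_js += d_js @ G @ d_js
    print(f"empirical risk  LS: {risk_ls / reps:.3f}   JS: {risk_js / reps:.3f}")

Whether the JS risk comes out smaller here depends on the strength of the dependence (phi), which is exactly the kind of question the paper's sufficient conditions address.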
Year of publication: 2006
Authors: Senda, Motohiro; Taniguchi, Masanobu
Published in: Journal of Multivariate Analysis. - Elsevier, ISSN 0047-259X. - Vol. 97.2006, 9, p. 1984-1996
Publisher: Elsevier
Keywords: James-Stein estimator; Least squares estimator; Gaussian stationary process; Mean squared error; Time series regression model; Regression spectrum; Residual spectral density matrix
Similar items by person
- Taniguchi, Masanobu (1985)
- Asymptotic theory for the Durbin-Watson statistic under long-memory dependence / Nakamura, Shisei (1999)
- Valid Edgeworth expansions of M-estimators in regression models with weakly dependent residuals / Taniguchi, Masanobu (1996)