Minimax multivariate empirical Bayes estimators under multicollinearity
In this paper we consider the problem of estimating the matrix of regression coefficients in a multivariate linear regression model in which the design matrix is nearly singular. Under the assumption of normality, we propose empirical Bayes ridge regression estimators with three types of shrinkage functions: scalar, componentwise, and matricial shrinkage. These proposed estimators are proved to be uniformly better than the least squares estimator, that is, minimax in terms of risk under Strawderman's loss function. Simulation and empirical studies show that they are also useful in the presence of multicollinearity.
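The abstract describes ridge-type shrinkage for a multivariate linear model with a nearly singular design matrix. Below is a minimal illustrative sketch of a ridge regression estimator with a single, empirically chosen scalar shrinkage constant (a Hoerl-Kennard-Baldwin-style plug-in); it is only an assumption-laden stand-in for the idea of data-driven scalar shrinkage and does not reproduce the authors' estimators or their minimaxity result.

```python
import numpy as np

def ridge_estimator(X, Y, k):
    """Ridge estimate B_hat = (X'X + k I)^{-1} X'Y for the model Y = X B + E."""
    p = X.shape[1]
    return np.linalg.solve(X.T @ X + k * np.eye(p), X.T @ Y)

def empirical_scalar_shrinkage(X, Y):
    """A simple data-driven choice of the ridge constant k
    (Hoerl-Kennard-Baldwin-type plug-in; NOT the estimator of the paper)."""
    n, p = X.shape
    m = Y.shape[1]
    B_ls = np.linalg.lstsq(X, Y, rcond=None)[0]         # least squares estimate
    resid = Y - X @ B_ls
    sigma2_hat = np.sum(resid**2) / (m * (n - p))       # pooled residual variance
    return m * p * sigma2_hat / np.sum(B_ls**2)

# Example with a nearly collinear design matrix
rng = np.random.default_rng(0)
n, p, m = 50, 4, 3
X = rng.normal(size=(n, p))
X[:, 3] = X[:, 0] + 1e-3 * rng.normal(size=n)           # near singularity
B_true = rng.normal(size=(p, m))
Y = X @ B_true + 0.5 * rng.normal(size=(n, m))

k = empirical_scalar_shrinkage(X, Y)
B_ridge = ridge_estimator(X, Y, k)
```

With a nearly collinear design, the least squares estimate is highly variable, while the shrunken estimate is typically much closer to the true coefficient matrix; the paper establishes this dominance uniformly, in the minimax sense, for its proposed shrinkage functions.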
Year of publication: 2005
Authors: Srivastava, M. S.; Kubokawa, T.
Published in: Journal of Multivariate Analysis. - Elsevier, ISSN 0047-259X. - Vol. 93 (2005), No. 2, p. 394-416
Publisher: Elsevier
Keywords: Empirical Bayes estimator; Ridge regression estimator; Multicollinearity; Multivariate linear regression model; Multivariate normal distribution
Similar items by person
- Estimating the covariance matrix: a new approach / Kubokawa, T. (2003)
- Robust Improvement in Estimation of a Mean Matrix in an Elliptically Contoured Distribution / Kubokawa, T. (2001)
- Estimating Risk and the Mean Squared Error Matrix in Stein Estimation / Kubokawa, T. (2002)