A family of estimators for multivariate kurtosis in a nonnormal linear regression model
In this paper, we propose a new estimator of kurtosis in a multivariate nonnormal linear regression model. The usual estimator is constructed from the arithmetic mean of the second power of the squared sample Mahalanobis distances between the observations and their fitted values. This estimator underestimates the kurtosis and has a large bias, even when the sample size is not small. We replace the squared distance with a transformed squared norm of the Studentized residual, using a monotonic increasing function. Our proposed estimator is defined as the arithmetic mean of the second power of these transformed squared norms, with a correction term and a tuning parameter. The correction term adjusts our estimator to be unbiased under normality, and the tuning parameter controls the sizes of the squared norms of the residuals. The resulting family of estimators includes estimators based on ordinary least squares and on predicted residuals. Numerical experiments verify that the bias of our new estimator is smaller than that of the usual estimator.
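The "usual" estimator described above can be sketched directly from its definition: fit the multivariate regression by ordinary least squares, form the squared sample Mahalanobis distances of the residuals, and average their second power. The following is a minimal illustration of that baseline estimator only (not the paper's proposed bias-corrected family); the function and variable names are ours, not the author's.

```python
import numpy as np

def usual_kurtosis_estimator(Y, X):
    """Mardia-type multivariate kurtosis estimate from OLS residuals.

    A sketch of the 'usual' estimator described in the abstract:
    the arithmetic mean of the second power of the squared sample
    Mahalanobis distances between observations and fitted values.
    Under normality its large-sample expectation is p*(p + 2),
    but in finite samples it is biased downward.
    """
    n, p = Y.shape
    # OLS fit: B_hat = (X'X)^{-1} X'Y, residual matrix E = Y - X B_hat
    B_hat, *_ = np.linalg.lstsq(X, Y, rcond=None)
    E = Y - X @ B_hat
    # Sample covariance of the residuals (normalized by n)
    S = E.T @ E / n
    # Squared Mahalanobis distances d_i^2 = e_i' S^{-1} e_i
    d2 = np.einsum('ij,jk,ik->i', E, np.linalg.inv(S), E)
    # Arithmetic mean of the second power of the squared distances
    return np.mean(d2 ** 2)
```

For multivariate normal errors with p = 2 the target value is p(p + 2) = 8, so the estimate from a large simulated sample should fall near 8, slightly below it on average because of the downward bias the paper addresses.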
Year of publication: 2007
Authors: Yanagihara, Hirokazu
Published in: Journal of Multivariate Analysis. - Elsevier, ISSN 0047-259X. - Vol. 98.2007, 1, p. 1-29
Publisher: Elsevier
Keywords: Bias correction; Hotelling's T2 distribution; Mahalanobis distance; Monotonic increasing function; Multivariate linear model; Nonnormality; Predicted residuals; Studentized residuals; Tuning parameter
Similar items by person
- Fujikoshi, Yasunori, (2003)
- Iterative Bias Correction of the Cross-Validation Criterion
  Yanagihara, Hirokazu, (2012)
- Consistency of high-dimensional AIC-type and Cp-type criteria in multivariate linear regression
  Fujikoshi, Yasunori, (2014)
- More ...