Who's afraid of reduced-rank parameterizations of multivariate models? Theory and example
Reduced-rank restrictions can add useful parsimony to the coefficient matrices of multivariate models, but their use is limited by the daunting complexity of the methods and their theory. The present work takes the easy road, focusing on unifying themes and simplified methods. For Gaussian and non-Gaussian (GLM, GAM, mixed normal, etc.) multivariate models, we give a unified, explicit theory for the general asymptotic (normal) distribution of maximum likelihood estimators (MLE). The MLE can be complex and computationally hard, but we show a strong asymptotic equivalence between the MLE and a relatively simple minimum (Mahalanobis) distance estimator. The latter method yields particularly simple tests of rank, and we describe its asymptotic behavior in detail. We also examine the method's performance in simulation and via analytical and empirical examples.
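The core idea of a reduced-rank restriction can be illustrated with a minimal sketch. The example below (an assumption for illustration only, not the paper's minimum-Mahalanobis-distance estimator) simulates a multivariate regression whose true coefficient matrix has low rank, fits unrestricted least squares, and then imposes the rank restriction by truncating the SVD of the estimate:

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate Y = X B + E where the true p x q coefficient
# matrix B has reduced rank r = 1.
n, p, q, r = 200, 4, 3, 1
X = rng.standard_normal((n, p))
B_true = rng.standard_normal((p, r)) @ rng.standard_normal((r, q))
Y = X @ B_true + 0.1 * rng.standard_normal((n, q))

# Unrestricted (full-rank) least-squares estimate of B.
B_ols, *_ = np.linalg.lstsq(X, Y, rcond=None)

# Impose the rank restriction by keeping only the r largest
# singular values of B_ols. This is a plain Frobenius-norm
# projection, a simpler stand-in for the weighted
# (Mahalanobis) distance minimization studied in the paper.
U, s, Vt = np.linalg.svd(B_ols, full_matrices=False)
B_rr = U[:, :r] * s[:r] @ Vt[:r, :]

print(np.linalg.matrix_rank(B_rr))    # rank after truncation
print(np.linalg.norm(B_rr - B_true))  # distance to the truth
```

With many observations and little noise, the truncated estimate recovers the true low-rank matrix closely; the gap between this unweighted projection and a Mahalanobis-weighted one is exactly where the paper's asymptotic theory operates.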
Year of publication: 2006
Authors: Gilbert, Scott; Zemčík, Petr
Published in: Journal of Multivariate Analysis. - Elsevier, ISSN 0047-259X. - Vol. 97 (2006), No. 4, pp. 925-945
Publisher: Elsevier
Keywords: Multivariate model; Regression; Coefficient matrix; Reduced-rank; Estimation; Test; Asymptotic theory
Similar items by person
- Testing for Latent Factors in Models with Autocorrelation and Heteroskedasticity of Unknown Form (Gilbert, Scott, 2005)
- Do house prices reflect fundamentals? Aggregate and panel data evidence (Mikhed, Vyacheslav, 2009)
- Testing the distribution of error components in panel data models (Gilbert, Scott, 2002)