Modeling covariance matrices via partial autocorrelations
We study the role of partial autocorrelations in the reparameterization and parsimonious modeling of a covariance matrix. The work is motivated by, and tries to mimic, the phenomenal success of the partial autocorrelation function (PACF) in model formulation, in removing the positive-definiteness constraint on the autocorrelation function of a stationary time series, and in reparameterizing the stationarity-invertibility domain of ARMA models. It turns out that once an order is fixed among the variables of a general random vector, these properties continue to hold; they follow from establishing a one-to-one correspondence between a correlation matrix and its associated matrix of partial autocorrelations. Connections between the latter and the parameters of the modified Cholesky decomposition of a covariance matrix are discussed. Graphical tools similar to partial correlograms for model formulation, and various priors based on the partial autocorrelations, are proposed. We develop frequentist/Bayesian procedures for modeling correlation matrices, illustrate them using a real dataset, and explore their properties via simulations.
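The one-to-one correspondence mentioned in the abstract can be sketched numerically: given partial autocorrelations that are free to take any value in (-1, 1), invert the usual partial-correlation formula lag by lag (conditioning on the intermediate variables under the fixed ordering) to recover a valid correlation matrix. The following is a minimal illustrative sketch, not the authors' implementation; the function name `pacs_to_corr` is hypothetical.

```python
import numpy as np

def pacs_to_corr(pi):
    """Map a strict-upper-triangular matrix of partial autocorrelations
    (each entry in (-1, 1)) to a correlation matrix, assuming a fixed
    ordering of the variables 1, ..., p.  Illustrative sketch only."""
    p = pi.shape[0]
    R = np.eye(p)
    # Lag-1 partial autocorrelations are ordinary correlations.
    for i in range(p - 1):
        R[i, i + 1] = R[i + 1, i] = pi[i, i + 1]
    # Longer lags: invert the partial-correlation formula, conditioning
    # on the intermediate variables i+1, ..., j-1.
    for lag in range(2, p):
        for i in range(p - lag):
            j = i + lag
            mid = list(range(i + 1, j))
            R2inv = np.linalg.inv(R[np.ix_(mid, mid)])
            r1 = R[i, mid]          # correlations of variable i with the middle block
            r3 = R[j, mid]          # correlations of variable j with the middle block
            D = np.sqrt((1 - r1 @ R2inv @ r1) * (1 - r3 @ R2inv @ r3))
            R[i, j] = R[j, i] = r1 @ R2inv @ r3 + pi[i, j] * D
    return R
```

Because each partial autocorrelation is unconstrained in (-1, 1), any such input yields a positive-definite correlation matrix, which is the constraint-removal property the abstract highlights.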
Year of publication: 2009
Authors: Daniels, M.J.; Pourahmadi, M.
Published in: Journal of Multivariate Analysis. - Elsevier, ISSN 0047-259X. - Vol. 100.2009, 10, p. 2352-2363
Publisher: Elsevier
Keywords: Autoregressive parameters; Cholesky decomposition; Positive-definiteness constraint; Levinson-Durbin algorithm; Prediction variances; Uniform and reference priors; Markov chain Monte Carlo
Similar items by person
- Fully Bayesian inference under ignorable missingness in the presence of auxiliary covariates (Daniels, M.J., 2014)
- Wang, Y., (2013)
- Hogan, J.W., (2002)