Bayesian Multimodel Inference by RJMCMC: A Gibbs Sampling Approach
Bayesian multimodel inference treats a set of candidate models as the sample space of a latent categorical random variable, sampled once; the data at hand are modeled as having been generated according to the sampled model. Model selection and model averaging are based on the posterior probabilities for the model set. Reversible-jump Markov chain Monte Carlo (RJMCMC) extends ordinary MCMC methods to this meta-model. We describe a version of RJMCMC that intuitively represents the process as Gibbs sampling with alternating updates of a categorical variable M (for Model) and a "palette" of parameters ψ, from which any of the model-specific parameters can be calculated. Our representation makes plain how model-specific Monte Carlo outputs (analytical or numerical) can be post-processed to compute model weights or Bayes factors. We illustrate the procedure with several examples.
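The alternating-update scheme described in the abstract can be sketched in code. The following is a minimal, hypothetical Python illustration (not the authors' code): a toy comparison of two Normal-mean models in which the "palette" reduces to a single scalar mu, updated from its full conditional under the model that uses it and from a pseudo-prior (here taken equal to its prior) otherwise, after which the model indicator M is drawn from its categorical full conditional. All names and settings (loglik, prior_sd, the pseudo-prior choice) are illustrative assumptions, not taken from the paper.

```python
# Minimal sketch of Gibbs-style multimodel sampling for a toy problem:
#   y_i ~ Normal(mu, 1), with
#     Model 0: mu = 0 (no free parameter)
#     Model 1: mu ~ Normal(0, prior_sd^2)
# The "palette" here is the single scalar mu; Model 0 simply ignores it.
# When Model 0 is current, mu is refreshed from a pseudo-prior (here equal to
# its prior), so the mu-density terms cancel in the model-indicator update and
# the weights reduce to prior(M) x likelihood. With a different pseudo-prior,
# those density terms must be included in the weights.

import numpy as np

rng = np.random.default_rng(1)
y = rng.normal(0.3, 1.0, size=50)       # simulated data (true mean 0.3)
n, ybar = len(y), y.mean()
prior_sd = 3.0                          # prior sd for mu under Model 1
prior_m = np.array([0.5, 0.5])          # prior model probabilities

def loglik(model, mu):
    """Normal(mean, 1) log-likelihood, up to an additive constant."""
    mean = 0.0 if model == 0 else mu
    return -0.5 * np.sum((y - mean) ** 2)

draws_m = []
mu, model = 0.0, 0
for _ in range(20000):
    # 1) Update the palette (mu) given the current model.
    if model == 1:
        # Full conditional of mu under Model 1: conjugate Normal posterior.
        post_var = 1.0 / (n + 1.0 / prior_sd ** 2)
        mu = rng.normal(post_var * n * ybar, np.sqrt(post_var))
    else:
        # Model 0 does not use mu: draw it from its pseudo-prior.
        mu = rng.normal(0.0, prior_sd)
    # 2) Update the model indicator M from its full conditional given mu.
    logw = np.log(prior_m) + np.array([loglik(0, mu), loglik(1, mu)])
    w = np.exp(logw - logw.max())
    model = int(rng.choice(2, p=w / w.sum()))
    draws_m.append(model)

print("Posterior model probabilities:",
      np.bincount(draws_m, minlength=2) / len(draws_m))
```

The relative frequencies of the sampled indicator approximate the posterior model probabilities, and their ratio (divided by the prior odds) approximates the Bayes factor.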
Year of publication: 2013
Authors: Barker, Richard J.; Link, William A.
Published in: The American Statistician. - Taylor & Francis Journals, ISSN 0003-1305. - Vol. 67 (2013), No. 3, pp. 150-156
Publisher: Taylor & Francis Journals
Similar items by person
- Link, William A., (2005)
- Bayesian Multimodel Inference by RJMCMC: A Gibbs Sampling Approach / Barker, Richard J., (2013)
- Barker, Richard J., (1961)
- More ...