Using a Markov Chain to Construct a Tractable Approximation of an Intractable Probability Distribution
Let π denote an intractable probability distribution that we would like to explore. Suppose that we have a positive recurrent, irreducible Markov chain that satisfies a minorization condition and has π as its invariant measure. We provide a method of using simulations from the Markov chain to construct a statistical estimate of π from which it is straightforward to sample. We show that this estimate is 'strongly consistent' in the sense that the total variation distance between the estimate and π converges to 0 almost surely as the number of simulations grows. Moreover, we use some recently developed asymptotic results to provide guidance as to how much simulation is necessary. Draws from the estimate can be used to approximate features of π or as intelligent starting values for the original Markov chain. We illustrate our methods with two examples. Copyright 2006 Board of the Foundation of the Scandinavian Journal of Statistics.
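The pipeline the abstract describes (simulate the chain, turn its output into an easy-to-sample estimate of π, then draw from that estimate) can be sketched generically. The sketch below does not reproduce the paper's minorization-based construction; a plain kernel density estimate stands in for the estimator, with a hypothetical two-mode target and a random-walk Metropolis chain, and all names and tuning parameters are illustrative.

```python
import math
import random

random.seed(0)

# Stand-in "intractable" target: an unnormalized two-component normal
# mixture, treated as known only up to its normalizing constant.
def log_pi(x):
    a = -0.5 * (x + 2.0) ** 2
    b = -0.5 * (x - 2.0) ** 2
    m = max(a, b)
    return m + math.log(math.exp(a - m) + math.exp(b - m))

# Random-walk Metropolis chain with pi as its invariant distribution.
def metropolis(n, x0=0.0, step=1.0):
    draws, x = [], x0
    for _ in range(n):
        y = x + step * random.gauss(0.0, 1.0)
        if math.log(random.random()) < log_pi(y) - log_pi(x):
            x = y  # accept the proposal
        draws.append(x)
    return draws

# A crude, easy-to-sample estimate of pi: a kernel density estimate built
# from the chain's output. Drawing from it means picking a stored state
# uniformly at random and perturbing it with kernel noise.
def sample_estimate(draws, size, bandwidth=0.3):
    return [random.choice(draws) + bandwidth * random.gauss(0.0, 1.0)
            for _ in range(size)]

chain = metropolis(20_000)          # simulations from the Markov chain
fresh = sample_estimate(chain, 5_000)  # i.i.d.-style draws from the estimate
```

The `fresh` draws can then be used in the two ways the abstract mentions: to approximate features of π (means, quantiles) or as dispersed starting values for new runs of the original chain.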
Year of publication: 2006
Authors: Hobert, James P.; Jones, Galin L.; Robert, Christian P.
Published in: Scandinavian Journal of Statistics. - Danish Society for Theoretical Statistics, ISSN 0303-6898. - Vol. 33.2006, 1, p. 37-51
Publisher: Danish Society for Theoretical Statistics; Finnish Statistical Society; Norwegian Statistical Association; Swedish Statistical Association
Similar items by person
- Hobert, James P., (2010)
- Eaton's Markov chain, its conjugate partner and P-admissibility / Hobert, James P., (1997)
- On perfect simulation for some mixtures of distributions / Hobert, James P., (1998)
- More ...