Inference on exponential families with mixture of prior distributions
A Bayesian analysis of natural exponential families with quadratic variance function is considered when there are several sources of prior information. The belief of each source is expressed as a conjugate prior distribution, and a mixture of these priors is used to represent a consensus of the sources. A unified framework with unknown mixture weights is presented. Firstly, a general procedure based on the Kullback-Leibler (K-L) distance is proposed to obtain the weights; its main advantage is that the weights can be calculated analytically, and expressions allowing a direct implementation for these families are given. Secondly, the experts' prior beliefs are calibrated with respect to the combined posterior belief by using K-L distances, and a straightforward Monte Carlo approach to estimate these distances is proposed. Finally, two illustrative examples show the ease of application of the proposed technique, as well as its usefulness in a Bayesian framework.
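As a rough illustration of the kind of computation the abstract describes, the sketch below combines two Gamma priors for a Poisson mean (a natural exponential family with quadratic variance function) into a mixture, updates it to the combined posterior, and estimates the K-L distance from each expert's prior to that posterior by Monte Carlo. The data, prior parameters, and initial weights are illustrative assumptions, not values or code from the paper.

```python
# Minimal sketch (assumed Gamma-Poisson setup, not the authors' code):
# two expert Gamma priors, a mixture prior, its conjugate posterior update,
# and Monte Carlo K-L distances from each expert's prior to the posterior.
import numpy as np
from scipy.stats import gamma
from scipy.special import gammaln, logsumexp

rng = np.random.default_rng(0)

# Observed Poisson counts (simulated here purely for illustration).
data = rng.poisson(lam=4.0, size=30)
n, S = data.size, data.sum()

# Expert priors: theta ~ Gamma(shape a_i, rate b_i), with initial weights.
experts = [(2.0, 1.0), (8.0, 1.0)]   # (a_i, b_i), hypothetical values
prior_w = np.array([0.5, 0.5])

def log_marginal(a, b):
    """Log marginal likelihood of the data under one Gamma(a, b) component
    (the factor prod(x_j!) is common to all components and omitted)."""
    return a * np.log(b) - gammaln(a) + gammaln(a + S) - (a + S) * np.log(b + n)

# Posterior mixture: components Gamma(a_i + S, b_i + n) with updated weights
# proportional to prior weight times component marginal likelihood.
log_m = np.array([log_marginal(a, b) for a, b in experts])
post_w = prior_w * np.exp(log_m - log_m.max())
post_w /= post_w.sum()
post_pars = [(a + S, b + n) for a, b in experts]

def log_posterior_density(theta):
    """Log density of the posterior mixture evaluated at theta."""
    comps = np.column_stack([
        np.log(w) + gamma.logpdf(theta, a, scale=1.0 / b)
        for w, (a, b) in zip(post_w, post_pars)
    ])
    return logsumexp(comps, axis=1)

# Monte Carlo K-L estimate: D(p_i || q) ~ mean of log p_i(theta) - log q(theta)
# over draws theta ~ p_i, where p_i is expert i's prior and q the posterior.
for i, (a, b) in enumerate(experts):
    theta = gamma.rvs(a, scale=1.0 / b, size=20000, random_state=rng)
    kl = np.mean(gamma.logpdf(theta, a, scale=1.0 / b) - log_posterior_density(theta))
    print(f"expert {i + 1}: estimated K-L distance to posterior = {kl:.3f}")
```

In this reading, a comparatively large estimated K-L distance would flag an expert whose prior is poorly calibrated with respect to the combined posterior belief, which is the role the abstract assigns to these distances.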
Year of publication: 2009
Authors: Rufo, M.J.; Martín, J.; Pérez, C.J.
Published in: Computational Statistics & Data Analysis. - Elsevier, ISSN 0167-9473. - Vol. 53 (2009), No. 9, p. 3271-3280
Publisher: Elsevier
Similar items by person
- New approaches to compute Bayes factor in finite mixture models / Rufo, M.J. (2010)
- Merging experts' opinions: A Bayesian hierarchical model with mixture of prior distributions / Rufo, M.J. (2010)