A note on some algorithms for the Gibbs posterior
Jiang and Tanner (2008) consider a method of classification using the Gibbs posterior which is directly constructed from the empirical classification errors. They propose an algorithm to sample from the Gibbs posterior which utilizes a smoothed approximation of the empirical classification error, via a Gibbs sampler with augmented latent variables. In this paper, we note some drawbacks of this algorithm and propose an alternative method for sampling from the Gibbs posterior, based on the Metropolis algorithm. The numerical performance of the algorithms is examined and compared via simulated data. We find that the Metropolis algorithm produces good classification results at an improved speed of computation.
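The abstract's key point is that the Gibbs posterior is proportional to exp(-λ·n·R_n(θ)) times a prior, where R_n(θ) is the empirical 0-1 classification error; since R_n is discontinuous in θ, a Metropolis sampler is attractive because it only needs pointwise evaluations of the target. The following is a minimal sketch of that idea, not the authors' implementation: a random-walk Metropolis sampler for a linear classifier on simulated data, with the temperature λ, the Gaussian prior, the step size, and all data-generating choices being illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated binary-classification data (hypothetical setup).
n = 200
X = rng.normal(size=(n, 2))
theta_true = np.array([1.5, -1.0])
y = (X @ theta_true + 0.3 * rng.normal(size=n) > 0).astype(int)

def empirical_error(theta):
    """Empirical 0-1 classification error R_n(theta) of the linear rule."""
    pred = (X @ theta > 0).astype(int)
    return np.mean(pred != y)

def log_gibbs_posterior(theta, lam=20.0):
    """log of exp(-lam * n * R_n(theta)) * N(0, I) prior, up to a constant.

    lam (the temperature) and the standard-normal prior are assumptions
    made for this sketch, not values from the paper.
    """
    return -lam * n * empirical_error(theta) - 0.5 * theta @ theta

def metropolis(n_iter=5000, step=0.3):
    """Random-walk Metropolis targeting the Gibbs posterior.

    Only evaluations of the (discontinuous) target are needed,
    so no smoothing of the 0-1 loss is required.
    """
    theta = np.zeros(2)
    logp = log_gibbs_posterior(theta)
    samples = []
    for _ in range(n_iter):
        prop = theta + step * rng.normal(size=2)
        logp_prop = log_gibbs_posterior(prop)
        # Accept with probability min(1, posterior ratio).
        if np.log(rng.uniform()) < logp_prop - logp:
            theta, logp = prop, logp_prop
        samples.append(theta)
    return np.array(samples)

samples = metropolis()
theta_hat = samples[1000:].mean(axis=0)  # posterior mean after burn-in
print("error of posterior-mean classifier:", empirical_error(theta_hat))
```

The contrast with the Gibbs-sampler approach discussed in the abstract is that no smoothed surrogate for the empirical error and no augmented latent variables are introduced; the trade-off is that a random-walk proposal must be tuned (here the step size is fixed by hand).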
Year of publication: 2010
Authors: Chen, Kun; Jiang, Wenxin; Tanner, Martin A.
Published in: Statistics & Probability Letters. - Elsevier, ISSN 0167-7152. - Vol. 80.2010, 15-16, p. 1234-1241
Publisher: Elsevier
Similar items by person
-
General inequalities for Gibbs posterior with nonadditive empirical risk
Li, Cheng, (2014)
-
Risk minimization for time series binary choice with variable selection
Jiang, Wenxin, (2010)