On-line expectation-maximization algorithm for latent data models
We propose a generic on-line (also sometimes called adaptive or recursive) version of the expectation-maximization (EM) algorithm applicable to latent variable models of independent observations. Compared with the algorithm of Titterington, this approach is more directly connected to the usual EM algorithm and does not rely on integration with respect to the complete-data distribution. The resulting algorithm is usually simpler and is shown to achieve convergence to the stationary points of the Kullback-Leibler divergence between the marginal distribution of the observation and the model distribution at the optimal rate, i.e. that of the maximum likelihood estimator. In addition, the approach proposed is also suitable for conditional (or regression) models, as illustrated in the case of the mixture of linear regressions model. Copyright (c) 2009 Royal Statistical Society.
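The abstract's recipe, in the exponential-family case, amounts to a stochastic-approximation update of the running expected complete-data sufficient statistics followed by the usual M-step map. A minimal sketch for a toy one-dimensional Gaussian mixture is given below; the function name, the step-size schedule gamma_n = n^(-alpha), and the burn-in are illustrative assumptions, not the paper's exact prescription, which covers general latent-variable models:

```python
import numpy as np

def online_em_gmm(ys, mu, sigma2, w, alpha=0.6, burn_in=20):
    """Hedged sketch of online EM for a 1-D Gaussian mixture.

    Each observation triggers an E-step (posterior responsibilities),
    a stochastic-approximation update of the running sufficient
    statistics, and, after a short burn-in, the standard M-step map.
    """
    mu, sigma2, w = map(np.asarray, (mu, sigma2, w))
    mu, sigma2, w = mu.astype(float), sigma2.astype(float), w.astype(float)
    # Initialise the running sufficient statistics from the starting parameters
    s0, s1, s2 = w.copy(), w * mu, w * (sigma2 + mu**2)
    for n, y in enumerate(ys, start=1):
        # E-step: posterior responsibilities of the mixture components
        logp = -0.5 * ((y - mu) ** 2 / sigma2 + np.log(sigma2)) + np.log(w)
        r = np.exp(logp - logp.max())
        r /= r.sum()
        # Stochastic-approximation update with decreasing step size
        gamma = n ** (-alpha)
        s0 += gamma * (r - s0)
        s1 += gamma * (r * y - s1)
        s2 += gamma * (r * y**2 - s2)
        if n > burn_in:
            # M-step map applied to the current averaged statistics
            w = s0 / s0.sum()
            mu = s1 / s0
            sigma2 = np.maximum(s2 / s0 - mu**2, 1e-6)
    return w, mu, sigma2
```

Run on a stream drawn from a two-component mixture, the estimates drift toward the data-generating parameters at the rate discussed in the abstract; the decreasing step size is what trades off adaptation against averaging.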
Year of publication: 2009
Authors: Cappé, Olivier; Moulines, Eric
Published in: Journal of the Royal Statistical Society Series B. - Royal Statistical Society - RSS, ISSN 1369-7412. - Vol. 71.2009, 3, p. 593-613
Publisher: Royal Statistical Society - RSS
Similar items by person
-
Reversible jump, birth-and-death and more general continuous time Markov chain Monte Carlo samplers
Cappé, Olivier, (2003)
-
General - Markov Chain Monte Carlo: 10 Years and Still Running!
Cappé, Olivier, (2000)