Convex Optimization, Shape Constraints, Compound Decisions, and Empirical Bayes Rules
Estimation of mixture densities for the classical Gaussian compound decision problem and their associated (empirical) Bayes rules is considered from two new perspectives. The first, motivated by Brown and Greenshtein, introduces a nonparametric maximum likelihood estimator of the mixture density subject to a monotonicity constraint on the resulting Bayes rule. The second, motivated by Jiang and Zhang, proposes a new approach to computing the Kiefer-Wolfowitz nonparametric maximum likelihood estimator for mixtures. In contrast to prior methods for these problems, our new approaches are cast as convex optimization problems that can be efficiently solved by modern interior point methods. In particular, we show that the reformulation of the Kiefer-Wolfowitz estimator as a convex optimization problem reduces the computational effort by several orders of magnitude for typical problems, by comparison to prior EM-algorithm based methods, and thus greatly expands the practical applicability of the resulting methods. Our new procedures are compared with several existing empirical Bayes methods in simulations employing the well-established design of Johnstone and Silverman. Some further comparisons are made based on prediction of baseball batting averages. A Bernoulli mixture application is briefly considered in the penultimate section.
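The Kiefer-Wolfowitz estimator mentioned in the abstract can be illustrated with a small sketch: observations y_i ~ N(theta_i, 1), a mixing distribution approximated on a fixed grid of support points, and the mixture log-likelihood maximized over the grid weights, which is a convex problem. The sketch below uses SciPy's general-purpose SLSQP solver rather than the specialized interior point methods the paper advocates, and the grid size, data-generating mixture, and variable names are illustrative assumptions, not the authors' implementation.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

# Simulated compound decision data: theta_i drawn from a two-point
# mixture (an assumed toy design), observed with N(0, 1) noise.
rng = np.random.default_rng(0)
theta_true = rng.choice([0.0, 3.0], size=200)
y = theta_true + rng.standard_normal(200)

# Discretize the mixing distribution on a grid of support points.
grid = np.linspace(y.min(), y.max(), 50)
A = norm.pdf(y[:, None] - grid[None, :])  # A[i, j] = phi(y_i - grid_j)

def negloglik(w):
    # Negative mixture log-likelihood; convex in the weights w.
    return -np.mean(np.log(A @ w + 1e-12))

cons = ({'type': 'eq', 'fun': lambda w: w.sum() - 1.0},)
w0 = np.full(grid.size, 1.0 / grid.size)
res = minimize(negloglik, w0, method='SLSQP',
               bounds=[(0.0, None)] * grid.size, constraints=cons)
w = np.clip(res.x, 0.0, None)
w /= w.sum()

# Plug-in empirical Bayes rule: posterior mean of theta given y
# under the estimated mixing distribution.
f = A @ w                                # estimated marginal density
post_mean = (A * grid[None, :]) @ w / f  # E[theta | y_i]
```

The posterior mean computed in the last line is the (empirical) Bayes rule the abstract refers to; the paper's contribution is that replacing a generic solver like the one above with an interior point method on the convex program makes this computation fast enough for routine use.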
Year of publication: 2014
Authors: Koenker, Roger; Mizera, Ivan
Published in: Journal of the American Statistical Association. - Taylor & Francis Journals, ISSN 0162-1459. - Vol. 109.2014, 506, p. 674-685
Publisher: Taylor & Francis Journals
Similar items by person
- Shape constrained density estimation via penalized Rényi divergence / Koenker, Roger (2018)
- What Do Kernel Density Estimators Optimize? / Koenker, Roger (2012)
- Penalized triograms: total variation regularization for bivariate smoothing / Koenker, Roger (2004)