A Brief Survey of Modern Optimization for Statisticians
Summary: Modern computational statistics is turning more and more to high-dimensional optimization to handle the deluge of big data. Once a model is formulated, its parameters can be estimated by optimization. Because model parsimony is important, models routinely include non-differentiable penalty terms such as the lasso. This sober reality complicates minimization and maximization. Our broad survey stresses a few important principles in algorithm design. Rather than view these principles in isolation, it is more productive to mix and match them. A few well-chosen examples illustrate this point. Algorithm derivation is also emphasized, and theory is downplayed, particularly the abstractions of the convex calculus. Thus, our survey should be useful and accessible to a broad audience.
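The abstract notes that non-differentiable penalties such as the lasso complicate minimization. The standard device for handling them (a general technique, not one specific to this survey) is the proximal operator of the l1 penalty, which has a closed-form soft-thresholding solution. A minimal sketch in Python; the function name is illustrative:

```python
import numpy as np

def soft_threshold(x, lam):
    """Proximal operator of the lasso penalty lam * ||z||_1:
    argmin_z 0.5 * ||z - x||^2 + lam * ||z||_1, solved elementwise.
    Coefficients smaller than lam in magnitude are set exactly to zero."""
    return np.sign(x) * np.maximum(np.abs(x) - lam, 0.0)

# Shrink a coefficient vector; the small middle entry vanishes exactly.
x = np.array([3.0, -0.5, 1.2])
print(soft_threshold(x, 1.0))
```

This exact sparsity-inducing behavior is why lasso-type penalties appear in parsimonious models, and why algorithms built around proximal maps can sidestep the non-differentiability of the penalty.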
Year of publication: 2014
Authors: Lange, Kenneth; Chi, Eric C.; Zhou, Hua
Published in: International Statistical Review. - International Statistical Institute (ISI), ISSN 0306-7734. - Vol. 82.2014, 1, p. 46-70
Publisher: International Statistical Institute (ISI)
Similar items by person
- Lange, Kenneth, (2014)
- Stable estimation of a covariance matrix guided by nuclear norm penalties / Chi, Eric C., (2014)
- Rating Movies and Rating the Raters Who Rate Them / Zhou, Hua, (2009)
- More ...