Gradient methods for stochastic optimization in relative scale
Year of publication: [2024] ; Version 0.3.0
Authors: Nesterov, Jurij Evgenʹevič ; Rodomanov, Anton
Publisher: Louvain-la-Neuve : CORE
Subject: convex optimization | optimization in relative scale | gradient methods | randomization | convergence guarantees | eigenvalues | singular values | power method | Lanczos algorithm | mathematical programming | theory | stochastic process | algorithm
- Optimization methods for fully composite problems / Doikov, Nikita (2021)
- Computing B-stationary points of nonsmooth DC programs / Pang, Jong-Shi (2017)
- Briceño-Arias, Luis M. (2024)
- Greedy quasi-Newton methods with explicit superlinear convergence / Rodomanov, Anton (2020)
- Rates of superlinear convergence for classical quasi-Newton methods / Rodomanov, Anton (2020)
- New results on superlinear convergence of classical quasi-Newton methods / Rodomanov, Anton (2020)