In this paper, we develop new subgradient methods for solving nonsmooth convex optimization problems. These methods are the first for which the whole sequence of test points is endowed with worst-case performance guarantees. The new methods are derived from a relaxed estimating...
Persistent link: https://www.econbiz.de/10010927696
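For context, the classical subgradient scheme that such methods build on can be sketched as follows. This is a minimal textbook version with diminishing step sizes and best-point tracking, not the paper's new method; the test function and all names are illustrative:

```python
import numpy as np

def subgradient_method(f, subgrad, x0, steps=2000):
    """Classical subgradient method with diminishing steps h_k = 1/sqrt(k+1).

    Plain subgradient iterates need not decrease f monotonically on
    nonsmooth problems, so we also track the best point seen so far.
    """
    x = np.asarray(x0, dtype=float)
    best_x, best_f = x.copy(), f(x)
    for k in range(steps):
        h = 1.0 / np.sqrt(k + 1)        # diminishing step size
        x = x - h * subgrad(x)
        fx = f(x)
        if fx < best_f:
            best_x, best_f = x.copy(), fx
    return best_x, best_f

# Example: f(x) = |x - 3| + |x + 1| attains its minimum value 4 on [-1, 3]
f = lambda x: abs(x[0] - 3.0) + abs(x[0] + 1.0)
g = lambda x: np.array([np.sign(x[0] - 3.0) + np.sign(x[0] + 1.0)])
x_star, f_star = subgradient_method(f, g, [10.0])
```

In this classical scheme only the best point seen so far carries a guarantee; the contribution described above is precisely to endow the whole sequence of test points with worst-case bounds.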
In this paper we suggest a new framework for constructing mathematical models of market activity. Contrary to the majority of classical economic models (e.g. Arrow-Debreu, Walras, etc.), we obtain a characterization of the general equilibrium of the market as a saddle point in a convex-concave...
Persistent link: https://www.econbiz.de/10010752813
In this paper we propose a new interior-point method, which is based on an extension of the ideas of self-scaled optimization to the general case. We suggest using the primal correction process to find a scaling point. This point is used to compute a strictly feasible primal-dual pair by simple...
Persistent link: https://www.econbiz.de/10005042857
In many applications it is possible to justify a reasonable bound on the possible variation of the subgradients of the objective function, rather than on their uniform magnitude. In this paper we develop a new class of efficient primal-dual subgradient schemes for such problem classes.
Persistent link: https://www.econbiz.de/10005043014
In this paper we present a new approach for constructing subgradient schemes for different types of nonsmooth problems with convex structure. Our methods are primal-dual since they are always able to generate a feasible approximation to the optimum of an appropriately formulated dual problem....
Persistent link: https://www.econbiz.de/10005043237
In this paper we derive efficiency estimates of the regularized Newton's method as applied to constrained convex minimization problems and to variational inequalities. We study a one-step Newton's method and its multistep accelerated version, which converges on smooth convex problems as O(1/k^3)...
Persistent link: https://www.econbiz.de/10005043350
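The basic regularized Newton step that such estimates concern can be illustrated on an unconstrained convex quadratic. This is a minimal sketch assuming a simple Tikhonov-type regularization (H + gamma*I); the test problem and the value gamma = 0.1 are illustrative, not taken from the paper:

```python
import numpy as np

def regularized_newton_step(grad, hess, x, gamma):
    """One regularized Newton step: solve (H + gamma*I) d = -g for the
    direction d, which is well defined even when H is singular."""
    g, H = grad(x), hess(x)
    d = np.linalg.solve(H + gamma * np.eye(len(x)), -g)
    return x + d

# Smooth convex test problem f(x) = 0.5*x^T A x - b^T x, A positive definite
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
grad = lambda x: A @ x - b
hess = lambda x: A

x = np.zeros(2)
for _ in range(50):
    x = regularized_newton_step(grad, hess, x, gamma=0.1)
```

On this quadratic each step contracts the error by the factor gamma / (lambda_min(A) + gamma), so the iterates converge to the minimizer A^{-1} b.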
In this paper we analyze several new methods for solving optimization problems with the objective function formed as a sum of two convex terms: one is smooth and given by a black-box oracle, and another is general but simple and its structure is known. Despite the bad properties of the sum,...
Persistent link: https://www.econbiz.de/10005008277
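A standard way to exploit this composite structure is forward-backward (proximal gradient) splitting: a gradient step on the smooth black-box term followed by a proximal step on the simple term. Below is a minimal sketch for the familiar special case 0.5*||Ax - b||^2 + lam*||x||_1; the data are illustrative, and the paper's actual methods (including any accelerated variants) differ:

```python
import numpy as np

def prox_l1(v, t):
    """Proximal operator of t*||.||_1 (componentwise soft thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def proximal_gradient(A, b, lam, steps=500):
    """Minimize 0.5*||A x - b||^2 + lam*||x||_1 by forward-backward steps."""
    L = np.linalg.norm(A, 2) ** 2           # Lipschitz constant of the gradient
    x = np.zeros(A.shape[1])
    for _ in range(steps):
        grad = A.T @ (A @ x - b)            # forward step on the smooth term
        x = prox_l1(x - grad / L, lam / L)  # backward step on the simple term
    return x

# Tiny illustrative problem; the optimality conditions give x* = (1.8, 0)
A = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
b = np.array([2.0, 0.1, 2.1])
x = proximal_gradient(A, b, lam=0.5)
```

The point of the composite view is that the nonsmooth term is handled exactly through its prox operator, so the scheme keeps the convergence rate of smooth minimization despite the bad properties of the sum.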
In this paper we propose a new approach for constructing efficient schemes for nonsmooth convex optimization. It is based on a special smoothing technique, which can be applied to functions with an explicit max-structure. Our approach can be considered as an alternative to black-box...
Persistent link: https://www.econbiz.de/10005008345
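The flavor of such smoothing can be seen on f(x) = max_i x_i, which has the explicit max-structure mentioned above: the log-sum-exp function with parameter mu is a smooth uniform approximation. A minimal sketch of this entropy-based special case (the paper's technique is more general and may use a different prox-function):

```python
import numpy as np

def smooth_max(x, mu):
    """Log-sum-exp smoothing of f(x) = max_i x_i.

    Satisfies f(x) <= f_mu(x) <= f(x) + mu*log(n), and f_mu has a
    Lipschitz-continuous gradient (the softmax) with constant 1/mu.
    """
    m = x.max()                                   # shift for numerical stability
    return m + mu * np.log(np.exp((x - m) / mu).sum())

x = np.array([1.0, 2.0, 3.0])
approx = smooth_max(x, 0.1)   # close to max(x) = 3, within 0.1*log(3)
```

Smaller mu gives a tighter approximation but a larger gradient Lipschitz constant; balancing this trade-off against the iteration count is what yields the improved complexity of smoothing-based schemes.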
In this paper we propose an accelerated version of the cubic regularization of Newton's method [6]. The original version, used for minimizing a convex function with Lipschitz-continuous Hessian, guarantees a global rate of convergence of order O(1/k^2), where k is the iteration counter. Our...
Persistent link: https://www.econbiz.de/10005065351
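In one dimension the cubic-regularized model can be minimized in closed form, which makes the basic (non-accelerated) step easy to sketch. A minimal illustration on f(x) = x^4, whose second derivative is 24-Lipschitz on [-1, 1]; the accelerated multistep scheme described above is more involved:

```python
import math

def cubic_newton_step(g, H, M):
    """Exact minimizer of the 1-D cubic model
        m(h) = g*h + 0.5*H*h**2 + (M/6)*abs(h)**3,   H >= 0, M > 0.
    Setting m'(h) = 0 with h = -sign(g)*r, r >= 0, gives the quadratic
        H*r + (M/2)*r**2 = |g|  in the step length r.
    """
    if g == 0.0:
        return 0.0
    r = (-H + math.sqrt(H * H + 2.0 * M * abs(g))) / M
    return -math.copysign(r, g)

# Minimize f(x) = x**4 starting from x = 1 (M = 24 bounds |f'''| on [-1, 1])
f_grad = lambda x: 4.0 * x**3
f_hess = lambda x: 12.0 * x**2
x = 1.0
for _ in range(100):
    x += cubic_newton_step(f_grad(x), f_hess(x), M=24.0)
```

Unlike the pure Newton step, the cubic model is an upper bound on f whenever M dominates the Hessian's Lipschitz constant, which is what makes the global O(1/k^2) guarantee of the original method possible.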
Persistent link: https://www.econbiz.de/10011992624