NONLINEAR RESCALING AND PROXIMAL-LIKE METHODS IN CONVEX OPTIMIZATION

Citation
R. Polyak and M. Teboulle, NONLINEAR RESCALING AND PROXIMAL-LIKE METHODS IN CONVEX OPTIMIZATION, Mathematical Programming, 76(2), 1997, pp. 265-284
Citations number
24
Subject Categories
"Operations Research & Management Science", Mathematics, "Computer Science, Software, Graphics, Programming"
Journal title
Mathematical Programming
ISSN journal
00255610
Volume
76
Issue
2
Year of publication
1997
Pages
265 - 284
Database
ISI
SICI code
0025-5610(1997)76:2<265:NRAPMI>2.0.ZU;2-H
Abstract
The nonlinear rescaling principle (NRP) consists of transforming the objective function and/or the constraints of a given constrained optimization problem into another problem which is equivalent to the original one in the sense that their optimal sets of solutions coincide. A nonlinear transformation, parameterized by a positive scalar parameter and based on a smooth scaling function, is used to transform the constraints. The methods based on the NRP consist of sequential unconstrained minimization of the classical Lagrangian for the equivalent problem, followed by an explicit formula updating the Lagrange multipliers. We first show that the NRP leads naturally to proximal methods with an entropy-like kernel, which is defined by the conjugate of the scaling function, and establish that the two methods are dually equivalent for convex constrained minimization problems. We then study the convergence properties of the nonlinear rescaling algorithm and of the corresponding entropy-like proximal methods for convex constrained optimization problems. Special cases of the nonlinear rescaling algorithm are presented; in particular, a new class of exponential penalty-modified barrier function methods is introduced.
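The following is a minimal illustrative sketch (not the authors' code) of the scheme outlined in the abstract: unconstrained minimization of the classical Lagrangian of the rescaled problem, followed by an explicit multiplier update. It assumes the exponential scaling function psi(t) = 1 - exp(-t), corresponding to the exponential penalty-modified barrier case mentioned above, and the standard update lambda_i <- lambda_i * psi'(k * c_i(x)); the example problem, the fixed parameter k, and the tolerances are illustrative choices.

```python
# Sketch of a nonlinear-rescaling (exponential multiplier) iteration on a toy
# convex problem; problem data and parameters are illustrative assumptions.
import numpy as np
from scipy.optimize import minimize

# Example convex problem: min (x1-2)^2 + (x2-1)^2  s.t.  c_i(x) >= 0
f = lambda x: (x[0] - 2.0) ** 2 + (x[1] - 1.0) ** 2
constraints = [
    lambda x: 1.0 - x[0] - x[1],   # x1 + x2 <= 1
    lambda x: x[0],                # x1 >= 0
    lambda x: x[1],                # x2 >= 0
]

def rescaled_lagrangian(x, lam, k):
    # Classical Lagrangian of the equivalent problem obtained with
    # psi(t) = 1 - exp(-t):
    #   L_k(x, lam) = f(x) + (1/k) * sum_i lam_i * (exp(-k c_i(x)) - 1)
    return f(x) + sum(l * (np.exp(-k * c(x)) - 1.0)
                      for l, c in zip(lam, constraints)) / k

def nonlinear_rescaling(x0, k=5.0, iters=50, tol=1e-8):
    x, lam = np.asarray(x0, float), np.ones(len(constraints))
    for _ in range(iters):
        # Inner step: unconstrained minimization of the rescaled Lagrangian.
        x_new = minimize(rescaled_lagrangian, x, args=(lam, k), method="BFGS").x
        # Outer step: explicit multiplier update lam_i <- lam_i * psi'(k c_i(x)).
        lam = lam * np.exp(-k * np.array([c(x_new) for c in constraints]))
        if np.linalg.norm(x_new - x) < tol:
            return x_new, lam
        x = x_new
    return x, lam

x_star, lam_star = nonlinear_rescaling([0.0, 0.0])
print(x_star)   # approximately [1, 0] for this example
```

In this sketch the scalar parameter k stays fixed across outer iterations, so progress comes entirely from the multiplier updates, which is the feature that links the method to the entropy-like proximal iteration on the dual described in the abstract.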