The Global Convergence of Self-Scaling BFGS Algorithm with Nonmonotone Line Search for Unconstrained Nonconvex Optimization Problems

Citation
Yin, Hong Xia and Du, Dong Lei, The Global Convergence of Self-Scaling BFGS Algorithm with Nonmonotone Line Search for Unconstrained Nonconvex Optimization Problems, Acta Mathematica Sinica, English Series (Print), 23(12), 2007, pp. 1233-1240
ISSN journal
1439-8516
Volume
23
Issue
12
Year of publication
2007
Pages
1233 - 1240
Database
ACNP
Abstract
The self-scaling quasi-Newton method solves an unconstrained optimization problem by scaling the Hessian approximation matrix before it is updated at each iteration, so as to avoid possible large eigenvalues in the Hessian approximations of the objective function. It has been proved in the literature that this method is globally and superlinearly convergent when the objective function is convex (or even uniformly convex). We propose to solve unconstrained nonconvex optimization problems by a self-scaling BFGS algorithm with nonmonotone line search. Nonmonotone line search has been recognized in numerical practice as a competitive approach for solving large-scale nonlinear problems. We consider two different nonmonotone line search forms and study the global convergence of the corresponding nonmonotone self-scaling BFGS algorithms. We prove that, under a condition weaker than that in the literature, both forms of the self-scaling BFGS algorithm are globally convergent for unconstrained nonconvex optimization problems.
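For orientation, a minimal sketch of the two ingredients named in the abstract, assuming an Oren-Luenberger-type scaling factor and the Grippo-Lampariello-Lucidi form of nonmonotone line search; the exact scaling factor, line-search variants, and constants analyzed in the paper may differ from this illustrative form.

% Self-scaling BFGS update with scaling factor \tau_k (one common choice;
% the paper's precise \tau_k is an assumption here), where
% s_k = x_{k+1} - x_k and y_k = \nabla f(x_{k+1}) - \nabla f(x_k):
\[
  \tau_k = \frac{y_k^{\top} s_k}{s_k^{\top} B_k s_k}, \qquad
  B_{k+1} = \tau_k \left( B_k - \frac{B_k s_k s_k^{\top} B_k}{s_k^{\top} B_k s_k} \right)
            + \frac{y_k y_k^{\top}}{y_k^{\top} s_k}.
\]
% Nonmonotone (Grippo-Lampariello-Lucidi) Armijo condition on the step size
% \alpha_k along the quasi-Newton direction d_k = -B_k^{-1} \nabla f(x_k),
% which accepts steps relative to the maximum of the last few function values:
\[
  f(x_k + \alpha_k d_k) \le \max_{0 \le j \le m(k)} f(x_{k-j})
  + \delta \, \alpha_k \nabla f(x_k)^{\top} d_k, \qquad
  \delta \in (0,1), \quad 0 \le m(k) \le \min\{m(k-1)+1, M\}.
\]

Scaling by \tau_k keeps the eigenvalues of B_{k+1} from growing uncontrollably across iterations, while the nonmonotone rule allows occasional increases in f and thus longer steps than a monotone Armijo rule would accept.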