AN INCREMENTAL GRADIENT(-PROJECTION) METHOD WITH MOMENTUM TERM AND ADAPTIVE STEPSIZE RULE

Authors
Citation
P. Tseng, AN INCREMENTAL GRADIENT(-PROJECTION) METHOD WITH MOMENTUM TERM AND ADAPTIVE STEPSIZE RULE, SIAM Journal on Optimization, 8(2), 1998, pp. 506-531
Citations number
23
Subject Categories
Mathematics
ISSN journal
1052-6234
Volume
8
Issue
2
Year of publication
1998
Pages
506 - 531
Database
ISI
SICI code
1052-6234(1998)8:2<506:AIGMWM>2.0.ZU;2-4
Abstract
We consider an incremental gradient method with momentum term for minimizing the sum of continuously differentiable functions. This method uses a new adaptive stepsize rule that decreases the stepsize whenever sufficient progress is not made. We show that if the gradients of the functions are bounded and Lipschitz continuous over a certain level set, then every cluster point of the iterates generated by the method is a stationary point. In addition, if the gradients of the functions have a certain growth property, then the method is either linearly convergent in some sense or the stepsizes are bounded away from zero. The new stepsize rule is much in the spirit of heuristic learning rules used in practice for training neural networks via backpropagation. As such, the new stepsize rule may suggest improvements on existing learning rules. Finally, extension of the method and the convergence results to constrained minimization is discussed, as are some implementation issues and numerical experience.
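The abstract describes the scheme only at a high level; a minimal Python sketch of that general idea (cycling through the component gradients with a momentum term, and shrinking the stepsize when an epoch makes insufficient progress) might look as follows. The parameter names, the shrink factor, and the progress test comparing successive total-gradient norms are illustrative assumptions, not the paper's actual rule.

```python
import numpy as np

def incremental_gradient_momentum(grads, x0, step0=0.1, momentum=0.5,
                                  shrink=0.5, max_epochs=200, tol=1e-8):
    """Illustrative sketch (not the paper's exact method): cycle through
    the component gradients, keep a momentum term, and halve the stepsize
    whenever an epoch fails to reduce the norm of the total gradient."""
    x = np.asarray(x0, dtype=float)
    step, d = step0, np.zeros_like(x)
    prev_norm = np.inf
    for _ in range(max_epochs):
        for g in grads:                      # one incremental pass over the f_i
            d = momentum * d - step * g(x)   # momentum term + gradient step
            x = x + d
        norm = np.linalg.norm(sum(g(x) for g in grads))
        if norm >= prev_norm:                # insufficient progress this epoch
            step *= shrink                   # adaptive stepsize decrease
        prev_norm = norm
        if norm < tol:
            break
    return x
```

For example, with component functions f_i(x) = (x - a_i)^2 the minimizer of the sum is the mean of the a_i, and the iterates settle near it as the stepsize shrinks.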