A.N. Iusem et al., MULTIPLICATIVE INTERIOR GRADIENT METHODS FOR MINIMIZATION OVER THE NONNEGATIVE ORTHANT, SIAM Journal on Control and Optimization, 34(1), 1996, pp. 389-406
We introduce a new class of multiplicative iterative methods for solving minimization problems over the nonnegative orthant. The algorithm is akin to a natural extension of gradient methods for unconstrained minimization to the case of nonnegativity constraints, with the special feature that it generates a sequence of iterates which remain in the interior of the nonnegative orthant. We prove that the algorithm, combined with an appropriate line search, is weakly convergent to a saddle point of the minimization problem when the minimand is a differentiable function with bounded level sets. If the function is convex, then weak convergence to an optimal solution is obtained. Moreover, by using an appropriate regularized line search, we prove that the level-set boundedness hypothesis can be removed, and full convergence of the iterates to an optimal solution is established in the convex case.
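The key structural feature described above — rescaling each coordinate by a positive factor derived from the gradient, so that iterates started in the interior never leave the positive orthant — can be illustrated with a minimal sketch. The update rule, step size, and test problem below are illustrative assumptions (an exponentiated-gradient-style variant), not the exact algorithm or line search of the paper:

```python
# Sketch of a multiplicative interior gradient step: each coordinate is
# multiplied by a strictly positive factor exp(-eta * grad_j), so an
# iterate with all-positive entries stays strictly positive. This is an
# assumed illustrative variant, not the paper's precise update/line search.
import math

def multiplicative_gradient_descent(grad, x0, eta=0.1, iters=500):
    """Minimize f over x >= 0 via x_j <- x_j * exp(-eta * grad_f(x)_j)."""
    x = list(x0)
    for _ in range(iters):
        g = grad(x)
        x = [xj * math.exp(-eta * gj) for xj, gj in zip(x, g)]
    return x

# Hypothetical test problem: f(x) = 0.5 * sum((x_j - c_j)^2) with
# c = (1, -1); the minimizer over the nonnegative orthant is (1, 0).
c = [1.0, -1.0]
grad_f = lambda x: [xj - cj for xj, cj in zip(x, c)]

x_star = multiplicative_gradient_descent(grad_f, [0.5, 0.5])
# x_star approaches (1, 0) from the interior: the second coordinate is
# driven toward zero multiplicatively but remains strictly positive.
```

Note how the nonnegativity constraint is enforced implicitly by the multiplicative form of the step, rather than by projection, which matches the interior-point character of the methods described in the abstract.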