Consider an optimization problem whose objective function is an integral containing the solution of a system of ordinary differential equations. Suppose that efficient optimization methods are available, as well as efficient methods for initial value problems for ordinary differential equations. The main purpose of this paper is to show how these methods can be applied efficiently to the problem under consideration. First, general procedures for the evaluation of gradients and Hessian matrices are described. Furthermore, a new efficient Gauss-Newton-like approximation of the Hessian matrix is derived for the special case in which the objective function is an integral of squares. This approximation is used to derive a Gauss-Newton-like trust region method, for which global and superlinear convergence properties are proved. Finally, several optimization methods are proposed, and computational experiments illustrating their efficiency are presented.
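For concreteness, the problem class can be written in the following form (a formulation assumed here for illustration; the paper's exact notation may differ):

    \min_{x \in \mathbb{R}^n} \; F(x) = \int_{t_a}^{t_b} f\bigl(t, y(t; x), x\bigr)\, dt,
    \qquad \dot{y}(t) = g\bigl(t, y(t), x\bigr), \quad y(t_a) = y_a(x).

In the integral-of-squares special case, F(x) = \frac{1}{2} \int_{t_a}^{t_b} f(t, y(t; x), x)^2\, dt, and a Gauss-Newton-like approximation of the Hessian keeps only the first-order term,

    G(x) \approx \int_{t_a}^{t_b} J(t)\, J(t)^T\, dt,
    \qquad J(t) = \nabla_x f = f_x + \Bigl(\frac{\partial y}{\partial x}\Bigr)^{\!T} f_y,

dropping the term containing the residual f itself, which vanishes at a zero-residual solution; the sensitivities \partial y / \partial x can be obtained by integrating the sensitivity equations alongside the state.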
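The sketch below illustrates this machinery numerically on a toy one-parameter problem (the test problem, all function names, and the use of numpy/scipy are assumptions made here, not taken from the paper): the state equation is augmented with the sensitivity equation and with quadrature states for the objective, its gradient, and the Gauss-Newton term, and a simple Gauss-Newton trust region iteration is run on top.

    import numpy as np
    from scipy.integrate import solve_ivp

    # Toy problem (hypothetical, not from the paper): choose the decay rate x in
    # y'(t) = -x * y(t), y(0) = 1, so that y matches exp(-2 t) on [0, 1].
    # Residual f(t, y, x) = y - exp(-2 t); objective F(x) = 1/2 * int_0^1 f^2 dt.

    def augmented_rhs(t, z, x):
        """z = [y, s, F, grad, G]: state y, sensitivity s = dy/dx, and running
        integrals of the objective, the gradient, and the Gauss-Newton term."""
        y, s = z[0], z[1]
        f = y - np.exp(-2.0 * t)        # residual
        J = s                           # total derivative df/dx (f_y = 1, f_x = 0 here)
        return [-x * y,                 # state equation  y' = g(t, y, x)
                -x * s - y,             # sensitivity     s' = g_y * s + g_x
                0.5 * f * f,            # objective integrand
                f * J,                  # gradient integrand
                J * J]                  # Gauss-Newton integrand

    def evaluate(x):
        """Integrate the augmented system once; return F(x), F'(x), G(x)."""
        sol = solve_ivp(augmented_rhs, (0.0, 1.0), [1.0, 0.0, 0.0, 0.0, 0.0],
                        args=(x,), rtol=1e-10, atol=1e-12)
        return sol.y[2, -1], sol.y[3, -1], sol.y[4, -1]

    # Gauss-Newton trust region loop (scalar case): the trial step is the
    # Gauss-Newton step clipped to the trust radius; the radius is updated
    # from the ratio of actual to predicted reduction.
    x, radius = 0.5, 0.25
    F, grad, G = evaluate(x)
    for k in range(30):
        if abs(grad) < 1e-10:
            break
        d = float(np.clip(-grad / G, -radius, radius))  # model minimizer on |d| <= radius
        pred = -(grad * d + 0.5 * G * d * d)            # predicted reduction (> 0)
        F_new, grad_new, G_new = evaluate(x + d)
        rho = (F - F_new) / pred                        # actual vs. predicted reduction
        if rho > 0.1:                                   # accept the step
            x, F, grad, G = x + d, F_new, grad_new, G_new
        radius *= 2.0 if rho > 0.75 else (0.5 if rho < 0.25 else 1.0)
        print(f"iter {k}: x = {x:.8f}, F = {F:.3e}, radius = {radius:.3f}")

In the n-parameter case the clipped division becomes the (possibly constrained) solution of the linear system G d = -\nabla F inside the trust region, which is where the convergence theory summarized above applies; for large n, the gradient could equally be obtained by the adjoint method instead of the forward sensitivities used in this sketch.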