To avoid the local minima that trap back-propagation learning, we propose to treat feedforward neural network training as a global optimization problem. In particular, we apply branch-and-bound based Lipschitz optimization methods to neural network training and develop globally optimal training algorithms (GOTA). The standard criterion function of a feedforward neural network is Lipschitzian, so these methods apply directly. The effectiveness of GOTA is improved by using dynamically computed local Lipschitz constants over subsets of the weight space. Local search procedures, such as the classic back-propagation algorithm, can be incorporated into GOTA. The resulting local-search-augmented global algorithms improve the learning efficiency of GOTA while retaining its global convergence property.
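As a rough illustration of the branch-and-bound idea (not the paper's actual GOTA implementation), the sketch below minimizes a small network's squared-error criterion over a box in weight space. Each box contributes the lower bound f(c) - L_k * r, where c is the box center, r its half-diagonal, and L_k a Lipschitz constant local to that box; boxes whose bound cannot improve on the incumbent are pruned. The 1-2-1 network, the toy data, and the sampling-based Lipschitz estimate are assumptions made for this example (a sampled estimate can undershoot the true constant, whereas a valid certificate requires an analytically derived bound such as the paper uses).

```python
# Minimal, illustrative sketch of branch-and-bound Lipschitz minimization
# over a box in weight space -- NOT the authors' GOTA implementation.
import heapq
import itertools
import numpy as np

# Assumed toy training set and a tiny 1-2-1 tanh network with 7 weights.
X = np.array([[0.0], [0.5], [1.0]])
Y = np.array([0.0, 1.0, 0.0])

def criterion(w):
    """Sum-of-squared-errors criterion E(w); Lipschitz on a bounded box."""
    W1, b1 = w[0:2].reshape(2, 1), w[2:4]
    W2, b2 = w[4:6], w[6]
    h = np.tanh(X @ W1.T + b1)        # hidden activations, shape (3, 2)
    return float(np.sum((h @ W2 + b2 - Y) ** 2))

def local_lipschitz(lo, hi, n=64, rng=np.random.default_rng(0)):
    """Sampled estimate of a Lipschitz constant on the box [lo, hi].
    A real GOTA bound must be a valid analytic local constant; this
    stand-in is only for demonstration."""
    p = rng.uniform(lo, hi, size=(n, lo.size))
    v = np.array([criterion(q) for q in p])
    d = np.linalg.norm(p[:, None] - p[None, :], axis=-1)
    np.fill_diagonal(d, np.inf)       # ignore zero self-distances
    return float(np.max(np.abs(v[:, None] - v[None, :]) / d))

def bb_lipschitz(lo, hi, tol=1e-2, max_iter=500):
    tie = itertools.count()           # heap tie-breaker for equal bounds
    c = (lo + hi) / 2
    best_w, best_f = c, criterion(c)  # incumbent solution
    lb0 = best_f - local_lipschitz(lo, hi) * np.linalg.norm(hi - lo) / 2
    heap = [(lb0, next(tie), lo, hi)]
    for _ in range(max_iter):
        if not heap:
            break
        lb, _, lo_k, hi_k = heapq.heappop(heap)
        if best_f - lb <= tol:        # tol-global-optimality gap reached
            break
        axis = int(np.argmax(hi_k - lo_k))   # bisect the longest edge
        mid = (lo_k[axis] + hi_k[axis]) / 2
        for half in range(2):
            clo, chi = lo_k.copy(), hi_k.copy()
            (chi if half == 0 else clo)[axis] = mid
            cc = (clo + chi) / 2
            fc = criterion(cc)        # a BP run started from cc could
            if fc < best_f:           # tighten the incumbent here
                best_w, best_f = cc, fc
            r = np.linalg.norm(chi - clo) / 2   # box half-diagonal
            clb = fc - local_lipschitz(clo, chi) * r
            if clb < best_f - tol:    # prune boxes that cannot improve
                heapq.heappush(heap, (clb, next(tie), clo, chi))
    return best_w, best_f

w_star, f_star = bb_lipschitz(-2.0 * np.ones(7), 2.0 * np.ones(7))
print(f"best criterion value found: {f_star:.4f}")
```

The cost of pure branch-and-bound grows quickly with the number of weights, which is what motivates the two refinements summarized above: local Lipschitz constants computed per subregion give tighter lower bounds, and local search (back-propagation) started from box centers improves the incumbent faster, shrinking the optimality gap without sacrificing global convergence.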