DETERMINISTIC GLOBAL OPTIMAL FNN TRAINING ALGORITHMS

Citation
Z.Y. Tang and G.J. Koehler, DETERMINISTIC GLOBAL OPTIMAL FNN TRAINING ALGORITHMS, Neural Networks, 7(2), 1994, pp. 301-311
Number of citations
28
Subject categories
Mathematical Methods, Biology & Medicine; Computer Sciences, Special Topics; Computer Science, Artificial Intelligence; Neurosciences; Physics, Applied
Journal title
Neural Networks
ISSN journal
0893-6080
Volume
7
Issue
2
Year of publication
1994
Pages
301 - 311
Database
ISI
SICI code
0893-6080(1994)7:2<301:DGOFTA>2.0.ZU;2-T
Abstract
To avoid local minimum solutions in back propagation learning, we propose to treat feedforward neural network training as a global optimization problem. In particular, we consider using branch-and-bound based Lipschitz optimization methods in neural network training, and develop globally optimal training algorithms (GOTA). The standard criterion function of a feedforward neural network is Lipschitzian. The effectiveness of GOTA is improved by using dynamically computed local Lipschitz constants over subsets of the weight space. Local search procedures, such as the classic back propagation algorithm, can be incorporated in GOTA. The local search-augmented global algorithms improve the learning efficiency of GOTA while retaining the globally convergent property.
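
Note (illustrative sketch)
As a rough illustration of the branch-and-bound Lipschitz approach the abstract describes, the Python sketch below minimizes a generic Lipschitz objective over a box by pruning sub-boxes whose Lipschitz lower bound cannot beat the incumbent. It is not the authors' GOTA: the function name lipschitz_bnb, the single global constant L, and the toy Rastrigin objective are assumptions made here for illustration; the paper instead computes local Lipschitz constants dynamically over subsets of the weight space, and can refine incumbents with back propagation as the local search.

import heapq
import itertools
import math

def lipschitz_bnb(f, lo, hi, L, tol=1e-3, max_iter=50000):
    """Branch-and-bound minimization of an L-Lipschitz f over the box [lo, hi].

    On a sub-box with center c and half-diagonal r, Lipschitz continuity
    gives f(x) >= f(c) - L*r, which is the lower bound used for pruning.
    """
    tie = itertools.count()  # tie-breaker so the heap never compares lists

    def bound(blo, bhi):
        c = [(a + b) / 2.0 for a, b in zip(blo, bhi)]
        r = 0.5 * math.sqrt(sum((b - a) ** 2 for a, b in zip(blo, bhi)))
        fc = f(c)
        return fc - L * r, fc, c

    lb, best_val, best_x = bound(lo, hi)
    heap = [(lb, next(tie), list(lo), list(hi))]

    for _ in range(max_iter):
        if not heap or best_val - heap[0][0] <= tol:
            break  # gap between incumbent and best lower bound is closed
        _, _, blo, bhi = heapq.heappop(heap)
        k = max(range(len(blo)), key=lambda i: bhi[i] - blo[i])  # longest edge
        mid = 0.5 * (blo[k] + bhi[k])
        for clo, chi in ((blo, bhi[:k] + [mid] + bhi[k + 1:]),
                         (blo[:k] + [mid] + blo[k + 1:], bhi)):
            clb, cfc, cc = bound(clo, chi)
            if cfc < best_val:              # new incumbent; a local search such
                best_val, best_x = cfc, cc  # as back propagation could refine it
            if clb < best_val - tol:        # keep only boxes that might improve
                heapq.heappush(heap, (clb, next(tie), clo, chi))
    return best_val, best_x

if __name__ == "__main__":
    # Toy multimodal stand-in for an FNN criterion function.
    def rastrigin(x):
        return sum(xi * xi - 10.0 * math.cos(2.0 * math.pi * xi) + 10.0
                   for xi in x)
    val, x = lipschitz_bnb(rastrigin, [-2.0, -2.0], [2.0, 2.0], L=100.0)
    print("global minimum approx:", val, "at", x)

Replacing the fixed L with a constant computed per sub-box, as the paper proposes, tightens the bound f(c) - L*r and prunes far more aggressively, which is the efficiency gain the abstract attributes to local Lipschitz constants.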