LEARNING WITHOUT LOCAL MINIMA IN RADIAL BASIS FUNCTION NETWORKS

Citation
M. Bianchini et al., Learning without Local Minima in Radial Basis Function Networks, IEEE Transactions on Neural Networks, 6(3), 1995, pp. 749-756
Citations number
21
Subject Categories
Computer Application, Chemistry & Engineering; Engineering, Electrical & Electronic; Computer Science, Artificial Intelligence; Computer Science, Hardware & Architecture; Computer Science, Theory & Methods
ISSN journal
1045-9227
Volume
6
Issue
3
Year of publication
1995
Pages
749 - 756
Database
ISI
SICI code
1045-9227(1995)6:3<749:LWLMIR>2.0.ZU;2-W
Abstract
Learning from examples plays a central role in artificial neural networks. The success of many learning schemes is not guaranteed, however, since algorithms like backpropagation may get stuck in local minima, thus providing suboptimal solutions. For feedforward networks, the theoretical results reported in [5], [6], [15], and [20] show that optimal learning can be achieved provided that certain conditions on the network and the learning environment are met. A similar investigation is put forward in this paper for the case of networks using radial basis functions (RBF) [10], [14]. The analysis proposed in [6] is extended naturally under the assumption that the patterns of the learning environment are separable by hyperspheres. In that case, we prove that the attached cost function is local minima free with respect to all the weights. This provides us with some theoretical foundations for a massive application of RBF in pattern recognition.
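To make the setting concrete, below is a minimal sketch of the RBF architecture the abstract refers to, assuming Gaussian basis functions and a sum-of-squares cost; the function names and this particular formulation are illustrative assumptions, not taken from the paper. Each hidden unit responds to the distance between a pattern and its center, so a single unit's level sets are hyperspheres, which is the geometry behind the paper's separability assumption.

import numpy as np

def rbf_forward(X, centers, widths, weights):
    # X: (n, d) patterns; centers: (m, d); widths: (m,); weights: (m,)
    # Squared Euclidean distances between every pattern and every center, shape (n, m)
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=-1)
    phi = np.exp(-d2 / (2.0 * widths ** 2))  # Gaussian radial activations
    return phi @ weights                     # linear output layer

def quadratic_cost(X, y, centers, widths, weights):
    # Sum-of-squares error: the kind of attached cost function whose
    # local-minima structure the paper analyzes (our formulation is an
    # assumption about the details).
    err = rbf_forward(X, centers, widths, weights) - y
    return 0.5 * (err ** 2).sum()

On data separable by hyperspheres (every pattern of one class inside some sphere, the rest outside), the paper's result says a cost of this kind is free of local minima with respect to all the weights, so gradient-based training is not trapped in suboptimal stationary points.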