Learning from examples plays a central role in artificial neural networks. The success of many learning schemes is not guaranteed, however, since algorithms like backpropagation may get stuck in local minima, thus providing suboptimal solutions. For feedforward networks, the theoretical results reported in [5], [6], [15], and [20] show that optimal learning can be achieved provided that certain conditions on the network and the learning environment are met. A similar investigation is put forward in this paper for the case of networks using radial basis functions (RBF) [10], [14]. The analysis proposed in [6] is extended naturally under the assumption that the patterns of the learning environment are separable by hyperspheres. In that case, we prove that the attached cost function is free of local minima with respect to all the weights. This provides some theoretical foundation for the widespread application of RBF networks in pattern recognition.
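To make the key assumption concrete, the following minimal sketch (not taken from the paper; all names and values are illustrative) shows a single Gaussian RBF unit acting on a learning environment whose two classes are separable by a hypersphere: points inside the sphere produce a high response, points outside produce a low one, so one unit plus a threshold separates them.

```python
import numpy as np

def gaussian_rbf(x, center, width):
    """Gaussian radial basis function: the response depends only on
    the distance from x to the center, so each unit responds inside
    a hypersphere in input space."""
    return np.exp(-np.sum((x - center) ** 2) / (2.0 * width ** 2))

# Hypothetical 2-D learning environment: class A lies inside the unit
# sphere around the origin, class B lies outside it, so the two
# classes are separable by a hypersphere.
center = np.zeros(2)
inside = np.array([0.1, 0.2])    # class A pattern
outside = np.array([2.0, 2.0])   # class B pattern

r_in = gaussian_rbf(inside, center, width=1.0)
r_out = gaussian_rbf(outside, center, width=1.0)

# Thresholding the unit's response at 0.5 separates the classes.
print(r_in > 0.5, r_out < 0.5)  # True True
```

Under the paper's hypersphere-separability assumption, it is for environments of this kind that the cost function is shown to have no local minima.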