We have derived a new kind of neural network using normalized radial basis functions, ''RBF'', with the same classifying properties as one built from sigmoid functions. This equivalence is demonstrated mathematically. In addition, we show that the proposed network is equivalent to a Gaussian classifier. The network requires no learning time to build a classifier. It has been compared with well-known adaptive networks, such as backpropagation and linear combinations of generalized radial basis functions (GRBF's). Its adapted forms are presented to show how the classifying regions and the boundaries among the supplied examples are formed. This neural network can be made to have classifying properties identical to those of the nearest-neighbor classifier, ''NNC''. When there are many examples per class, fewer centers can be found using vector quantization, ''VQ'', techniques, as done in Kohonen's network. Finally, this neural system can also be used to approximate a smooth continuous function from sparse examples.
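
The core construction can be illustrated with a minimal sketch: a normalized-RBF network whose centers are simply the training examples, so that, as in the abstract, no learning phase is needed. This is an assumed illustration of the idea (the width parameter `sigma` and the per-class score summation are our choices), not the paper's exact formulation.

```python
import numpy as np

def normalized_rbf_classify(X_train, y_train, X_query, sigma=0.5):
    """Classify queries with a normalized-RBF network whose centers
    are the training examples themselves (no learning phase)."""
    # Squared distances from every query point to every center.
    d2 = ((X_query[:, None, :] - X_train[None, :, :]) ** 2).sum(-1)
    phi = np.exp(-d2 / (2.0 * sigma ** 2))      # RBF activations
    phi /= phi.sum(axis=1, keepdims=True)       # normalization step
    classes = np.unique(y_train)
    # Class score = sum of the normalized activations of that class's centers.
    scores = np.stack([phi[:, y_train == c].sum(1) for c in classes], axis=1)
    return classes[np.argmax(scores, axis=1)]

X_train = np.array([[0.0, 0.0], [0.0, 1.0], [5.0, 5.0], [5.0, 6.0]])
y_train = np.array([0, 0, 1, 1])
X_query = np.array([[0.0, 0.5], [5.0, 5.5]])
print(normalized_rbf_classify(X_train, y_train, X_query))
```

As `sigma` shrinks, the nearest center's activation dominates the normalized sum, which is consistent with the NNC-equivalence claimed above; replacing the raw training examples with VQ codebook centers reduces the number of centers when classes have many examples.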