The class of mapping networks is a general family of tools used to perform a wide variety of tasks; however, no unifying framework exists to describe their theoretical and practical properties. This paper presents a standardized, uniform representation for this class of networks, and introduces a simple modification of the multilayer perceptron with interesting practical properties, especially well suited to pattern classification tasks. The proposed model unifies the two main representation paradigms found in the class of mapping networks for classification, namely, the surface-based and the prototype-based schemes, while remaining trainable by backpropagation. The enhancements in representation power and generalization performance are assessed through results on the worst-case requirement in terms of hidden units, and on the Vapnik-Chervonenkis dimension and Cover capacity. The theoretical properties of the network also suggest that the proposed modification to the multilayer perceptron is in many senses optimal. A number of experimental verifications confirm the theoretical results on the model's improved performance, as compared with the multilayer perceptron and the Gaussian radial basis function network.
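The abstract does not spell out the modification itself. As one hedged illustration of how a single hidden unit could unify the surface-based (hyperplane) and prototype-based (distance-to-prototype) schemes, consider a perceptron-style unit whose input is augmented with the squared norm of the pattern: depending on the weight given to that extra feature, the zero-level decision surface is either a hyperplane or a hypersphere. This sketch is an assumption for illustration, not necessarily the paper's exact construction; the function name `augmented_unit` and its parameters are hypothetical.

```python
import numpy as np

def augmented_unit(x, w, w_sq, b):
    """Pre-activation of a hidden unit on the input x augmented with ||x||^2.

    With w_sq == 0 the decision surface w.x + b = 0 is a hyperplane, as in
    a standard multilayer-perceptron unit; with w_sq != 0, completing the
    square turns the boundary into a hypersphere, i.e. a prototype-based
    (distance) scheme. Training remains ordinary backpropagation, since the
    unit is still an affine map of (x, ||x||^2).
    """
    return x @ w + w_sq * np.dot(x, x) + b

# Hyperplane case (w_sq = 0): boundary x0 + x1 = 1; the point below lies on it.
print(augmented_unit(np.array([0.5, 0.5]), np.array([1.0, 1.0]), 0.0, -1.0))

# Hypersphere case (w_sq = 1): choosing w = -2c and b = ||c||^2 - r^2 gives
# the boundary ||x - c||^2 = r^2 with center c and radius r.
c, r = np.array([1.0, 0.0]), 0.5
print(augmented_unit(np.array([1.5, 0.0]), -2 * c, 1.0, c @ c - r**2))
```

Both printed values are zero, confirming that the two test points lie exactly on the hyperplane and hypersphere boundaries, respectively; intermediate values of `w_sq` interpolate between the two representation paradigms.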