While learning an unknown input-output task, humans first strive to understand the qualitative structure of the function. Accuracy of performance is then improved with practice. In contrast, existing neural network function approximators do not have an explicit means for abstracting the qualitative structure of a target function. To fill this gap, we introduce the concept of function emulation, according to which the central goal of training is to "emulate" the qualitative structure of the target function. The framework of catastrophe or singularity theory is used to characterize the qualitative structure of a smooth function, which is organized by the critical points of the function. The proposed scheme of function emulation uses the radial basis function network to realize a modular architecture wherein each module emulates the target function in the neighborhood of a critical point. The network size required to emulate the target in the neighborhood of a critical point is shown to be related to a certain complexity measure of that critical point. For a large class of smooth functions, the present scheme produces a graph-like abstraction of the target, thereby providing a qualitative representation of a quantitative input-output relation. (C) 1997 Elsevier Science Ltd.
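To illustrate the general idea of one module approximating the target near a critical point, the following is a minimal sketch, not the authors' implementation: it fits a small Gaussian radial basis function network, by linear least squares, to a one-dimensional cubic target in a neighborhood of one of its critical points. The target function, centers, widths, and neighborhood size are illustrative assumptions, not values from the paper.

```python
# Illustrative sketch only: a single Gaussian-RBF "module" fitted to a smooth
# target in the neighborhood of a critical point. Not the paper's method.
import numpy as np

def target(x):
    # Example smooth target f(x) = x^3 - 3x, with critical points at x = +/-1 (assumed example)
    return x**3 - 3 * x

def rbf_design_matrix(x, centers, width):
    # Gaussian basis functions phi_j(x) = exp(-(x - c_j)^2 / (2*width^2))
    return np.exp(-((x[:, None] - centers[None, :]) ** 2) / (2 * width**2))

def fit_rbf_module(x, y, centers, width):
    # Output weights found by linear least squares on the design matrix
    Phi = rbf_design_matrix(x, centers, width)
    weights, *_ = np.linalg.lstsq(Phi, y, rcond=None)
    return weights

def rbf_predict(x, centers, width, weights):
    return rbf_design_matrix(x, centers, width) @ weights

if __name__ == "__main__":
    critical_point = 1.0                       # local minimum of x^3 - 3x
    x = np.linspace(critical_point - 0.5, critical_point + 0.5, 200)  # local neighborhood
    y = target(x)
    centers = np.linspace(x.min(), x.max(), 7) # a handful of centers in the neighborhood
    width = 0.15
    w = fit_rbf_module(x, y, centers, width)
    err = np.max(np.abs(rbf_predict(x, centers, width, w) - y))
    print(f"max local approximation error near x={critical_point}: {err:.2e}")
```

In a modular scheme of the kind the abstract describes, one such local approximator would be associated with each critical point, and the number of basis functions needed by a module would be expected to grow with the complexity of the corresponding critical point.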