Constructive algorithms have proved to be powerful methods for training feedforward neural networks. An important property of these algorithms is generalization. A series of empirical studies was performed to examine the effect of regularization on generalization in constructive cascade algorithms. It was found that the combination of early stopping and regularization resulted in better generalization than the use of early stopping alone. A cubic penalty term that greatly penalizes large weights was shown to be beneficial for generalization in cascade networks. An adaptive method of setting the regularization magnitude in constructive algorithms was introduced and shown to produce generalization results similar to those obtained with a fixed, user-optimized regularization setting. This adaptive method also resulted in the construction of smaller networks for more complex problems. The acasper algorithm, which incorporates the insights obtained from the empirical studies, was shown to have good generalization and network construction properties. This algorithm was compared to the cascade correlation algorithm on the Proben 1 and additional regression data sets.
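The cubic penalty term referred to above can be sketched as follows. This is an illustrative implementation only: the exact functional form, the regularization magnitude `lam`, and how the penalty is combined with the training error are assumptions, not the paper's precise formulation.

```python
import numpy as np

def cubic_penalty(weights, lam=1e-3):
    """Cubic weight penalty: lam * sum(|w|^3).

    Grows much faster than quadratic weight decay for |w| > 1,
    so large weights are penalized heavily while small weights
    are left nearly untouched.
    """
    w = np.asarray(weights, dtype=float)
    return lam * np.sum(np.abs(w) ** 3)

def cubic_penalty_grad(weights, lam=1e-3):
    """Gradient of the cubic penalty: d/dw [lam*|w|^3] = 3*lam*w*|w|."""
    w = np.asarray(weights, dtype=float)
    return 3.0 * lam * w * np.abs(w)

# The regularized objective would then be, schematically:
#   loss = training_error + cubic_penalty(weights, lam)
# with cubic_penalty_grad added to the error gradient during training.
```

Compared with the usual quadratic decay, the cubic term leaves weights below 1 in magnitude almost unconstrained while sharply suppressing large ones, which matches the abstract's description of "greatly penalizing large weights".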