We present a novel training algorithm for a feed-forward neural network with a single hidden layer of nodes (i.e., two layers of connection weights). Our algorithm is capable of training networks for hard problems, such as the classic two-spirals problem. The weights in the first layer are determined using a quasirandom number generator. These weights are frozen; they are never modified during the training process. The second layer of weights is trained as a simple linear discriminator using methods such as the pseudoinverse, with possible iterations. We also study the problem of reducing the hidden layer: pruning low-weight nodes and a genetic-algorithm search for good subsets.
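The scheme above can be sketched as follows. This is a minimal illustration, not the paper's implementation: the Halton sequence stands in for an unspecified quasirandom generator, and the hidden-layer size, tanh activation, weight scale, and pruning threshold are all illustrative assumptions.

```python
import numpy as np

def halton(n, base):
    # 1-D Halton quasirandom sequence (a common quasirandom generator;
    # used here as a stand-in for the paper's unspecified generator)
    seq = np.zeros(n)
    for i in range(n):
        f, r, idx = 1.0, 0.0, i + 1
        while idx > 0:
            f /= base
            r += f * (idx % base)
            idx //= base
        seq[i] = r
    return seq

def two_spirals(n_per_class=97):
    # Classic two-spirals benchmark (standard construction)
    i = np.arange(n_per_class)
    r = 6.5 * (104 - i) / 104
    phi = i * np.pi / 16
    x1 = np.stack([r * np.cos(phi), r * np.sin(phi)], axis=1)
    X = np.vstack([x1, -x1])
    y = np.hstack([np.ones(n_per_class), -np.ones(n_per_class)])
    return X, y

X, y = two_spirals()
n_hidden = 200                    # hidden-layer size: an assumed value
s = 3.0                           # weight scale: an assumed value

# First-layer weights and biases from quasirandom sequences,
# mapped from [0, 1) to [-s, s]; these are frozen and never trained.
W = s * (np.stack([halton(n_hidden, 2), halton(n_hidden, 3)], axis=1) * 2 - 1)
b = s * (halton(n_hidden, 5) * 2 - 1)

H = np.tanh(X @ W.T + b)          # hidden activations (fixed features)
w_out = np.linalg.pinv(H) @ y     # second layer trained via pseudoinverse
acc = np.mean(np.sign(H @ w_out) == y)

# Reducing the hidden layer by pruning low-weight nodes: drop the quarter
# of hidden units with the smallest output weights, then retrain the readout.
keep = np.abs(w_out) > np.quantile(np.abs(w_out), 0.25)
H_p = H[:, keep]
w_p = np.linalg.pinv(H_p) @ y
acc_p = np.mean(np.sign(H_p @ w_p) == y)
```

Because only the output layer is trained, the pseudoinverse solves a linear least-squares problem in one step; the genetic-algorithm subset search mentioned above would likewise only ever refit this linear readout for each candidate subset of hidden nodes.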