This paper informs a statistical readership about Artificial Neural Networks (ANNs), points out some of the links with statistical methodology and encourages cross-disciplinary research in the directions most likely to bear fruit. The areas of statistical interest are briefly outlined, and a series of examples indicates the flavor of ANN models. We then treat various topics in more depth. In each case, we describe the neural network architectures and training rules and provide a statistical commentary. The topics treated in this way are perceptrons (from single-unit to multilayer versions), Hopfield-type recurrent networks (including probabilistic versions strongly related to statistical physics and Gibbs distributions) and associative memory networks trained by so-called unsupervised learning rules. Perceptrons are shown to have strong associations with discriminant analysis and regression, and unsupervised networks with cluster analysis. The paper concludes with some thoughts on the future of the interface between neural networks and statistics.