A general notion of universal consistency of nonparametric estimators
is introduced that applies to regression estimation, conditional median
estimation, curve fitting, pattern recognition, and learning concepts.
General methods for proving consistency of estimators based on
minimizing the empirical error are shown. In particular,
distribution-free almost sure consistency of neural network estimates
and generalized linear estimators is established.
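The estimators discussed above are obtained by empirical error minimization: among a class of candidate functions, pick the one with the smallest average loss on the observed sample. A minimal sketch of this selection rule, with a hypothetical finite candidate class and squared-error loss (the class, data, and loss here are illustrative assumptions, not taken from the paper):

```python
def empirical_risk(f, samples):
    """Average squared error of candidate f over the sample set."""
    return sum((f(x) - y) ** 2 for x, y in samples) / len(samples)

def erm(candidates, samples):
    """Empirical risk minimization: return the candidate with smallest
    empirical error on the data."""
    return min(candidates, key=lambda f: empirical_risk(f, samples))

# Hypothetical regression data from y = 2x, with a small class of
# linear candidates x -> a*x (a is the only free parameter here).
samples = [(x / 10, 2 * x / 10) for x in range(10)]
candidates = [lambda x, a=a: a * x for a in (0.5, 1.0, 2.0, 3.0)]

best = erm(candidates, samples)
print(best(1.0))  # the a = 2.0 candidate fits the data exactly
```

Consistency results of the kind stated in the abstract concern what happens to such a minimizer as the sample size grows and the candidate class is allowed to expand.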