For general Bayes decision rules, perceptron approximations based on sufficient-statistic inputs are considered. Particular attention is paid to Bayes discrimination and classification. In the case of exponentially distributed data with a known model, it is shown that a perceptron with one hidden layer is sufficient, and learning is restricted to the synaptic weights of the output neuron. If only the dimension of the exponential model is known, the number of hidden layers increases by one, and the synaptic weights of the neurons in both hidden layers must also be learned.
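To make the known-model case concrete, the following sketch (an illustration under assumed distributions, not the paper's construction) uses two one-dimensional Gaussian classes, an exponential family whose sufficient statistic is T(x) = (x, x²). The "hidden layer" computes T(x) with fixed weights, and only the output neuron's synaptic weights are trained, by gradient descent on the logistic loss. The class parameters, learning rate, and iteration count are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Two 1-D Gaussian classes (exponential family, known model form).
# The log-likelihood ratio is linear in T(x) = (x, x^2), so a single
# output neuron on top of a fixed hidden layer computing T(x) can
# represent the Bayes discriminant.
n = 2000
x0 = rng.normal(0.0, 1.0, n)   # class 0: N(0, 1)        (assumed)
x1 = rng.normal(2.0, 1.5, n)   # class 1: N(2, 1.5^2)    (assumed)
x = np.concatenate([x0, x1])
y = np.concatenate([np.zeros(n), np.ones(n)])

def hidden(x):
    """Fixed hidden layer: outputs the sufficient statistic T(x)."""
    return np.stack([x, x**2], axis=1)

T = hidden(x)

# Learn only the output neuron's weights: logistic regression on T(x).
w = np.zeros(2)
b = 0.0
lr = 0.05
for _ in range(2000):
    z = np.clip(T @ w + b, -30.0, 30.0)   # clip to avoid overflow in exp
    p = 1.0 / (1.0 + np.exp(-z))
    w -= lr * (T.T @ (p - y)) / len(y)
    b -= lr * np.mean(p - y)

z = np.clip(T @ w + b, -30.0, 30.0)
pred = (1.0 / (1.0 + np.exp(-z)) > 0.5).astype(float)
accuracy = np.mean(pred == y)
print(f"accuracy = {accuracy:.3f}")
```

Because the hidden layer is fixed at the sufficient statistic, only the two output weights and the bias are adapted; in the case where only the dimension of the exponential model is known, the statistic itself would also have to be learned, requiring an extra hidden layer.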