We calculate the generalization error epsilon of a "student" perceptron J, trained by a teacher perceptron T, on input patterns S that form a fixed angle arccos(J.S) with the student. We show that the error is reduced from a power-law to an exponentially fast decay by rejecting input patterns that lie within a given neighbourhood of the decision boundary J.S = 0. On the other hand, the error versus rejection curve epsilon(rho), where rho is the fraction of rejected patterns, is shown to be independent of the training scheme employed to construct the student perceptron. We give a simple argument indicating that the small-rho behavior observed for the perceptron, epsilon(rho) = epsilon0 + rho*(epsilon0 - 1/2), has a much wider range of validity.
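The two quantitative claims above can be checked numerically. The following is a minimal Monte Carlo sketch, not taken from the paper: it builds a teacher T and a student J with an assumed overlap R (so epsilon0 = arccos(R)/pi for random Gaussian inputs), estimates the plain generalization error, and then rejects patterns whose student field |J.S| falls below a threshold. The dimension N, overlap R, thresholds, and sample size are all illustrative assumptions.

```python
import numpy as np

# Illustrative sketch (parameters are assumptions, not from the paper):
# teacher-student perceptron generalization error with rejection of
# input patterns near the student's decision boundary J.S = 0.

rng = np.random.default_rng(0)
N = 200        # input dimension (assumed)
R = 0.9        # teacher-student overlap T.J (assumed)

# Unit teacher T, and unit student J with prescribed overlap J.T = R.
T = rng.normal(size=N)
T /= np.linalg.norm(T)
X = rng.normal(size=N)
X -= (X @ T) * T               # component of X orthogonal to T
X /= np.linalg.norm(X)
J = R * T + np.sqrt(1 - R**2) * X

# Random Gaussian input patterns S; student and teacher fields.
P = 200_000
S = rng.normal(size=(P, N))
hJ = S @ J
hT = S @ T
errors = np.sign(hJ) != np.sign(hT)

eps0 = errors.mean()           # plain generalization error
# For Gaussian inputs, eps0 should approach arccos(R)/pi.
print(f"eps0 = {eps0:.4f}, arccos(R)/pi = {np.arccos(R)/np.pi:.4f}")

# Reject patterns with |J.S| below a threshold kappa: error drops.
kappa = 0.5
keep = np.abs(hJ) > kappa
rho = 1 - keep.mean()          # fraction of rejected patterns
eps_rej = errors[keep].mean()
print(f"rho = {rho:.3f}, error after rejection = {eps_rej:.4f}")

# Small-rho law from the abstract: epsilon(rho) ~ eps0 + rho*(eps0 - 1/2),
# checked at a small rejection threshold.
keep2 = np.abs(hJ) > 0.05
rho2 = 1 - keep2.mean()
eps2 = errors[keep2].mean()
pred = eps0 + rho2 * (eps0 - 0.5)
print(f"rho = {rho2:.3f}: measured {eps2:.4f} vs linear law {pred:.4f}")
```

The rejected patterns lie where the student is least confident, and right on the boundary the student's error rate is 1/2 by symmetry; that is the intuition behind the linear small-rho coefficient (epsilon0 - 1/2), which the last comparison probes numerically.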