We study the storage of random patterns by a perceptron above its storage capacity α_c, i.e. in the region where perfect storage becomes impossible. We determine the minimal fraction of learning errors and the distribution of stabilities for different learning rules in one-step replica symmetry breaking. Thereby we not only extend the known replica-symmetric results to values of the storage capacity beyond the AT line but also show that, depending on the learning rule, replica symmetry may be globally unstable already well below the AT line. As an example of possible implications, we compare the results for the typical basins of attraction of an extremely diluted attractor neural network as given by replica symmetry and by one-step replica symmetry breaking.