The training of multilayered neural networks in the presence of different types of noise is studied. We consider the learning of realizable rules in nonoverlapping architectures. Achieving optimal generalization depends on knowledge of the noise level; misestimating it may lead to a partial or complete loss of the generalization ability. We demonstrate this effect in the framework of online learning and present the results in terms of noise robustness phase diagrams. While for additive (weight) noise the robustness properties depend on the architecture and size of the networks, this is not so for multiplicative (output) noise. In this case we find a universal behaviour, independent of the machine size, for both the tree parity and committee machines.
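As a concrete illustration of the setting (a minimal sketch, not the algorithms analysed in the paper), the following Python snippet shows online learning of a tree committee machine teacher by a student of the same nonoverlapping architecture. The names and parameter values (`p_flip`, `sigma_w`, the plain Hebbian update, K = 3 branches) are our own assumptions for illustration: `p_flip` implements the multiplicative (output) noise by flipping the teacher's label, while `sigma_w` implements the additive (weight) noise by jittering the teacher's weights before each example.

```python
import numpy as np

rng = np.random.default_rng(0)

K, N = 3, 100        # branches (hidden units) and inputs per branch (tree architecture)
eta = 0.5            # learning rate (illustrative choice)
p_flip = 0.1         # multiplicative (output) noise: teacher label flipped with this prob.
sigma_w = 0.0        # additive (weight) noise: per-example Gaussian jitter on teacher weights

B = rng.standard_normal((K, N))          # teacher weights, one vector per branch
J = 1e-3 * rng.standard_normal((K, N))   # student weights, started near zero

def committee(w, x):
    """Tree committee machine: majority vote of K independent branch perceptrons."""
    return np.sign(np.sign(np.einsum('kn,kn->k', w, x)).sum())

for t in range(1, 20001):
    x = rng.standard_normal((K, N))                   # fresh example at every online step
    B_t = B + sigma_w * rng.standard_normal((K, N))   # additive (weight) noise on teacher
    label = committee(B_t, x)
    if rng.random() < p_flip:                         # multiplicative (output) noise
        label = -label
    # Plain Hebbian branch-wise update; an illustrative rule, not the paper's
    # noise-level-dependent algorithms.
    J += (eta / np.sqrt(N)) * label * x

    if t % 5000 == 0:                                 # crude generalization estimate
        test = rng.standard_normal((1000, K, N))
        eg = np.mean([committee(J, xi) != committee(B, xi) for xi in test])
        print(f"examples {t}: estimated generalization error {eg:.3f}")
```

Note that for the tree parity machine (output given by the product, rather than the majority vote, of the branch signs) the plain Hebbian term averages to zero, so a different update rule would be required; the sketch only illustrates how the two noise types enter the training examples.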