We derive here a new method for the analysis of weight quantization effects
in multilayer perceptrons based on the application of interval arithmetic.
Unlike previous results, we find worst-case bounds on the errors due to
weight quantization that are valid for every distribution of the input
or weight values. Given a trained network, our method makes it easy to
compute the minimum number of bits needed to encode its weights.
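The idea above can be sketched in code. This is a minimal illustration, not the paper's exact procedure: it uses interval arithmetic to bound the worst-case output error of a single linear layer when each weight is rounded to `bits` fractional bits (so the true weight lies within ±half a quantization step of its rounded value), and searches for the smallest bit width whose bound stays within a tolerance. All function names, the input box, and the tolerance are illustrative assumptions.

```python
# Hedged sketch of interval-arithmetic quantization analysis for one
# linear layer; not the authors' exact method.

def interval_dot(w_lo, w_hi, x_lo, x_hi):
    """Bounds on sum_i w_i * x_i when w_i in [w_lo_i, w_hi_i]
    and x_i in [x_lo_i, x_hi_i] (standard interval multiplication)."""
    lo = hi = 0.0
    for a, b, c, d in zip(w_lo, w_hi, x_lo, x_hi):
        products = (a * c, a * d, b * c, b * d)
        lo += min(products)
        hi += max(products)
    return lo, hi

def quantize(w, bits):
    """Round w to a grid with 'bits' fractional bits; return value and step."""
    step = 2.0 ** -bits
    return round(w / step) * step, step

def min_bits(weights, x_lo, x_hi, tol, max_bits=32):
    """Smallest bit width whose worst-case output deviation, over all
    inputs in the box and all rounding errors, is at most tol."""
    exact_lo, exact_hi = interval_dot(weights, weights, x_lo, x_hi)
    for bits in range(1, max_bits + 1):
        qs = [quantize(w, bits) for w in weights]
        # The true weight is within +- step/2 of its quantized value.
        w_lo = [q - s / 2 for q, s in qs]
        w_hi = [q + s / 2 for q, s in qs]
        lo, hi = interval_dot(w_lo, w_hi, x_lo, x_hi)
        worst = max(exact_lo - lo, hi - exact_hi)
        if worst <= tol:
            return bits
    return None
```

Because the bound is computed over an input box rather than a particular input distribution, it holds for every distribution supported on that box, which is the distribution-free character of the worst-case bounds described above.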