Let E_n be a statistical experiment based on n i.i.d. observations. We compare E_n with E_{n+r_n}. The gain of information due to the r_n additional observations is measured by the deficiency distance Δ(E_n, E_{n+r_n}), i.e., the maximal diminution of the risk functions. We show that, under general dimensionality conditions, Δ(E_n, E_{n+r_n}) is of order r_n/n. Further, the behavior of Δ is studied and compared for asymptotically Gaussian experiments, where the information gain is shown to increase logarithmically. The Gaussian and the binomial families turn out to be, in some sense, opposite extreme cases: the increase of information is asymptotically minimal in the Gaussian case and maximal in the binomial one.
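The central rate claim of the abstract can be written in display form; this is a sketch only, with the precise dimensionality conditions and constants left to the paper:

```latex
% Deficiency distance between the experiments with n and n + r_n observations:
% under the paper's dimensionality conditions it is of the order r_n / n,
\Delta\bigl(E_n,\, E_{n+r_n}\bigr) \;\asymp\; \frac{r_n}{n},
% so that, e.g., a fixed number r_n = r of extra observations yields an
% information gain vanishing at rate 1/n.
```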