We study the mutual information between a stimulus and a system consisting of stochastic, statistically independent elements that respond to the stimulus. Using statistical-mechanical methods, we calculate the properties of the mutual information (MI) in the limit of large system size N. For continuous-valued stimuli, the MI increases logarithmically with N and is related to the logarithm of the Fisher information of the system. For discrete stimuli, the MI saturates exponentially with N, and we find that the exponent of saturation is the Chernoff distance between the response probabilities induced by different stimuli.
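The discrete-stimulus result can be checked numerically in a minimal setting (a sketch of my own construction, not taken from the paper): two equiprobable stimuli drive N independent, identical Bernoulli units with assumed response probabilities p = 0.8 and q = 0.2. The MI should saturate toward H(S) = ln 2, with the gap decaying at a rate set by the Chernoff distance between the two single-unit response distributions. The helper names `chernoff_distance` and `mutual_info` are hypothetical, introduced only for this illustration.

```python
import math

def chernoff_distance(p, q, grid=1001):
    """Chernoff distance between Bernoulli(p) and Bernoulli(q):
    C = -min_lam log( p^lam q^(1-lam) + (1-p)^lam (1-q)^(1-lam) ),
    found here by a simple grid search over lam in (0, 1)."""
    best = math.inf
    for i in range(1, grid):
        lam = i / grid
        s = p**lam * q**(1 - lam) + (1 - p)**lam * (1 - q)**(1 - lam)
        best = min(best, math.log(s))
    return -best

def mutual_info(p, q, N):
    """Exact MI (in nats) between an equiprobable binary stimulus and the
    spike count k of N independent units; k is a sufficient statistic,
    so summing over the binomial counts gives the full MI."""
    I = 0.0
    for k in range(N + 1):
        c = math.comb(N, k)
        P1 = c * p**k * (1 - p)**(N - k)   # P(k | stimulus 1)
        P2 = c * q**k * (1 - q)**(N - k)   # P(k | stimulus 2)
        m = 0.5 * (P1 + P2)                # marginal P(k)
        if P1 > 0:
            I += 0.5 * P1 * math.log(P1 / m)
        if P2 > 0:
            I += 0.5 * P2 * math.log(P2 / m)
    return I

p, q = 0.8, 0.2                 # assumed per-unit response probabilities
C = chernoff_distance(p, q)

def gap(N):
    """Distance of the MI from its saturation value H(S) = ln 2."""
    return math.log(2) - mutual_info(p, q, N)

# Empirical saturation exponent from the decay of the MI gap:
rate = -(math.log(gap(60)) - math.log(gap(40))) / 20
print(f"Chernoff distance: {C:.4f}, empirical exponent: {rate:.4f}")
```

With these symmetric probabilities (q = 1 - p) the Chernoff distance reduces to -ln(2*sqrt(p*q)) ≈ 0.223, and the fitted exponent of the MI gap comes out close to it, with the small discrepancy attributable to subexponential (polynomial-in-N) prefactors.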