Given a channel and an input process, the minimum randomness of those input processes whose output statistics approximate the original output statistics with arbitrary accuracy is studied. The notion of resolvability of a channel, defined as the number of random bits required per channel use in order to generate an input that achieves arbitrarily accurate approximation of the output statistics for any given input process, is introduced. A general formula for resolvability that holds regardless of the channel memory structure is obtained. It is shown that, for most channels, resolvability is equal to Shannon capacity. By-products of the analysis are a general formula for the minimum achievable (fixed-length) source coding rate of any finite-alphabet source, and a strong converse of the identification coding theorem, which holds for any channel that satisfies the strong converse of the channel coding theorem.
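The general formula alluded to above can be sketched in information-spectrum notation (a hedged reconstruction; the symbol $\bar{I}(X;Y)$ for the sup-information rate is an assumed notation, not quoted from this abstract):

```latex
% Resolvability S of a channel, expressed via the sup-information rate
% \bar{I}(X;Y): the limsup in probability of the normalized information
% density (1/n) i_{X^n W^n}(X^n; Y^n), with the supremum over all input
% processes X.
S \;=\; \sup_{X} \bar{I}(X;Y)
% For channels satisfying the strong converse of the channel coding
% theorem, this quantity coincides with the Shannon capacity C.
```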