(X, Y) is a pair of random variables distributed over a support set S.
Person P_X knows X, person P_Y knows Y, and both know S. Using a predetermined protocol, they exchange binary messages in order for P_Y to learn X; P_X may or may not learn Y. The m-message complexity C_m is the number of information bits that must be transmitted (by both persons) in the worst case when only m messages are allowed; C_∞ is the number of bits required when the number of messages exchanged is unrestricted.

A natural class of random pairs is considered. Let μ be the maximum number of X values possible with a given Y value, and η the maximum number of Y values possible with a given X value. The random pair (X, Y) is balanced if μ = η. The following hold for all balanced random pairs. One-way communication requires at most twice the minimum number of bits: C_1 ≤ 2C_∞ + 1. This bound is almost tight: for every α, there is a balanced random pair for which C_1 ≥ 2C_∞ − 6 ≥ α. Three-message communication is asymptotically optimum: C_3 ≤ C_∞ + 3 log C_∞ + 11. More importantly, the number of bits required is only negligibly larger than the number needed when P_X knows Y in advance: C_∞ ≤ C_3 ≤ log μ + 3 log log μ + 11.
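As a concrete illustration of the definitions above, the sketch below computes μ and η for a toy support set and checks that the pair is balanced, then evaluates the three-message upper bound log μ + 3 log log μ + 11 (logs base 2). The helper names and the Hamming-ball example are my own choices, made only so that the pair comes out balanced; they are not from the paper.

```python
import math
from collections import defaultdict

def ambiguity(support):
    """Given a support set S of (x, y) pairs, return mu (the maximum
    number of x values consistent with a single y) and eta (the maximum
    number of y values consistent with a single x)."""
    xs_for_y = defaultdict(set)
    ys_for_x = defaultdict(set)
    for x, y in support:
        xs_for_y[y].add(x)
        ys_for_x[x].add(y)
    mu = max(len(v) for v in xs_for_y.values())
    eta = max(len(v) for v in ys_for_x.values())
    return mu, eta

# Toy support set: X is a 3-bit string, Y is X with at most one bit flipped.
strings = [format(i, "03b") for i in range(8)]
def hamming(a, b):
    return sum(c != d for c, d in zip(a, b))
S = [(x, y) for x in strings for y in strings if hamming(x, y) <= 1]

mu, eta = ambiguity(S)
print(mu, eta, mu == eta)  # balanced: mu == eta == 4

# Three-message upper bound from the abstract: log mu + 3 log log mu + 11
bound = math.log2(mu) + 3 * math.log2(math.log2(mu)) + 11
print(bound)  # 2 + 3*1 + 11 = 16.0
```

By symmetry of the Hamming-distance constraint, μ = η here, so this pair is balanced and the bounds above apply.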
These results are applied to the following correlated files problem. X and Y are binary strings (files) within a small edit distance of each other. P_X knows X, while P_Y knows Y and wants to learn X. The above results imply efficient three-message protocols for conveying X to P_Y. Efficient one-way protocols are provided for certain restricted cases, and their possible generalizations are discussed.
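To make the "knows Y in advance" benchmark concrete: if the person holding X also knew Y, they could enumerate all strings within the agreed edit-distance bound of Y in a canonical order and send only the index of X among them, about log μ bits. The sketch below implements this benchmark for a distance bound of 1 (the function names and the bound k = 1 are my assumptions; this is only the benchmark, not the paper's protocol, which works without the sender knowing Y).

```python
import math

def neighbors(y, alphabet="01"):
    """All strings within edit distance 1 of y (one substitution,
    deletion, or insertion), in a canonical sorted order."""
    out = {y}
    for i in range(len(y)):
        out.add(y[:i] + y[i + 1:])            # deletion
        for c in alphabet:
            out.add(y[:i] + c + y[i + 1:])    # substitution
    for i in range(len(y) + 1):
        for c in alphabet:
            out.add(y[:i] + c + y[i:])        # insertion
    return sorted(out)

def encode(x, y):
    """Sender who knows both x and y: transmit x's index in the
    candidate list, costing ceil(log2(len(candidates))) bits."""
    return neighbors(y).index(x)

def decode(idx, y):
    """Receiver who knows y: recover x from the transmitted index."""
    return neighbors(y)[idx]

y = "10110"
x = "10010"  # one substitution away from y
idx = encode(x, y)
assert decode(idx, y) == x

bits = math.ceil(math.log2(len(neighbors(y))))
print(bits)  # ceil(log2(17)) = 5 bits for this y
```

Here μ for a given y is just the size of its edit-distance ball, so the benchmark cost is roughly log μ; the abstract's three-message bound says interaction achieves nearly this even though the sender does not know y.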