Abstract: Let X and Y be two random variables with probability distribution p(x,y), joint entropy H(X,Y), and conditional entropies H(X|Y) and H(Y|X). Person P_X knows X and person P_Y knows Y. They communicate over a noiseless two-way channel so that both know X and Y. It is proved that, on the average, at least H(X|Y) + H(Y|X) bits must be exchanged and that H(X,Y) + 2 bits are sufficient. If p(x,y) > 0 for all (x,y), then H(X,Y) bits must be communicated on the average. However, if p(x,y) is uniform over its support set, the average number of bits needed is close to H(X|Y) + H(Y|X). Randomized protocols can reduce the amount of communication considerably, but only when some probability of error is acceptable.
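One way to see the H(X,Y) + 2 sufficiency bound is a two-message protocol, sketched below under standard prefix-coding assumptions (the message lengths \ell_1, \ell_2 and this particular construction are illustrative and not necessarily the paper's own proof): P_X first describes X with an optimal prefix code for p(x); once both parties know x, P_Y describes Y with a prefix code matched to p(y|x). The H(X|Y) + H(Y|X) lower bound corresponds to P_Y having to learn X while knowing only Y, and P_X having to learn Y while knowing only X.

\begin{align*}
  \mathbb{E}[\ell_1] &< H(X) + 1
    && \text{$P_X$ sends $X$ with an optimal prefix code for $p(x)$} \\
  \mathbb{E}[\ell_2] &< H(Y \mid X) + 1
    && \text{$P_Y$ sends $Y$ with a prefix code for $p(y \mid x)$, known to both after message 1} \\
  \mathbb{E}[\ell_1 + \ell_2] &< H(X) + H(Y \mid X) + 2 = H(X,Y) + 2
    && \text{by the chain rule for entropy}
\end{align*}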