Type: Article
Publication Date: 2014-06-01
Citations: 65
DOI: https://doi.org/10.1109/isit.2014.6874815
This paper introduces the notion of exact common information, which is the minimum description length of the common randomness needed for the exact distributed generation of two correlated random variables (X, Y). We introduce the quantity G(X; Y) = min_{X→W→Y} H(W) as a natural bound on the exact common information and study its properties and computation. We then introduce the exact common information rate, which is the minimum description rate of the common randomness for the exact generation of a 2-DMS (X, Y). We give a multiletter characterization for it as the limit Ḡ(X; Y) = lim_{n→∞} (1/n)G(Xⁿ; Yⁿ). While in general Ḡ(X; Y) is greater than or equal to the Wyner common information, we show that they are equal for the Symmetric Binary Erasure Source. We do not know, however, if the exact common information rate has a single-letter characterization in general.
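The defining minimization G(X; Y) = min_{X→W→Y} H(W) can be made concrete with a small numerical check. The Python sketch below is not from the paper: the function names, the SBES parameterization with erasure probability p = 0.3, and the choice W = X are illustrative assumptions. It verifies that a candidate triple (p(w), p(x|w), p(y|w)) reproduces p(x, y), which certifies the Markov chain X→W→Y, and then reports H(W); any valid W gives an upper bound on G(X; Y), while the quantity itself is the minimum over all such decompositions, which the sketch does not search for.

```python
import numpy as np

def entropy(p):
    """Shannon entropy in bits of a probability vector."""
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

def h_w_upper_bound(p_w, p_x_given_w, p_y_given_w, p_xy, tol=1e-9):
    """Evaluate H(W) for a candidate decomposition (p(w), p(x|w), p(y|w)).

    If sum_w p(w) p(x|w) p(y|w) equals p(x, y), then X -> W -> Y holds and
    H(W) is an upper bound on G(X; Y) = min_{X->W->Y} H(W).
    """
    # Reconstruct the joint pmf from the decomposition.
    recon = np.einsum('w,wx,wy->xy', p_w, p_x_given_w, p_y_given_w)
    assert np.allclose(recon, p_xy, atol=tol), "decomposition does not match p(x, y)"
    return entropy(p_w)

# Symmetric binary erasure source (SBES): X ~ Bern(1/2),
# Y = X with probability 1 - p, Y = erasure 'e' with probability p.
p = 0.3
p_xy = np.array([[(1 - p) / 2, p / 2, 0.0],          # x = 0; columns: y in {0, e, 1}
                 [0.0,         p / 2, (1 - p) / 2]]) # x = 1

# Illustrative (non-optimal in general) choice W = X: the chain X -> W -> Y
# holds trivially, so G(X; Y) <= H(X) = 1 bit.
p_w = np.array([0.5, 0.5])
p_x_given_w = np.eye(2)                  # X = W deterministically
p_y_given_w = np.array([[1 - p, p, 0.0],
                        [0.0,   p, 1 - p]])  # Y | W behaves like Y | X

print(h_w_upper_bound(p_w, p_x_given_w, p_y_given_w, p_xy))  # 1.0
```

The choice W = X always satisfies the Markov condition, so it yields the generic bound G(X; Y) ≤ H(X) (and, symmetrically via W = Y, G(X; Y) ≤ H(Y)); here it certifies G(X; Y) ≤ 1 bit for the SBES.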
| Title | Year | Authors |
|---|---|---|
| Distributed Channel Synthesis | 2013 | Paul Cuff |
| The common information of two dependent random variables | 1975 | A. D. Wyner |
| Coordination Capacity | 2010 | Paul Cuff, Haim H. Permuter, Thomas M. Cover |