Exact common information

Type: Article

Publication Date: 2014-06-01

Citations: 65

DOI: https://doi.org/10.1109/isit.2014.6874815


Abstract

This paper introduces the notion of exact common information, which is the minimum description length of the common randomness needed for the exact distributed generation of two correlated random variables (X, Y). We introduce the quantity G(X; Y) = min_{X→W→Y} H(W) as a natural bound on the exact common information and study its properties and computation. We then introduce the exact common information rate, which is the minimum description rate of the common randomness for the exact generation of a 2-DMS (X, Y). We give a multiletter characterization for it as the limit Ḡ(X; Y) = lim_{n→∞} (1/n) G(X^n; Y^n). While in general Ḡ(X; Y) is greater than or equal to the Wyner common information, we show that the two are equal for the Symmetric Binary Erasure Source. We do not know, however, whether the exact common information rate has a single-letter characterization in general.
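
To make the definition concrete: any finite-alphabet W that satisfies the Markov chain X→W→Y and whose mixture Σ_w p(w)p(x|w)p(y|w) reproduces the joint pmf of (X, Y) certifies the upper bound G(X; Y) ≤ H(W). The Python sketch below is not from the paper; the DSBS(0.1) source and the trivial choice W = X are illustrative assumptions. It verifies such a decomposition and evaluates H(W):

```python
import numpy as np

def entropy_bits(p):
    """Shannon entropy in bits of a probability vector."""
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

def induced_joint(p_w, p_x_given_w, p_y_given_w):
    """Joint pmf induced by the chain X→W→Y:
    p(x, y) = sum_w p(w) p(x|w) p(y|w)."""
    return np.einsum('w,wx,wy->xy', p_w, p_x_given_w, p_y_given_w)

# Target source (illustrative): doubly symmetric binary source,
# X ~ Bern(1/2), Y = X xor Z with Z ~ Bern(0.1).
eps = 0.1
target = np.array([[0.5 * (1 - eps), 0.5 * eps],
                   [0.5 * eps, 0.5 * (1 - eps)]])

# Trivial decomposition W = X: X is a deterministic copy of W and
# Y is a BSC(eps) output of W, so X→W→Y holds by construction.
p_w = np.array([0.5, 0.5])
p_x_given_w = np.eye(2)                   # p(x|w) = 1{x = w}
p_y_given_w = np.array([[1 - eps, eps],
                        [eps, 1 - eps]])  # BSC(eps)

assert np.allclose(induced_joint(p_w, p_x_given_w, p_y_given_w), target)
print(f"H(W) = {entropy_bits(p_w):.4f} bits >= G(X; Y)")
```

Computing G(X; Y) itself requires minimizing H(W) over all such decompositions, including over the alphabet size of W, which is what makes the quantity nontrivial to evaluate in general.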

Locations

  • arXiv (Cornell University)

Similar Works

  • Exact Common Information (2014) - Gowtham Kumar, Cheuk Ting Li, Abbas El Gamal
  • On Exact and ∞-Rényi Common Informations (2018) - Lei Yu, Vincent Y. F. Tan
  • A Lower and Upper Bound on the Epsilon-Uniform Common Randomness Capacity (2023) - Rami Ezzine, Moritz Wiese, Christian Deppe, Holger Boche
  • Common Randomness Generation from Sources with Countable Alphabet (2023) - Wafa Labidi, Rami Ezzine, Christian Deppe, Moritz Wiese, Holger Boche
  • Common Information, Noise Stability, and Their Extensions (2022) - Lei Yu, Vincent Y. F. Tan
  • Communication Complexity of Common Randomness Generation with Isotropic States (2024) - Yangjing Dong, Penghui Yao
  • Common Information Dimension (2023) - Osama A. Hanna, Xinlin Li, Suhas Diggavi, Christina Fragouli
  • Common Randomness Generation from Gaussian Sources (2022) - Wafa Labidi, Rami Ezzine, Christian Deppe, Holger Boche
  • Common Randomness Generation from Sources with Countable Alphabet (2022) - Wafa Labidi, Rami Ezzine, Christian Deppe, Moritz Wiese, Holger Boche
  • On the Role of Shared Randomness in Simultaneous Communication (2015) - Mohammad Bavarian, Dmitry Gavinsky, Tsuyoshi Ito
  • Communication for Generating Correlation: A Unifying Survey (2019) - Madhu Sudan, Himanshu Tyagi, Shun Watanabe
  • Lower Bound on Relaxed Wyner's Common Information (2021) - Erixhen Sula, Michael Gastpar
  • Communication Complexity of Common Randomness Generation with Isotropic States (2023) - Yangjing Dong, Penghui Yao
  • Wyner's Common Information Under Rényi Divergence Measures (2018) - Lei Yu, Vincent Y. F. Tan
  • Common Randomness and Key Generation with Limited Interaction (2016) - Jingbo Liu, Paul Cuff, Sergio Verdú