Sumset and Inverse Sumset Inequalities for Differential Entropy and Mutual Information

Type: Article

Publication Date: 2014-05-09

Citations: 49

DOI: https://doi.org/10.1109/tit.2014.2322861

Abstract

The sumset and inverse sumset theories of Freiman, Plünnecke and Ruzsa give bounds connecting the cardinality of the sumset $A+B=\{a+b\;;\;a\in A,\,b\in B\}$ of two discrete sets $A,B$ to the cardinalities (or the finer structure) of the original sets $A,B$. For example, the sum-difference bound of Ruzsa states that $|A+B|\,|A|\,|B|\leq|A-B|^3$, where the difference set $A-B=\{a-b\;;\;a\in A,\,b\in B\}$. Interpreting the differential entropy $h(X)$ of a continuous random variable $X$ as (the logarithm of) the size of the effective support of $X$, the main contribution of this paper is a series of natural information-theoretic analogs of these results. For example, the Ruzsa sum-difference bound becomes the new inequality $h(X+Y)+h(X)+h(Y)\leq 3h(X-Y)$ for any pair of independent continuous random variables $X$ and $Y$. Our results include differential-entropy versions of Ruzsa's triangle inequality, the Plünnecke-Ruzsa inequality, and the Balog-Szemerédi-Gowers lemma. We also give a differential-entropy version of the Freiman-Green-Ruzsa inverse-sumset theorem, which can be seen as a quantitative converse to the entropy power inequality. Versions of most of these results for the discrete entropy $H(X)$ were recently proved by Tao, relying heavily on a strong, functional form of the submodularity property of $H(X)$. Since differential entropy is not functionally submodular, many of the corresponding discrete proofs fail in the continuous case, often requiring substantially new proof strategies. We find that the basic property naturally replacing discrete functional submodularity is the data-processing property of mutual information.
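The Ruzsa sum-difference analog $h(X+Y)+h(X)+h(Y)\leq 3h(X-Y)$ can be sanity-checked numerically in the Gaussian case, where every term is available in closed form: for independent $X\sim N(0,v_x)$ and $Y\sim N(0,v_y)$, both $X+Y$ and $X-Y$ are $N(0,v_x+v_y)$, and $h(N(0,v))=\tfrac12\log(2\pi e v)$ nats. A minimal sketch (the variance values below are arbitrary, chosen only for illustration):

```python
import math

def h_gauss(var):
    # Differential entropy (in nats) of N(0, var): 0.5 * log(2*pi*e*var)
    return 0.5 * math.log(2 * math.pi * math.e * var)

# Independent X ~ N(0, vx), Y ~ N(0, vy); then X+Y and X-Y are both N(0, vx+vy)
vx, vy = 1.0, 4.0
lhs = h_gauss(vx + vy) + h_gauss(vx) + h_gauss(vy)  # h(X+Y) + h(X) + h(Y)
rhs = 3 * h_gauss(vx + vy)                          # 3 h(X-Y)
assert lhs <= rhs
```

In this symmetric case $h(X+Y)=h(X-Y)$, so the bound reduces to $h(X)+h(Y)\leq 2h(X+Y)$, which holds because adding an independent random variable cannot decrease differential entropy.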

Locations

  • arXiv (Cornell University)
  • DataCite API
  • IEEE Transactions on Information Theory

Similar Works

  • Sumset inequalities for differential entropy and mutual information (2012) - Ioannis Kontoyiannis, Mokshay Madiman
  • The entropy of sums and Ruzsa's divergence on abelian groups (2013) - Ioannis Kontoyiannis, Mokshay Madiman
  • Equivalence of Additive-Combinatorial Linear Inequalities for Shannon Entropy and Differential Entropy (2018) - Ashok Vardhan Makkuva, Yihong Wu
  • On an entropic analogue of additive energy (2024) - Marcel K. Goh
  • Entropy versions of additive inequalities (2018) - Alberto Espuny Díaz, Oriol Serra
  • On the entropy of sums (2008) - Mokshay Madiman
  • On additive-combinatorial affine inequalities for Shannon entropy and differential entropy (2016) - Ashok Vardhan Makkuva, Yihong Wu
  • Relative log-concavity and a pair of triangle inequalities (2010) - Yaming Yu
  • Sumset and Inverse Sumset Theory for Shannon Entropy (2010) - Terence Tao
  • Equivalence of additive-combinatorial linear inequalities for Shannon entropy and differential entropy (2016) - Ashok Vardhan Makkuva, Yihong Wu
  • Entropy methods for sumset inequalities (2016) - Alberto Espuny Díaz
  • On the entropy of the sum and of the difference of independent random variables (2008) - Amos Lapidoth, Gábor Pete
  • A Simple Proof of the Entropy-Power Inequality via Properties of Mutual Information (2007) - Olivier Rioul
  • Information Inequalities for Joint Distributions, with Interpretations and Applications (2009) - Mokshay Madiman, Prasad Tetali
  • Information Inequalities for Joint Distributions, with Interpretations and Applications (2008) - Mokshay Madiman, Prasad Tetali
  • Fractional Subadditivity of Submodular Functions: Equality Conditions and Their Applications (2025) - Gunank Jakhar, Gowtham R. Kurri, Suryajith Chillara, Vinod M. Prabhakaran
  • A New Entropy Power Inequality for Integer-Valued Random Variables (2014) - Saeid Haghighatshoar, Emmanuel Abbé, I.E. Telatar
  • Information Inequalities for Joint Distributions, With Interpretations and Applications (2010) - Mokshay Madiman, Prasad Tetali
  • Reversal of Rényi Entropy Inequalities Under Log-Concavity (2020) - James Melbourne, Tomasz Tkocz