An optimal uniform concentration inequality for discrete entropies on finite alphabets in the high-dimensional setting

Type: Article

Publication Date: 2022-04-25

Citations: 4

DOI: https://doi.org/10.3150/21-bej1403

Abstract

We prove an exponential-decay concentration inequality bounding the tail probability of the difference between the log-likelihood of discrete random variables on a finite alphabet and the negative entropy. The concentration bound we derive holds uniformly over all parameter values. The new result improves the convergence rate in an earlier result of Zhao (2020), from $(K^2 \log K)/n = o(1)$ to $(\log K)^2/n = o(1)$, where $n$ is the sample size and $K$ is the size of the alphabet. We further prove that the rate $(\log K)^2/n = o(1)$ is optimal. The result is extended to misspecified log-likelihoods for grouped random variables. We give applications of the new result in information theory.
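
To fix the notation, here is a schematic restatement of the quantity the abstract describes, in our own symbols rather than as a verbatim excerpt of the paper's theorem: for $X_1, \dots, X_n$ drawn i.i.d. from $p = (p_1, \dots, p_K)$ on an alphabet of size $K$, the controlled quantity is the gap between the averaged log-likelihood and its expectation, which equals the negative entropy:

\[
  \frac{1}{n}\sum_{i=1}^{n}\log p_{X_i} \;+\; H(p),
  \qquad
  H(p) \;=\; -\sum_{k=1}^{K} p_k \log p_k,
\]

since $\mathbb{E}\bigl[\log p_{X_1}\bigr] = \sum_{k=1}^{K} p_k \log p_k = -H(p)$. The paper's inequality gives an exponential tail bound for this gap that holds uniformly over all $p$ once $(\log K)^2/n = o(1)$, and the accompanying optimality result shows that this growth condition on $K$ cannot be weakened.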

Locations

  • Bernoulli
  • arXiv (Cornell University)

Similar Works

  • An Optimal Uniform Concentration Inequality for Discrete Entropies on Finite Alphabets in the High-dimensional Setting (2020) - Yunpeng Zhao
  • On Optimal Uniform Concentration Inequalities for Discrete Entropies in the High-dimensional Setting (2020) - Yunpeng Zhao
  • Concentration Inequalities for the Empirical Distribution (2018) - Jay Mardia, Jiantao Jiao, Ervin Tánczos, Robert D. Nowak, Tsachy Weissman
  • Concentration inequalities for the empirical distribution of discrete distributions: beyond the method of types (2019) - Jay Mardia, Jiantao Jiao, Ervin Tánczos, Robert D. Nowak, Tsachy Weissman
  • Entropy-variance inequalities for discrete log-concave random variables via degree of freedom (2022) - Heshan Aravinda
  • On the monotonicity of discrete entropy for log-concave random vectors on $\mathbb{Z}^d$ (2024) - Matthieu Fradelizi, Lampros Gavalakis, Martin Rapaport
  • On the Maximum Entropy of a Sum of Independent Discrete Random Variables (2021) - Mladen Kovačević
  • Entropy-variance inequalities for discrete log-concave random variables via degree of freedom (2023) - Heshan Aravinda
  • Dimension-Free Empirical Entropy Estimation (2022) - Doron Cohen, Aryeh Kontorovich, Aaron Koolyk, Geoffrey Wolfer
  • Dimension-Free Empirical Entropy Estimation (2021) - Doron Cohen, Aryeh Kontorovich, Aaron Koolyk, Geoffrey Wolfer
  • Estimating a discrete log-concave distribution in higher dimensions (2017) - Hanna Jankowski, Amanda Tian
  • An Exponential Inequality for $U$-Statistics of I.I.D. Data (2021) - Davide Giraudo
  • Concentration and Confidence for Discrete Bayesian Sequence Predictors (2013) - Tor Lattimore, Marcus Hütter, Peter Sunehag
  • High-dimensional Berry-Esseen Bound for $m$-Dependent Random Samples (2022) - Heejong Bong, Arun Kumar Kuchibhotla, Alessandro Rinaldo
  • An Effective Bernstein-type Bound on Shannon Entropy over Countably Infinite Alphabets (2021) - Yunpeng Zhao
  • Maximum Likelihood Estimation of Functionals of Discrete Distributions (2014) - Jiantao Jiao, Kartik Venkat, Yanjun Han, Tsachy Weissman
  • Generalization Bounds via Information Density and Conditional Information Density (2020) - Fredrik Hellström, Giuseppe Durisi
  • Minimax Estimation of Discrete Distributions under $\ell_1$ Loss (2014) - Yanjun Han, Jiantao Jiao, Tsachy Weissman
  • Dimensional Discrete Entropy Power Inequalities for Log-Concave Random Vectors (2024) - Matthieu Fradelizi, Lampros Gavalakis, Martin Rapaport