Equivocations, Exponents, and Second-Order Coding Rates Under Various Rényi Information Measures

Type: Article

Publication Date: 2016-12-06

Citations: 28

DOI: https://doi.org/10.1109/tit.2016.2636154

Abstract

We evaluate the asymptotics of equivocations, their exponents, as well as their second-order coding rates under various Rényi information measures. Specifically, we consider the effect of applying a hash function to a source, and we quantify the level of non-uniformity of the compressed source and its dependence on another correlated source when the number of copies of the sources is large. Unlike previous works that use Shannon information measures to quantify randomness, information, or uniformity, we define our security measures in terms of a more general class of information measures: the Rényi information measures and their Gallager-type counterparts. A special case of these Rényi information measures is the class of Shannon information measures. We prove tight asymptotic results for the security measures and their exponential rates of decay. We also prove bounds on the second-order asymptotics and show that these bounds match when the magnitudes of the second-order coding rates are large. We do so by establishing new classes of non-asymptotic bounds on the equivocation and evaluating these bounds asymptotically using various probabilistic limit theorems.
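The central quantities in the abstract are Rényi information measures, which generalize the Shannon measures (recovered as the order α tends to 1). As an illustrative sketch only — this function and distribution are not taken from the paper — the unconditional Rényi entropy of order α can be computed as:

```python
import math

def renyi_entropy(p, alpha):
    """Rényi entropy H_alpha(P) = (1/(1-alpha)) * log2(sum_i p_i^alpha).

    At alpha = 1 the formula is singular; the limit alpha -> 1 gives the
    Shannon entropy H(P) = -sum_i p_i * log2(p_i), handled as a special case.
    """
    if alpha == 1:
        return -sum(pi * math.log2(pi) for pi in p if pi > 0)
    return math.log2(sum(pi ** alpha for pi in p if pi > 0)) / (1 - alpha)

# A biased example distribution: Rényi entropy is non-increasing in alpha.
p = [0.5, 0.25, 0.125, 0.125]
h_half = renyi_entropy(p, 0.5)  # Hartley-side order, >= Shannon entropy
h_one = renyi_entropy(p, 1)     # Shannon entropy, here 1.75 bits
h_two = renyi_entropy(p, 2)     # collision entropy, <= Shannon entropy
```

The monotonicity in α (here `h_half >= h_one >= h_two`) is what makes the Rényi family a tunable strengthening of Shannon-based uniformity measures: larger orders penalize non-uniformity more severely.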

Locations

  • arXiv (Cornell University)
  • IEEE Transactions on Information Theory

Similar Works

  • Equivocations, Exponents and Second-Order Coding Rates under Various Rényi Information Measures (2015), Masahito Hayashi, Vincent Y. F. Tan
  • Analysis of Remaining Uncertainties and Exponents Under Various Conditional Rényi Entropies (2018), Vincent Y. F. Tan, Masahito Hayashi
  • Analysis of Remaining Uncertainties and Exponents under Various Conditional Rényi Entropies (2016), Vincent Y. F. Tan, Masahito Hayashi
  • Remaining uncertainties and exponents under Rényi information measures (2016), Masahito Hayashi, Vincent Y. F. Tan
  • Entropy of Independent Experiments, Revisited (2017), Maciej Skórski
  • The Complexity of Estimating Rényi Entropy (2014), Jayadev Acharya, Alon Orlitsky, Ananda Theertha Suresh, Himanshu Tyagi
  • Non-Asymptotic Fundamental Limits of Guessing Subject to Distortion (2019), Shota Saito, Toshiyasu Matsushima
  • Tight Bounds on the Rényi Entropy via Majorization with Applications to Guessing and Compression (2018), Igal Sason
  • Strong Converse Exponent for Remote Lossy Source Coding (2025), Han Wu, Hamdi Joudeh
  • Relative entropies and their use in quantum information theory (2016), Felix Leditzky
  • Non-Asymptotic Fundamental Limits of Guessing Subject to Distortion (2018), Shota Saito, Toshiyasu Matsushima
  • Source coding with escort distributions and Rényi entropy bounds (2009), Jean‐François Bercher
  • Rényi divergence guarantees for hashing with linear codes (2024), Madhura Pathegama, Alexander Barg
  • On the Efficient Estimation of Min-Entropy (2021), Yongjune Kim, Cyril Guyot, Young Sik Kim
  • Rényi Resolvability and Its Applications to the Wiretap Channel (2017), Lei Yu, Vincent Y. F. Tan
  • Information Leakage in Zero-Error Source Coding: A Graph-Theoretic Perspective (2021), Yucheng Liu, Lawrence Ong, Sarah J. Johnson, Jörg Kliewer, Parastoo Sadeghi, Phee Lep Yeoh
  • Smoothing Linear Codes by Rényi Divergence and Applications to Security Reduction (2024), Yan Hao, Cong Ling
  • Rényi divergence-based uniformity guarantees for k-universal hash functions (2024), Madhura Pathegama, Alexander Barg