A Characterization of Entropy in Terms of Information Loss

Type: Article

Publication Date: 2011-11-24

Citations: 95

DOI: https://doi.org/10.3390/e13111945

Abstract

There are numerous characterizations of Shannon entropy and Tsallis entropy as measures of information obeying certain properties. Using work by Faddeev and Furuichi, we derive a very simple characterization. Instead of focusing on the entropy of a probability measure on a finite set, this characterization focuses on the “information loss”, or change in entropy, associated with a measure-preserving function. Information loss is a special case of conditional entropy: namely, it is the entropy of a random variable conditioned on some function of that variable. We show that Shannon entropy gives the only concept of information loss that is functorial, convex-linear and continuous. This characterization naturally generalizes to Tsallis entropy as well.
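
The paper's central notion is easy to state concretely: a map f out of a finite probability space (X, p) is measure-preserving onto its pushforward q, and its information loss is H(p) − H(q), which equals the entropy of X conditioned on f(X). Below is a minimal Python sketch of that computation, assuming Shannon entropy in nats; the function names and the parity-map example are illustrative, not taken from the paper.

```python
import math

def shannon_entropy(probs):
    """Shannon entropy H(p) = -sum_i p_i log p_i, in nats."""
    return -sum(p * math.log(p) for p in probs if p > 0)

def information_loss(p, f):
    """Information loss of f on the finite probability space (X, p).

    p is a dict {outcome: probability}. The pushforward q on the
    codomain makes f measure-preserving, and the loss H(p) - H(q)
    is the entropy of X conditioned on f(X).
    """
    q = {}
    for x, px in p.items():
        q[f(x)] = q.get(f(x), 0.0) + px
    return shannon_entropy(p.values()) - shannon_entropy(q.values())

# Illustrative example: the parity map collapses a uniform
# four-outcome source to two outcomes; each fiber is uniform on
# two points, so the loss is log 2 (one bit, measured in nats).
p = {0: 0.25, 1: 0.25, 2: 0.25, 3: 0.25}
print(information_loss(p, lambda x: x % 2))  # ≈ 0.693
```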

Locations

  • Entropy
  • eScholarship (California Digital Library)
  • arXiv (Cornell University)
  • Edinburgh Research Explorer (University of Edinburgh)
  • DataCite API

Similar Works

  • The Information Loss of a Stochastic Map (2021) by James Fullwood and Arthur J. Parzygnat
  • On the Structure of Information (2024) by Sebastian Gottwald and Daniel A. Braun
  • A two-parameter entropy and its fundamental properties (2020) by Supriyo Dutta, Shigeru Furuichi, and Partha Guha
  • A short characterization of relative entropy (2019) by Tom Leinster
  • A short characterization of relative entropy (2017) by Tom Leinster
  • Elements of Generalized Tsallis Relative Entropy in Classical Information Theory (2019) by Supriyo Dutta, Shigeru Furuichi, and Partha Guha
  • Information Loss Measures (2014) by Josep Domingo‐Ferrer
  • Information Loss Measures (2011)
  • Information Loss Measures (2018) by Josep Domingo‐Ferrer
  • Information Loss as a Foundational Principle for the Second Law of Thermodynamics (2007) by Todd L. Duncan and J. S. Semura
  • Entropy and ergodic theory (1975) by J. Aczél
  • Tsallis divergence and superadditivity (2021) by Thomas Oikonomou and G. Baris Bagci
  • Integrability vs. information loss: a simple example (2006) by Vijay Balasubramanian, Bartłomiej Czech, Klaus Larjo, and Joan Simón
  • Entropy maximization (2009) by Krishna B. Athreya
  • An entropy functional bounded from above by one (2022) by John Çamkıran
  • Tsallis kernels on measures (2008) by André F. T. Martins, Pedro M. Q. Aguiar, and Mário A. T. Figueiredo
  • Information Theory and Statistics (2011) by Evgueni Haroutunian
  • Information Theory and Statistics (1960) by K. Prachar and S. Kullback
  • Information Theory and Statistics (2012)