A characterization of entropy in terms of information loss

Baez, J., Fritz, T. and Leinster, T. (2011) A characterization of entropy in terms of information loss. Entropy, 13(11), pp. 1945-1957. (doi: 10.3390/e13111945)

Full text not currently available from Enlighten.

Abstract

There are numerous characterizations of Shannon entropy and Tsallis entropy as measures of information obeying certain properties. Using work by Faddeev and Furuichi, we derive a very simple characterization. Instead of focusing on the entropy of a probability measure on a finite set, this characterization focuses on the “information loss”, or change in entropy, associated with a measure-preserving function. Information loss is a special case of conditional entropy: namely, it is the entropy of a random variable conditioned on some function of that variable. We show that Shannon entropy gives the only concept of information loss that is functorial, convex-linear and continuous. This characterization naturally generalizes to Tsallis entropy as well.
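As an illustration of the quantity the abstract describes (not code from the paper itself): for a probability distribution p on a finite set and a function f on that set, the "information loss" is the drop in Shannon entropy from p to its pushforward along f, which equals the conditional entropy of the random variable given f of it and is always nonnegative. A minimal sketch in Python, with all names chosen here for illustration:

```python
import math

def shannon_entropy(p):
    """Shannon entropy (in bits) of a finite distribution {outcome: probability}."""
    return -sum(px * math.log2(px) for px in p.values() if px > 0)

def pushforward(p, f):
    """Push a distribution on X forward along f: X -> Y, summing probabilities over fibres."""
    q = {}
    for x, px in p.items():
        y = f(x)
        q[y] = q.get(y, 0.0) + px
    return q

def information_loss(p, f):
    """Entropy drop H(p) - H(f_* p); equals H(X | f(X)) and is >= 0."""
    return shannon_entropy(p) - shannon_entropy(pushforward(p, f))

# Example: the uniform distribution on {0,1,2,3} has entropy 2 bits;
# the parity map collapses it to a fair coin (1 bit), losing 1 bit.
p = {i: 0.25 for i in range(4)}
loss = information_loss(p, lambda x: x % 2)
```

The "functorial" property in the characterization says that the loss along a composite of two measure-preserving functions is the sum of the losses along each, which is easy to check numerically with this sketch.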

Item Type: Articles
Status: Published
Refereed: Yes
Glasgow Author(s) Enlighten ID: Leinster, Dr Tom
Authors: Baez, J., Fritz, T., and Leinster, T.
Subjects: Q Science > QA Mathematics
College/School: College of Science and Engineering > School of Mathematics and Statistics > Mathematics
Journal Name: Entropy
ISSN: 1099-4300
ISSN (Online): 1099-4300
Published Online: 24 November 2011


Project Code: 424771
Award No:
Project Name: Self similarity - recursively definable objects in topology, analysis, category theory and algebra
Principal Investigator: Thomas Leinster
Funder's Name: Engineering & Physical Sciences Research Council (EPSRC)
Funder Ref: EP/D073537/1
Lead Dept: Mathematics