Ince, R. A.A. (2017) Measuring multivariate redundant information with pointwise common change in surprisal. Entropy, 19(7), 318. (doi: 10.3390/e19070318)
Abstract
The problem of how to properly quantify redundant information is an open question that has been the subject of much recent research. Redundant information refers to information about a target variable S that is common to two or more predictor variables X_i. It can be thought of as quantifying overlapping information content or similarities in the representation of S between the X_i. We present a new measure of redundancy which measures the common change in surprisal shared between variables at the local or pointwise level. We provide a game-theoretic operational definition of unique information, and use this to derive constraints which are used to obtain a maximum entropy distribution. Redundancy is then calculated from this maximum entropy distribution by counting only those local co-information terms which admit an unambiguous interpretation as redundant information. We show how this redundancy measure can be used within the framework of the Partial Information Decomposition (PID) to give an intuitive decomposition of the multivariate mutual information into redundant, unique and synergistic contributions. We compare our new measure to existing approaches over a range of example systems, including continuous Gaussian variables. Matlab code for the measure is provided, including all considered examples.
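The abstract's redundancy measure is built from local co-information terms. As a minimal illustration of the underlying quantity (this is the standard co-information identity I(S;X1;X2) = I(S;X1) + I(S;X2) - I(S;X1,X2), not the paper's pointwise common-change-in-surprisal measure, and the toy distribution is an assumption chosen for clarity), here is a short sketch for a fully redundant system where X1 and X2 are both copies of a uniform binary S:

```python
import math

# Hedged sketch: mutual information and co-information for a toy discrete
# system. Joint distribution p(s, x1, x2) with X1 = X2 = S (fully redundant).
p = {(s, s, s): 0.5 for s in (0, 1)}

def marginal(p, idx):
    """Marginal distribution over the variable indices in idx."""
    out = {}
    for outcome, prob in p.items():
        key = tuple(outcome[i] for i in idx)
        out[key] = out.get(key, 0.0) + prob
    return out

def mi(p, a_idx, b_idx):
    """Mutual information I(A;B) in bits, computed from the joint p."""
    pa, pb = marginal(p, a_idx), marginal(p, b_idx)
    pab = marginal(p, a_idx + b_idx)
    total = 0.0
    for ab, prob in pab.items():
        a, b = ab[:len(a_idx)], ab[len(a_idx):]
        total += prob * math.log2(prob / (pa[a] * pb[b]))
    return total

# Co-information: I(S;X1;X2) = I(S;X1) + I(S;X2) - I(S;X1,X2).
i_s_x1 = mi(p, (0,), (1,))
i_s_x2 = mi(p, (0,), (2,))
i_s_x1x2 = mi(p, (0,), (1, 2))
coinfo = i_s_x1 + i_s_x2 - i_s_x1x2
print(i_s_x1, i_s_x2, i_s_x1x2, coinfo)  # 1.0 1.0 1.0 1.0
```

For this copy system each predictor carries the full 1 bit about S, the pair jointly carries 1 bit, and the co-information is 1 bit, matching the intuition that the information is entirely redundant. The paper's contribution is to replace raw co-information (which can conflate redundancy with synergy in general systems) with a pointwise measure restricted to unambiguously redundant terms.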
Field | Value
---|---
Item Type | Articles
Status | Published
Refereed | Yes
Glasgow Author(s) Enlighten ID | Ince, Dr Robin
Authors | Ince, R. A.A.
College/School | College of Medical Veterinary and Life Sciences > School of Psychology & Neuroscience
Journal Name | Entropy
Publisher | MDPI
ISSN | 1099-4300
ISSN (Online) | 1099-4300
Published Online | 29 June 2017
Copyright Holders | Copyright © 2017 The Author
First Published | First published in Entropy 19(7): 318
Publisher Policy | Reproduced under a Creative Commons license