Approximations of Shannon mutual information for discrete variables with applications to neural population coding

Wentao Huang, Kechen Zhang

Research output: Contribution to journal › Article

Abstract

Although Shannon mutual information has been widely used, computing it is often difficult for many practical problems, including those in neural population coding. Asymptotic formulas based on Fisher information sometimes provide accurate approximations to the mutual information, but this approach is restricted to continuous variables because the calculation of Fisher information requires derivatives with respect to the encoded variables. In this paper, we consider information-theoretic bounds and approximations of the mutual information based on Kullback-Leibler divergence and Rényi divergence. We propose several information metrics for approximating Shannon mutual information in the context of neural population coding. While all of our asymptotic formulas work for discrete variables, one of them is consistently accurate regardless of whether the encoded variables are discrete or continuous. Numerical simulations confirmed that our formulas give highly accurate approximations to the mutual information between the stimuli and the responses of a large neural population. These approximation formulas may make it more convenient to apply information theory to many practical and theoretical problems.
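For orientation, the Fisher-information-based asymptotic approximation mentioned in the abstract is usually written in the standard Clarke-Barron/Brunel-Nadal form for a K-dimensional continuous stimulus x with Fisher information matrix G(x); the notation below is ours and is given as background rather than taken from the paper's text:

I(X; R) \approx H(X) + \frac{1}{2}\,\mathbb{E}_{x}\!\left[\log\det\!\left(\frac{G(x)}{2\pi e}\right)\right],

where H(X) is the stimulus entropy. Because G(x) is built from derivatives of log p(r|x) with respect to x, this formula is unavailable when x is discrete, which is the gap the paper's divergence-based approximations address.

As a concrete baseline for the quantity being approximated, the sketch below estimates I(X; R) directly by Monte Carlo for a discrete stimulus and a hypothetical population of independent Poisson neurons. The tuning-curve model and every parameter value are illustrative assumptions, not the authors' simulation setup; the point is only to show the exact computation whose cost the paper's approximation formulas aim to avoid.

# Monte Carlo estimate of the Shannon mutual information I(X; R) between a
# discrete stimulus X and the Poisson spike counts R of a model population.
# Generic illustration of the quantity the paper approximates, not the
# authors' own formulas; all tuning-curve parameters are hypothetical.
import numpy as np
from scipy.stats import poisson

rng = np.random.default_rng(0)

n_stimuli = 8                                   # discrete stimulus values
n_neurons = 50                                  # population size
stimuli = np.linspace(0.0, np.pi, n_stimuli, endpoint=False)
p_x = np.full(n_stimuli, 1.0 / n_stimuli)       # uniform stimulus prior

# Hypothetical Gaussian-bump tuning curves: rates[i, j] is the mean spike
# count of neuron j for stimulus i (baseline 1, peak about 21 spikes).
centers = np.linspace(0.0, np.pi, n_neurons, endpoint=False)
d = np.abs(stimuli[:, None] - centers[None, :])
d = np.minimum(d, np.pi - d)                    # circular distance
rates = 1.0 + 20.0 * np.exp(-d**2 / (2 * 0.3**2))

def log_p_r_given_x(r):
    # log p(r | x) for every stimulus, assuming independent Poisson neurons
    return poisson.logpmf(r[None, :], rates).sum(axis=1)

n_samples = 20000
mi_nats = 0.0
for _ in range(n_samples):
    i = rng.choice(n_stimuli, p=p_x)            # draw a stimulus
    r = rng.poisson(rates[i])                   # draw a population response
    log_lik = log_p_r_given_x(r)                # log p(r | x') for all x'
    log_marg = np.logaddexp.reduce(log_lik + np.log(p_x))  # log p(r)
    mi_nats += (log_lik[i] - log_marg) / n_samples

print(f"Monte Carlo estimate of I(X; R): {mi_nats / np.log(2):.3f} bits")

Because the marginal p(r) requires a sum over all stimulus values for every sampled response, this direct estimate becomes expensive as the stimulus set and the population grow, which is where closed-form approximations such as those proposed in the paper become useful.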

Original language: English (US)
Article number: 243
Journal: Entropy
Volume: 21
Issue number: 3
DOIs: 10.3390/e21030243
State: Published - Mar 1 2019

Fingerprint

  • coding
  • approximation
  • Fisher information
  • divergence
  • information theory
  • stimuli
  • simulation

Keywords

  • Approximation
  • Chernoff divergence
  • Discrete variables
  • Kullback-Leibler divergence
  • Mutual information
  • Neural population coding
  • Rényi divergence

ASJC Scopus subject areas

  • Physics and Astronomy (all)

Cite this

Approximations of Shannon mutual information for discrete variables with applications to neural population coding. / Huang, Wentao; Zhang, Kechen.

In: Entropy, Vol. 21, No. 3, 243, 01.03.2019.

Research output: Contribution to journal › Article

@article{982b29019ca347b5a97b9e04a9c42102,
title = "Approximations of Shannon mutual information for discrete variables with applications to neural population coding",
abstract = "Although Shannon mutual information has been widely used, its effective calculation is often difficult for many practical problems, including those in neural population coding. Asymptotic formulas based on Fisher information sometimes provide accurate approximations to the mutual information but this approach is restricted to continuous variables because the calculation of Fisher information requires derivatives with respect to the encoded variables. In this paper, we consider information-theoretic bounds and approximations of the mutual information based on Kullback-Leibler divergence and R{\'e}nyi divergence. We propose several information metrics to approximate Shannon mutual information in the context of neural population coding. While our asymptotic formulas all work for discrete variables, one of them has consistent performance and high accuracy regardless of whether the encoded variables are discrete or continuous. We performed numerical simulations and confirmed that our approximation formulas were highly accurate for approximating the mutual information between the stimuli and the responses of a large neural population. These approximation formulas may potentially bring convenience to the applications of information theory to many practical and theoretical problems.",
keywords = "Approximation, Chernoff divergence, Discrete variables, Kullback-Leibler divergence, Mutual information, Neural population coding, R{\'e}nyi divergence",
author = "Wentao Huang and Kechen Zhang",
year = "2019",
month = "3",
day = "1",
doi = "10.3390/e21030243",
language = "English (US)",
volume = "21",
journal = "Entropy",
issn = "1099-4300",
publisher = "Multidisciplinary Digital Publishing Institute (MDPI)",
number = "3",

}

TY - JOUR

T1 - Approximations of Shannon mutual information for discrete variables with applications to neural population coding

AU - Huang, Wentao

AU - Zhang, Kechen

PY - 2019/3/1

Y1 - 2019/3/1

N2 - Although Shannon mutual information has been widely used, its effective calculation is often difficult for many practical problems, including those in neural population coding. Asymptotic formulas based on Fisher information sometimes provide accurate approximations to the mutual information but this approach is restricted to continuous variables because the calculation of Fisher information requires derivatives with respect to the encoded variables. In this paper, we consider information-theoretic bounds and approximations of the mutual information based on Kullback-Leibler divergence and Rényi divergence. We propose several information metrics to approximate Shannon mutual information in the context of neural population coding. While our asymptotic formulas all work for discrete variables, one of them has consistent performance and high accuracy regardless of whether the encoded variables are discrete or continuous. We performed numerical simulations and confirmed that our approximation formulas were highly accurate for approximating the mutual information between the stimuli and the responses of a large neural population. These approximation formulas may potentially bring convenience to the applications of information theory to many practical and theoretical problems.

KW - Approximation

KW - Chernoff divergence

KW - Discrete variables

KW - Kullback-Leibler divergence

KW - Mutual information

KW - Neural population coding

KW - Rényi divergence

UR - http://www.scopus.com/inward/record.url?scp=85063571455&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=85063571455&partnerID=8YFLogxK

U2 - 10.3390/e21030243

DO - 10.3390/e21030243

M3 - Article

AN - SCOPUS:85063571455

VL - 21

JO - Entropy

JF - Entropy

SN - 1099-4300

IS - 3

M1 - 243

ER -