Information-theoretic bounds and approximations in neural population coding

Wentao Huang, Kechen Zhang

Research output: Contribution to journal › Article › peer-review

Abstract

While Shannon's mutual information has widespread applications in many disciplines, its value is often difficult to calculate accurately for high-dimensional variables because of the curse of dimensionality. This article focuses on effective approximation methods for evaluating mutual information in the context of neural population coding. For large but finite neural populations, we derive several information-theoretic asymptotic bounds and approximation formulas that remain valid in high-dimensional spaces. We prove that optimizing the population density distribution based on these approximation formulas is a convex optimization problem that admits efficient numerical solutions. Numerical simulations confirm that our asymptotic formulas are highly accurate approximations of mutual information for large neural populations; in special cases, the approximation formulas equal the true mutual information exactly. We also discuss techniques of variable transformation and dimensionality reduction that facilitate computation of the approximations.
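To give the flavor of such asymptotic approximations, below is a minimal sketch of the classic Fisher-information-based formula I(θ; r) ≈ H(θ) + (1/2) E[log(J(θ)/(2πe))], a formula of the general type the article analyzes and refines. The population model here (independent Poisson neurons with Gaussian tuning curves tiling a one-dimensional stimulus) and all parameter values are illustrative assumptions, not taken from the paper.

```python
import numpy as np

# A minimal sketch (not the paper's exact formulas): the classic
# Fisher-information asymptotic approximation of mutual information,
#   I(theta; r) ~ H(theta) + 0.5 * E[ log( J(theta) / (2*pi*e) ) ],
# evaluated for an assumed population of independent Poisson neurons
# with Gaussian tuning curves over a one-dimensional stimulus.

N, width, gain = 100, 0.05, 10.0      # assumed population size and tuning parameters
centers = np.linspace(0.0, 1.0, N)    # tuning-curve centers tile the stimulus range

theta = np.linspace(0.0, 1.0, 1000)   # stimulus grid; uniform prior on [0, 1]
d = theta[:, None] - centers[None, :]
f = gain * np.exp(-0.5 * (d / width) ** 2)   # mean firing rates f_i(theta)

# Poisson Fisher information J(theta) = sum_i f_i'(theta)^2 / f_i(theta);
# with Gaussian tuning, f_i'^2 / f_i = (d / width^2)^2 * f_i.
J = np.sum((d / width**2) ** 2 * f, axis=1)

# H(theta) = 0 nats for a uniform prior on the unit interval.
I_approx = 0.5 * np.mean(np.log(J / (2 * np.pi * np.e)))
print(f"Asymptotic MI approximation: {I_approx:.3f} nats")
```

For a uniform prior on the unit interval, H(θ) = 0, so the estimate reduces to the average log-Fisher-information term. Approximations of this kind become accurate as the population grows; the article's contribution is a family of sharper bounds and formulas of this general type, with proofs of when they become exact.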

Original language: English (US)
Pages (from-to): 885-944
Number of pages: 60
Journal: Neural Computation
Volume: 30
Issue number: 4
DOIs
State: Published - Apr 1 2018

ASJC Scopus subject areas

  • Arts and Humanities (miscellaneous)
  • Cognitive Neuroscience
