Preprint Article · Version 2 · Preserved in Portico · This version is not peer-reviewed

Approximations of Shannon Mutual Information for Discrete Variables with Applications to Neural Population Coding

Version 1 : Received: 15 December 2018 / Approved: 18 December 2018 / Online: 18 December 2018 (03:47:32 CET)
Version 2 : Received: 6 March 2019 / Approved: 7 March 2019 / Online: 7 March 2019 (07:36:24 CET)

A peer-reviewed article of this Preprint also exists.

Huang, W.; Zhang, K. Approximations of Shannon Mutual Information for Discrete Variables with Applications to Neural Population Coding. Entropy 2019, 21, 243.

Abstract

Although Shannon mutual information has been widely used, its effective calculation is often difficult for many practical problems, including those in neural population coding. Asymptotic formulas based on Fisher information sometimes provide accurate approximations to the mutual information, but this approach is restricted to continuous variables because the calculation of Fisher information requires derivatives with respect to the encoded variables. In this paper, we consider information-theoretic bounds and approximations of the mutual information based on Kullback-Leibler divergence and Rényi divergence. We propose several information metrics to approximate Shannon mutual information in the context of neural population coding. While our asymptotic formulas all work for discrete variables, one of them has consistent performance and high accuracy regardless of whether the encoded variables are discrete or continuous. We performed numerical simulations and confirmed that our approximation formulas were highly accurate for approximating the mutual information between the stimuli and the responses of a large neural population. These approximation formulas may bring convenience to the application of information theory to many practical and theoretical problems.
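The abstract contrasts the exact Shannon mutual information with asymptotic approximations. The minimal sketch below (Python with NumPy/SciPy) only illustrates the quantity being approximated: a Monte Carlo estimate of I(X;R) between a discrete stimulus X and the spike-count responses R of a hypothetical population of independent Poisson neurons. The tuning curves, population size, and sample count are illustrative assumptions and are not taken from the paper, which instead develops divergence-based approximation formulas.

```python
# Minimal sketch (not the authors' method): brute-force Monte Carlo estimate of
# the Shannon mutual information I(X;R) between a discrete stimulus X and the
# spike-count responses R of a hypothetical population of independent Poisson
# neurons. All tuning parameters below are illustrative assumptions.
import numpy as np
from scipy.special import gammaln

rng = np.random.default_rng(0)

# Assumed setup: 8 discrete stimulus values, 10 Poisson neurons with
# cosine-shaped tuning curves giving the mean spike count per trial.
n_stim, n_neurons = 8, 10
prior = np.full(n_stim, 1.0 / n_stim)                       # uniform p(x)
theta = 2.0 * np.pi * np.arange(n_stim) / n_stim
pref = np.linspace(0.0, 2.0 * np.pi, n_neurons, endpoint=False)
rates = 5.0 + 4.0 * np.cos(theta[:, None] - pref[None, :])  # (n_stim, n_neurons)

def log_likelihood(r):
    """log p(r | x) for every stimulus x, assuming independent Poisson neurons."""
    return np.sum(r * np.log(rates) - rates - gammaln(r + 1.0), axis=-1)

def mutual_information_mc(n_samples=20_000):
    """Monte Carlo estimate of I(X;R) = E[log p(r|x) - log p(r)], in bits."""
    total = 0.0
    for _ in range(n_samples):
        x = rng.choice(n_stim, p=prior)          # draw a stimulus
        r = rng.poisson(rates[x])                # draw a population response
        ll = log_likelihood(r)                   # log p(r|x') for all x'
        # log p(r) via log-sum-exp over the stimulus prior
        log_pr = np.log(np.sum(prior * np.exp(ll - ll.max()))) + ll.max()
        total += (ll[x] - log_pr) / np.log(2.0)  # convert nats to bits
    return total / n_samples

print(f"Estimated I(X;R) ~ {mutual_information_mc():.3f} bits; "
      f"upper bound H(X) = {np.log2(n_stim):.3f} bits")
```

For large populations this kind of direct estimate becomes computationally expensive, which is the regime where closed-form approximation formulas, such as the divergence-based ones proposed in the paper, are most useful.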

Keywords

neural population coding; mutual information; Kullback-Leibler divergence; Rényi divergence; Chernoff divergence; approximation; discrete variables

Subject

Computer Science and Mathematics, Discrete Mathematics and Combinatorics
