Version 1: Received: 15 December 2018 / Approved: 18 December 2018 / Online: 18 December 2018 (03:47:32 CET)
Version 2: Received: 6 March 2019 / Approved: 7 March 2019 / Online: 7 March 2019 (07:36:24 CET)
Huang, W.; Zhang, K. Approximations of Shannon Mutual Information for Discrete Variables with Applications to Neural Population Coding. Entropy 2019, 21, 243.
Abstract
Information theory is widely used across many disciplines, but the effective calculation of Shannon mutual information is often a difficult task in practical applications, including problems of neural population coding in computational and theoretical neuroscience. Asymptotic formulas based on Fisher information may provide accurate approximations to mutual information, but this approach is restricted to continuous variables because the calculation requires derivatives with respect to the encoded variables. In this paper, we consider information-theoretic bounds and approximations of Shannon mutual information based on Kullback-Leibler divergence and Rényi divergence. We propose several information metrics that approximate Shannon mutual information in the context of neural population coding; because these asymptotic formulas do not require differentiability, they also hold for discrete variables. In particular, one of our approximation formulas performs consistently and accurately regardless of whether the encoded variables are discrete or continuous. Numerical simulations confirmed that our approximation formulas are highly accurate for the mutual information between discrete stimuli and the responses of a large neural population. These approximation formulas may simplify the application of information theory to many practical and theoretical problems.
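As a concrete point of reference, the following minimal Python sketch (not from the paper; the joint probability table is a hypothetical toy example) computes the exact Shannon mutual information I(X; R) between a discrete stimulus X and a discrete response R by direct summation, I(X; R) = sum over x, r of p(x, r) * log2[ p(x, r) / (p(x) p(r)) ]. This is the quantity the approximation formulas target; for a large neural population the joint response space grows exponentially, which is why direct summation becomes infeasible and asymptotic approximations are useful.

    import numpy as np

    def mutual_information_bits(p_joint):
        """Exact Shannon mutual information I(X; R) in bits for a
        discrete joint distribution given as a table p_joint[x, r]."""
        p = np.asarray(p_joint, dtype=float)
        px = p.sum(axis=1, keepdims=True)  # marginal p(x), shape (|X|, 1)
        pr = p.sum(axis=0, keepdims=True)  # marginal p(r), shape (1, |R|)
        nz = p > 0                         # skip zero cells: 0*log(0) := 0
        return float(np.sum(p[nz] * np.log2(p[nz] / (px * pr)[nz])))

    # Hypothetical toy joint distribution: two stimuli, three response levels.
    p_joint = np.array([[0.30, 0.15, 0.05],
                        [0.05, 0.15, 0.30]])
    print(f"I(X; R) = {mutual_information_bits(p_joint):.4f} bits")

Direct summation like this scales as the size of the full joint table, so it serves only as a ground-truth check on small problems; the paper's asymptotic formulas address the large-population regime where such enumeration is impossible.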
Subject: Computer Science and Mathematics, Discrete Mathematics and Combinatorics
Copyright:
This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.