Submitted: 24 August 2023
Posted: 24 August 2023
Abstract
Keywords:
1. Introduction
1.1. Space-time random field
1.2. Kramers-Moyal expansion and Fokker-Planck equation
1.3. Differential entropy and De Bruijn identity
1.4. Entropy divergence
2. Notations, Definitions and Propositions
2.1. Notations and Assumptions
- Given a probability space $(\Omega, \mathcal{F}, \mathbb{P})$, two real-valued space-time random fields with probability distributions $P$ and $Q$ are denoted as $X(t,\mathbf{x})$ and $Y(t,\mathbf{x})$, where $t \geq 0$ and $\mathbf{x} = (x_1, \dots, x_d) \in \mathbb{R}^d$, $d \geq 1$, are the space-time variables.
- The probability density functions of $P$ and $Q$ are denoted as $p$ and $q$, respectively. With $u \in \mathbb{R}$, $p(u,t,\mathbf{x})$ is the density value at $u$ of $X(t,\mathbf{x})$ and $q(u,t,\mathbf{x})$ is the density value at $u$ of $Y(t,\mathbf{x})$.
- Suppose that the density functions are sufficiently smooth, i.e., $p(u,t,\mathbf{x})$ and $q(u,t,\mathbf{x})$ are twice differentiable with respect to $u$ and once differentiable with respect to $t$ or $x_k$, respectively (see the sketch after this list).
- In this paper, $d$-dimensional real vectors that differ only in the $k$-th coordinate are written as $\mathbf{x}$ and $\tilde{\mathbf{x}}^{(k)}$, where their $k$-th coordinates are $x_k$ and $\tilde{x}_k$, $k = 1, \dots, d$.
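For context on why these smoothness assumptions are imposed, the display below recalls the standard one-dimensional Fokker-Planck (forward Kolmogorov) equation obtained by truncating the Kramers-Moyal expansion of the density after the second term (cf. Risken; Pawula). It is a minimal sketch only: the generic drift and diffusion coefficients $D^{(1)}$ and $D^{(2)}$ are illustrative placeholders, not this paper's notation.

```latex
% Minimal sketch: Kramers--Moyal expansion of p(u,t,x) in the state variable u,
% truncated after the second term (cf. Pawula's theorem), giving the
% one-dimensional Fokker--Planck equation. D^{(1)} (drift) and D^{(2)}
% (diffusion) are generic placeholder coefficients.
\begin{align*}
  \frac{\partial p(u,t,\mathbf{x})}{\partial t}
    &= \sum_{n \ge 1} \left(-\frac{\partial}{\partial u}\right)^{n}
       \left[ D^{(n)}(u)\, p(u,t,\mathbf{x}) \right] \\
    &\approx -\frac{\partial}{\partial u}\left[ D^{(1)}(u)\, p(u,t,\mathbf{x}) \right]
       + \frac{\partial^{2}}{\partial u^{2}}\left[ D^{(2)}(u)\, p(u,t,\mathbf{x}) \right].
\end{align*}
% The right-hand side uses first- and second-order u-derivatives of p together
% with a first-order t-derivative, matching the differentiability assumed above;
% an analogous first-order evolution in each spatial coordinate x_k uses the
% once-differentiability in x_k.
```

Section 4 below considers three particular Fokker-Planck random fields of this type and the information measures (KL, FI, FD, JD) built from $p$, $q$ and their derivatives under the same assumptions.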
2.2. Definitions
3. Main Results and Proofs
4. Three Fokker-Planck Random Fields and Their Corresponding Information Measures
4.1. A Trivial Equation
4.2. A Nontrivial Equation
4.3. An Interesting Equation
5. Conclusions
Acknowledgments
Conflicts of Interest
Abbreviations
| Abbreviation | Description |
| --- | --- |
| KL | Kullback-Leibler divergence |
| FI | Fisher information |
| CFI | Cross Fisher information |
| FD | Fisher divergence |
| sFD | symmetric Fisher divergence |
| JD | Jeffreys divergence |
References
- Risken, H. The Fokker-Planck Equation: Methods of Solution and Applications; Springer-Verlag: Berlin/Heidelberg, 1984.
- Stam, A.J. Some inequalities satisfied by the quantities of information of Fisher and Shannon. Inf. Control 1959, 2, 101–112.
- Barron, A.R. Entropy and the central limit theorem. Ann. Probab. 1986, 14, 336–342.
- Johnson, O. Information Theory and the Central Limit Theorem; Imperial College Press: London, U.K., 2004.
- Guo, D. Relative entropy and score function: New information estimation relationships through arbitrary additive perturbation. In Proceedings of the IEEE International Symposium on Information Theory (ISIT), Seoul, South Korea, June/July 2009; pp. 814–818.
- Toranzo, I.V.; Zozor, S.; Brossier, J.-M. Generalization of the De Bruijn identity to general ϕ-entropies and ϕ-Fisher informations. IEEE Trans. Inform. Theory 2018, 64, 6743–6758.
- Kharazmi, O.; Balakrishnan, N. Cumulative residual and relative cumulative residual Fisher information and their properties. IEEE Trans. Inform. Theory 2021, 67, 6306–6312.
- Kolmogorov, A.N. The local structure of turbulence in incompressible viscous fluid for very large Reynolds numbers. Dokl. Akad. Nauk SSSR 1941, 30, 299–303.
- Kolmogorov, A.N. On the degeneration of isotropic turbulence in an incompressible viscous fluid. Dokl. Akad. Nauk SSSR 1941, 31, 538–542.
- Kolmogorov, A.N. Dissipation of energy in isotropic turbulence. Dokl. Akad. Nauk SSSR 1941, 32, 19–21.
- Yaglom, A.M. Some classes of random fields in n-dimensional space, related to stationary random processes. Theory Probab. Appl. 1957, 2, 273–320.
- Yaglom, A.M. Correlation Theory of Stationary and Related Random Functions. Volume I: Basic Results; Springer-Verlag: New York, 1987.
- Yaglom, A.M. Correlation Theory of Stationary and Related Random Functions. Volume II: Supplementary Notes and References; Springer-Verlag: Berlin, 1987.
- Bowditch, A.; Sun, R. The two-dimensional continuum random field Ising model. Ann. Probab. 2022, 50, 419–454.
- Bailleul, I.; Catellier, R.; Delarue, F. Propagation of chaos for mean field rough differential equations. Ann. Probab. 2021, 49, 944–996.
- Wu, L.; Samorodnitsky, G. Regularly varying random fields. Stoch. Process. Appl. 2020, 130, 4470–4492.
- Koch, E.; Dombry, C.; Robert, C.Y. A central limit theorem for functions of stationary max-stable random fields on R^d. Stoch. Process. Appl. 2019, 129, 3406–3430.
- Ye, Z. On entropy and ε-entropy of random fields. Ph.D. dissertation, Cornell University, 1989.
- Ye, Z.; Berger, T. A new method to estimate the critical distortion of random fields. IEEE Trans. Inform. Theory 1992, 38, 152–157.
- Ye, Z.; Berger, T. Information Measures for Discrete Random Fields; Science Press: Beijing/New York, 1998.
- Ye, Z.; Yang, W. Random Field: Network Information Theory and Game Theory; Science Press: Beijing, 2023. (In Chinese)
- Ma, C. Stationary random fields in space and time with rational spectral densities. IEEE Trans. Inform. Theory 2007, 53, 1019–1029.
- Hairer, M. A theory of regularity structures. Invent. Math. 2014, 198, 269–504.
- Hairer, M. Solving the KPZ equation. Ann. Math. 2013, 178, 559–664.
- Kremp, H.; Perkowski, N. Multidimensional SDE with distributional drift and Lévy noise. Bernoulli 2022, 28, 1757–1783.
- Beeson, R.; Namachchivaya, N.S.; Perkowski, N. Approximation of the filter equation for multiple timescale, correlated, nonlinear systems. SIAM J. Math. Anal. 2022, 54, 3054–3090.
- Song, Z.; Zhang, J. A note for estimation about average differential entropy of continuous bounded space-time random field. Chinese J. Electron. 2022, 31, 793–803.
- Kramers, H.A. Brownian motion in a field of force and the diffusion model of chemical reactions. Physica 1940, 7, 284–304.
- Moyal, J.E. Stochastic processes and statistical physics. J. R. Stat. Soc. Ser. B 1949, 11, 150–210.
- Shannon, C.E. A mathematical theory of communication. Bell Syst. Tech. J. 1948, 27, 379–423, 623–656.
- Neeser, F.D.; Massey, J.L. Proper complex random processes with applications to information theory. IEEE Trans. Inform. Theory 1993, 39, 1293–1302.
- Ihara, S. Information Theory for Continuous Systems; World Scientific: Singapore, 1993.
- Gray, R.M. Entropy and Information Theory; Springer: Boston, 2011.
- Bach, F. Information theory with kernel methods. IEEE Trans. Inform. Theory 2023, 69, 752–775.
- Kullback, S.; Leibler, R.A. On information and sufficiency. Ann. Math. Stat. 1951, 22, 79–86.
- Jeffreys, H. An invariant form for the prior probability in estimation problems. Proc. R. Soc. Lond. A 1946, 186, 453–461.
- Fuglede, B.; Topsøe, F. Jensen-Shannon divergence and Hilbert space embedding. In Proceedings of the IEEE International Symposium on Information Theory (ISIT), Chicago, IL, USA, 27 June–2 July 2004; p. 31.
- Rényi, A. On measures of entropy and information. In Proceedings of the Fourth Berkeley Symposium on Mathematical Statistics and Probability, 1961; Volume 1, pp. 547–561.
- She, R.; Fan, P.; Liu, X.-Y.; Wang, X. Interpretable generative adversarial networks with exponential function. IEEE Trans. Signal Process. 2021, 69, 3854–3867.
- Liu, S.; She, R.; Zhu, Z.; Fan, P. Storage space allocation strategy for digital data with message importance. Entropy 2020, 22, 591.
- She, R.; Liu, S.; Fan, P. Attention to the variation of probabilistic events: Information processing with message importance measure. Entropy 2019, 21, 439.
- Wan, S.; Lu, J.; Fan, P.; Letaief, K.B. Information theory in formation control: An error analysis to multi-robot formation. Entropy 2018, 20, 618.
- She, R.; Liu, S.; Fan, P. Recognizing information feature variation: Message importance transfer measure and its applications in big data. Entropy 2018, 20, 401.
- Nielsen, F. An elementary introduction to information geometry. Entropy 2020, 22, 1100.
- Nielsen, F. On the Jensen-Shannon symmetrization of distances relying on abstract means. Entropy 2019, 21, 485.
- Nielsen, F.; Nock, R. Generalizing skew Jensen divergences and Bregman divergences with comparative convexity. IEEE Signal Process. Lett. 2017, 24, 1123–1127.
- Furuichi, S.; Minculete, N. Refined Young inequality and its application to divergences. Entropy 2021, 23, 514.
- Pinele, J.; Strapasson, J.E.; Costa, S.I. The Fisher-Rao distance between multivariate normal distributions: Special cases, bounds and applications. Entropy 2020, 22, 404.
- Reverter, F.; Oller, J.M. Computing the Rao distance for Gamma distributions. J. Comput. Appl. Math. 2003, 157, 155–167.
- Pawula, R.F. Generalizations and extensions of the Fokker-Planck-Kolmogorov equations. IEEE Trans. Inform. Theory 1967, 13, 33–41.
- Pawula, R.F. Approximation of the linear Boltzmann equation by the Fokker-Planck equation. Phys. Rev. 1967, 162, 186–188.
- Khoshnevisan, D.; Shi, Z. Brownian sheet and capacity. Ann. Probab. 1999, 27, 1135–1159.
- Revuz, D.; Yor, M. Continuous Martingales and Brownian Motion, 2nd ed.; Springer-Verlag: New York, 1999.
