Version 1: Received: 25 May 2023 / Approved: 26 May 2023 / Online: 26 May 2023 (10:00:47 CEST)
How to cite:
Jwo, D.; Cho, T.S.; Biswal, A. A Geometric Interpretation of the Multivariate Gaussian Distribution and its Entropy and Mutual Information. Preprints 2023, 2023051909. https://doi.org/10.20944/preprints202305.1909.v1
APA Style
Jwo, D., Cho, T.S., & Biswal, A. (2023). A Geometric Interpretation of the Multivariate Gaussian Distribution and its Entropy and Mutual Information. Preprints. https://doi.org/10.20944/preprints202305.1909.v1
Chicago/Turabian Style
Jwo, D., Ta-Shun Cho and Amita Biswal. 2023. "A Geometric Interpretation of the Multivariate Gaussian Distribution and its Entropy and Mutual Information." Preprints. https://doi.org/10.20944/preprints202305.1909.v1
Abstract
The fundamental objective of this work is to study the application of multivariate data sets under the Gaussian distribution. This paper examines broad measures of structure for both Gaussian and non-Gaussian distributions and shows that they can be described in terms of an information-theoretic divergence (relative entropy) between the given covariance matrix and the correlated random variables. To develop the multivariate Gaussian distribution together with its entropy and mutual information, several significant methodologies are presented and supported by illustrations, both technical and statistical. The material allows readers to better perceive the concepts, comprehend the techniques, and properly execute software programs for future study of the topic's science and implementations; it also helps readers grasp the themes' fundamental ideas. Involving relative entropy and mutual information, as well as correlated-covariance analysis based on differential equations, a wide range of material is addressed, from basic concepts to application concerns.
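To make the quantities named in the abstract concrete, the sketch below computes the differential entropy of a multivariate Gaussian, h(X) = ½ ln((2πe)^n det Σ), and the mutual information of two jointly Gaussian blocks via I(X;Y) = h(X) + h(Y) − h(X,Y). This is an illustrative example only, not code from the paper; the function names and the index-list interface are the author's own choices here.

```python
import numpy as np

def gaussian_entropy(cov):
    """Differential entropy (in nats) of a multivariate Gaussian N(mu, cov):
    h = 0.5 * ln((2*pi*e)^n * det(cov))."""
    cov = np.atleast_2d(np.asarray(cov, dtype=float))
    n = cov.shape[0]
    # slogdet is numerically safer than log(det(cov)) for large/small determinants
    _, logdet = np.linalg.slogdet(cov)
    return 0.5 * (n * np.log(2.0 * np.pi * np.e) + logdet)

def gaussian_mutual_information(cov, idx_x, idx_y):
    """Mutual information I(X;Y) = h(X) + h(Y) - h(X,Y) for the jointly
    Gaussian sub-vectors selected by the index lists idx_x and idx_y."""
    cov = np.atleast_2d(np.asarray(cov, dtype=float))
    cx = cov[np.ix_(idx_x, idx_x)]            # marginal covariance of X
    cy = cov[np.ix_(idx_y, idx_y)]            # marginal covariance of Y
    joint = cov[np.ix_(idx_x + idx_y, idx_x + idx_y)]  # joint covariance
    return gaussian_entropy(cx) + gaussian_entropy(cy) - gaussian_entropy(joint)

# Example: two unit-variance Gaussians with correlation rho.
# The closed form is I(X;Y) = -0.5 * ln(1 - rho**2).
rho = 0.8
cov = np.array([[1.0, rho],
                [rho, 1.0]])
mi = gaussian_mutual_information(cov, [0], [1])
```

For the bivariate case the result can be checked against the closed form −½ ln(1 − ρ²), which grows without bound as |ρ| → 1 and vanishes for uncorrelated (hence, for Gaussians, independent) variables.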
Keywords
multivariate Gaussians; correlated random variables; visualization; entropy; relative entropy; mutual information
Subject
Computer Science and Mathematics, Artificial Intelligence and Machine Learning
Copyright:
This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.