Article · Version 2
Preserved in Portico. This version is not peer-reviewed.

# On the Use of Entropy as a Measure of Dependence of Two Events

Version 1 : Received: 29 May 2021 / Approved: 3 June 2021 / Online: 3 June 2021 (11:21:29 CEST)

Version 2 : Received: 3 June 2021 / Approved: 3 June 2021 / Online: 3 June 2021 (13:09:25 CEST)

Version 3 : Received: 3 July 2021 / Approved: 5 July 2021 / Online: 5 July 2021 (16:22:30 CEST)

How to cite:
Iliev, V. On the Use of Entropy as a Measure of Dependence of Two Events. *Preprints* **2021**, 2021060100 (doi: 10.20944/preprints202106.0100.v2).

## Abstract

We define the degree of dependence of two events A and B in a probability space by using the Boltzmann-Shannon entropy function of an appropriate probability distribution produced by these events, which depends on one parameter (the probability of the intersection of A and B) varying within a closed interval I. The entropy function attains its global maximum when the events A and B are independent. The important particular case of a discrete uniform probability space motivates this definition in the following way. The entropy function has a minimum at the left endpoint of I exactly when one of the events and the complement of the other are connected by the relation of inclusion (maximal negative dependence), and it has a minimum at the right endpoint of I exactly when one of these events is included in the other (maximal positive dependence). Moreover, the deviation of the entropy from its maximum equals the average information that one of the binary trials defined by A and B carries with respect to the other. As a consequence, the degree of dependence of A and B can be expressed in terms of information theory and is invariant with respect to the choice of the unit of information. Using this formalism, we give a complete description of screening tests and their reliability, measure the efficacy of a vaccination, gauge the impact of certain financial-market events on other events, etc.
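The construction described above can be sketched numerically. The following is a minimal illustration, assuming the four-point distribution (θ, α−θ, β−θ, 1−α−β+θ) generated by A and B, where α = P(A), β = P(B), and θ = P(A∩B) ranges over I = [max(0, α+β−1), min(α, β)]; the function names here are hypothetical, not from the paper:

```python
import math

def entropy(ps):
    """Shannon entropy in nats, with the convention 0*log(0) = 0."""
    return -sum(p * math.log(p) for p in ps if p > 0)

def joint_entropy(alpha, beta, theta):
    """Entropy of the 4-point distribution (theta, alpha-theta,
    beta-theta, 1-alpha-beta+theta) generated by events A and B."""
    ps = [theta, alpha - theta, beta - theta, 1 - alpha - beta + theta]
    assert all(p >= -1e-12 for p in ps), "theta outside the interval I"
    return entropy(ps)

alpha, beta = 0.4, 0.5
# endpoints of the closed interval I of feasible values of P(A∩B)
lo, hi = max(0.0, alpha + beta - 1), min(alpha, beta)
# scan I on a grid and locate the entropy maximiser
thetas = [lo + (hi - lo) * k / 1000 for k in range(1001)]
best = max(thetas, key=lambda t: joint_entropy(alpha, beta, t))
print(best, alpha * beta)  # maximiser lies near alpha*beta (independence)
```

The grid search recovers the abstract's claim: the entropy is maximal when θ = P(A)·P(B), i.e. when A and B are independent, and it decreases toward both endpoints of I, which correspond to maximal negative and maximal positive dependence.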

## Keywords

entropy; average information; degree of dependence; probability space; probability distribution; experiment in a sample space; linear system; affine isomorphism; classification space.

## Subject

MATHEMATICS & COMPUTER SCIENCE, Algebra & Number Theory

Copyright: This is an open access article distributed under the Creative Commons Attribution License which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

## Comments (1)

Commenter: Valentin Iliev

Commenter's Conflict of Interests: Author

1) One typo in Row 9: `H'_{\alpha,\beta}(\theta)=E_{\alpha,\beta}(\theta)=` must be `E'_{\alpha,\beta}(\theta)=`

2) Two typos in Row 10: `H'_{\alpha,\beta}(\theta)=` must be `E'_{\alpha,\beta}(\theta)=`

3) More keywords have been added.