Preprint Article · Version 3 · Preserved in Portico · This version is not peer-reviewed

A Partial Information Decomposition Based on Causal Tensors

Version 1 : Received: 4 February 2020 / Approved: 5 February 2020 / Online: 5 February 2020 (12:43:30 CET)
Version 2 : Received: 7 February 2020 / Approved: 10 February 2020 / Online: 10 February 2020 (09:11:51 CET)
Version 3 : Received: 26 February 2020 / Approved: 27 February 2020 / Online: 27 February 2020 (10:55:05 CET)

How to cite: Sigtermans, D. A Partial Information Decomposition Based on Causal Tensors. Preprints 2020, 2020020066. https://doi.org/10.20944/preprints202002.0066.v3

Abstract

We propose a partial information decomposition based on the newly introduced framework of causal tensors, i.e., multilinear stochastic maps that transform source data into destination data. The innovation of causal tensors is that the framework allows an indirect association to be expressed exactly in terms of its constituent direct associations. This is not possible when associations are expressed only in measures such as mutual information or transfer entropy. Expressing associations in these terms a posteriori, rather than a priori, results in an intuitive definition of a nonnegative and left monotonic redundancy that also satisfies the identity property. Our proposed redundancy satisfies the three axioms introduced by Williams and Beer. The symmetry and self-redundancy axioms follow directly from the definition, and the data processing inequality ensures that the monotonicity axiom is satisfied. Because causal tensors can describe both mutual information and transfer entropy, the partial information decomposition applies to both measures. Results show that the decomposition closely resembles that of another approach which also expresses associations in terms of mutual information a posteriori. A negative synergistic term could indicate the presence of an unobserved common cause.
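For context, the following is a sketch of the standard Williams–Beer two-source decomposition that the abstract builds on; it shows only the generic structure, not the paper's causal-tensor redundancy measure. Here Y denotes the target (destination) and X_1, X_2 the sources:

I(Y; X_1, X_2) = R(Y; X_1, X_2) + U(Y; X_1 \setminus X_2) + U(Y; X_2 \setminus X_1) + S(Y; X_1, X_2)
I(Y; X_1) = R(Y; X_1, X_2) + U(Y; X_1 \setminus X_2)
I(Y; X_2) = R(Y; X_1, X_2) + U(Y; X_2 \setminus X_1)

where R is the redundant, U the unique, and S the synergistic information. The Williams–Beer axioms require R to be symmetric in the sources, to reduce to I(Y; X_i) for a single source (self-redundancy), and to be non-increasing as sources are added (monotonicity). The identity property mentioned in the abstract additionally requires R((X_1, X_2); X_1, X_2) = I(X_1; X_2).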

Keywords

information theory; causal inference; causal tensors; transfer entropy; partial information decomposition; left monotonicity; identity property; unobserved common cause

Subject

Physical Sciences, Thermodynamics

Comments (1)

Comment 1
Received: 27 February 2020
Commenter: David Sigtermans
Commenter's Conflict of Interests: Author
Comment: Some minor mistakes were corrected. Additionally, the previous version still contained an incorrect proof that was supposed to have been replaced with a correct one; the wrong version had been uploaded.