Preprint
Article

This version is not peer-reviewed.

Towards a Framework for Observational Causality from Time Series: When Shannon Meets Turing

A peer-reviewed version of this preprint also exists.

Submitted: 09 January 2020

Posted: 11 January 2020


Abstract
We propose a novel tensor-based formalism for inferring causal structures from time series. An information-theoretic analysis of transfer entropy (TE) shows that TE results from the transmission of information over a set of communication channels. Tensors are the mathematical equivalents of these multi-channel causal channels, which generalize the discrete memoryless channel (DMC); we regard a DMC as a single-channel causal channel. Investigation of a system comprising three variables shows that, in our formalism, bivariate analysis suffices to differentiate between direct and indirect relations, provided that the output of multi-channel causal channels is combined with the output of single-channel causal channels. This result can be understood by considering the role of noise: successive transmission of information over noisy channels can never result in less noisy transmission overall. This implies that a Data Processing Inequality (DPI) exists for transfer entropy.
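The DPI-style behaviour described above can be illustrated numerically. The sketch below is not taken from the preprint: the plug-in estimator, the binary-symmetric-channel cascade X → Y → Z, and the noise levels are our own illustrative assumptions. It estimates transfer entropy for each link of the cascade; under these assumptions, the indirect link X → Z should carry no more information than either direct link.

```python
import numpy as np

def transfer_entropy(source, target, lag=1):
    """Plug-in estimate of TE(source -> target) in bits for binary series,
    with target history length 1 and a single source lag (illustration only)."""
    x = np.asarray(source, dtype=int)
    y = np.asarray(target, dtype=int)
    yf = y[lag:]         # target "future"  Y_t
    yp = y[lag - 1:-1]   # target past      Y_{t-1}
    xp = x[:-lag]        # lagged source    X_{t-lag}

    def H(*cols):
        # Shannon entropy (bits) of the empirical joint distribution of the columns.
        joint = np.stack(cols, axis=1)
        _, counts = np.unique(joint, axis=0, return_counts=True)
        p = counts / counts.sum()
        return -np.sum(p * np.log2(p))

    # TE = I(Y_t ; X_{t-lag} | Y_{t-1})
    #    = H(Y_t, Y_{t-1}) + H(Y_{t-1}, X_{t-lag}) - H(Y_{t-1}) - H(Y_t, Y_{t-1}, X_{t-lag})
    return H(yf, yp) + H(yp, xp) - H(yp) - H(yf, yp, xp)

rng = np.random.default_rng(0)
n = 200_000
x = rng.integers(0, 2, n)

# Cascade X -> Y -> Z: each link is a binary symmetric channel with 10% flip noise
# and a one-step delay, so the indirect X -> Z influence appears at lag 2.
y = np.empty(n, dtype=int)
z = np.empty(n, dtype=int)
y[0], z[0] = rng.integers(0, 2, 2)
y[1:] = x[:-1] ^ (rng.random(n - 1) < 0.1).astype(int)
z[1:] = y[:-1] ^ (rng.random(n - 1) < 0.1).astype(int)

print("TE(X -> Y):", transfer_entropy(x, y, lag=1))  # roughly 1 - H_b(0.10) ~ 0.53 bits
print("TE(Y -> Z):", transfer_entropy(y, z, lag=1))  # roughly 1 - H_b(0.10) ~ 0.53 bits
print("TE(X -> Z):", transfer_entropy(x, z, lag=2))  # roughly 1 - H_b(0.18) ~ 0.32 bits
```

In this toy setup the indirect estimate TE(X → Z) stays below both direct estimates, consistent with the DPI-style bound discussed in the abstract; the exact numbers depend entirely on the assumed noise levels and the simple estimator used here.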
Copyright: This open access article is published under a Creative Commons CC BY 4.0 license, which permits free download, distribution, and reuse, provided that the author and preprint are cited in any reuse.