We propose a novel tensor-based formalism for inferring causal structures from time series. An information-theoretic analysis of transfer entropy (TE) shows that TE results from the transmission of information over a set of communication channels. Tensors are the mathematical representation of these multi-channel causal channels, which generalize the discrete memoryless channel (DMC); we treat a DMC as a single-channel causal channel. Investigation of a system comprising three variables shows that, in our formalism, bivariate analysis suffices to distinguish direct from indirect relations, provided that the output of multi-channel causal channels is combined with that of single-channel causal channels. This result can be understood by considering the role of noise: successive transmission of information over noisy channels can never yield a less noisy transmission overall, which implies that a Data Processing Inequality (DPI) holds for transfer entropy.
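To make the quantity under discussion concrete, the following is a minimal sketch of a plug-in transfer entropy estimator for discrete time series with history length 1; the function name `transfer_entropy` and the restriction to binary-valued series are illustrative assumptions, not part of the formalism proposed above.

```python
from collections import Counter
from math import log2

def transfer_entropy(x, y):
    """Plug-in estimate of TE_{X->Y} in bits for two equal-length discrete
    time series, using history length 1:
    TE = sum p(y_{t+1}, y_t, x_t) * log2( p(y_{t+1}|y_t, x_t) / p(y_{t+1}|y_t) ).
    Illustrative sketch only; no bias correction is applied."""
    n = len(y) - 1
    # Empirical counts of the joint and marginal configurations.
    triples = Counter(zip(y[1:], y[:-1], x[:-1]))   # (y_{t+1}, y_t, x_t)
    pairs_yx = Counter(zip(y[:-1], x[:-1]))         # (y_t, x_t)
    pairs_yy = Counter(zip(y[1:], y[:-1]))          # (y_{t+1}, y_t)
    singles = Counter(y[:-1])                       # y_t
    te = 0.0
    for (y1, y0, x0), c in triples.items():
        p_joint = c / n
        p_cond_full = c / pairs_yx[(y0, x0)]        # p(y_{t+1} | y_t, x_t)
        p_cond_hist = pairs_yy[(y1, y0)] / singles[y0]  # p(y_{t+1} | y_t)
        te += p_joint * log2(p_cond_full / p_cond_hist)
    return te
```

For example, if `y` is a one-step delayed copy of an i.i.d. binary series `x`, the estimate of TE from X to Y approaches 1 bit, while TE from Y to X stays near zero.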