We propose a partial information decomposition based on the newly introduced framework of causal tensors, i.e., multilinear stochastic maps that transform source data into destination data. The innovation of causal tensors is that the framework allows an indirect association to be expressed exactly in terms of its constituent direct associations. This is not possible when associations are expressed only in measures such as mutual information or transfer entropy. Rather than expressing associations in terms of mutual information or transfer entropy a priori, expressing them in these terms a posteriori yields an intuitive definition of a nonnegative and left-monotonic redundancy. The proposed redundancy satisfies the three axioms introduced by Williams and Beer. The symmetry and self-redundancy axioms follow directly from our definition, and the data processing inequality ensures that the monotonicity axiom is satisfied. Because causal tensors can describe both mutual information and transfer entropy, the partial information decomposition applies to both measures. Results show that our decomposition closely resembles that of another approach that also expresses associations in terms of mutual information a posteriori.
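As a minimal illustration of the exactness claim (the notation below is ours, not taken from the framework itself): for a Markov chain $X \to Y \to Z$, the indirect channel from $X$ to $Z$ factorizes exactly through the constituent direct channels,
\[
p(z \mid x) = \sum_{y} p(z \mid y)\, p(y \mid x),
\]
whereas the corresponding mutual information $I(X;Z)$ is not, in general, a function of $I(X;Y)$ and $I(Y;Z)$ alone; the data processing inequality yields only the bound $I(X;Z) \le \min\{I(X;Y),\, I(Y;Z)\}$.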