Transfer Entropy

Quantifying directed information flow between time series
TE(X→Y) = H(Y_t | Y_{t-k}) − H(Y_t | Y_{t-k}, X_{t-k})
Transfer entropy (Schreiber 2000) measures directed information flow: how much knowing X's past reduces uncertainty about Y's future, beyond what Y's own past provides. It is asymmetric — TE(X→Y) ≠ TE(Y→X) — capturing the direction of causal influence. Unlike cross-correlation, TE detects nonlinear coupling and distinguishes driver from responder. Applications include neuroscience (effective connectivity), finance (market influence), and climate (teleconnections). TE = 0 does not imply independence; it implies predictive irrelevance of X for Y given Y's past.
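A minimal sketch of a plug-in (histogram) estimator for the definition above, with history length k = 1. The function name, bin count, and the equal-width binning scheme are illustrative choices, not part of Schreiber's paper; it uses the identity H(A|B) = H(A,B) − H(B) to turn the two conditional entropies into four joint entropies.

```python
import numpy as np

def transfer_entropy(x, y, bins=8):
    """Plug-in estimate of TE(X->Y) in bits, history length k=1."""
    # Discretize each series into equal-width bins (symbols 0..bins-1).
    xd = np.digitize(x, np.histogram_bin_edges(x, bins=bins)[1:-1])
    yd = np.digitize(y, np.histogram_bin_edges(y, bins=bins)[1:-1])
    # Align the triple (Y_t, Y_{t-1}, X_{t-1}).
    yt, yp, xp = yd[1:], yd[:-1], xd[:-1]

    def H(*cols):
        # Shannon entropy (bits) of the joint distribution of the columns.
        _, counts = np.unique(np.stack(cols, axis=1), axis=0,
                              return_counts=True)
        p = counts / counts.sum()
        return -np.sum(p * np.log2(p))

    # TE = H(Y_t|Y_{t-1}) - H(Y_t|Y_{t-1},X_{t-1})
    #    = H(Y_t,Y_{t-1}) - H(Y_{t-1}) + H(Y_{t-1},X_{t-1}) - H(Y_t,Y_{t-1},X_{t-1})
    return H(yt, yp) - H(yp) + H(yp, xp) - H(yt, yp, xp)

# Toy system where X drives Y but not vice versa:
rng = np.random.default_rng(0)
n = 5000
x = rng.normal(size=n)
y = np.zeros(n)
for t in range(1, n):
    y[t] = 0.5 * y[t - 1] + 0.8 * x[t - 1] + 0.1 * rng.normal()

print(transfer_entropy(x, y))  # TE(X->Y): large
print(transfer_entropy(y, x))  # TE(Y->X): near zero
```

On data like this, TE(X→Y) should clearly exceed TE(Y→X), recovering the driver/responder asymmetry. Note that plug-in estimates are biased upward on finite samples, so in practice the values are compared against a surrogate (shuffled) baseline rather than read off directly.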