Information Theory
Mutual information I(X;Y) measures the information shared between X and Y. Transfer entropy (Schreiber 2000) captures directional information flow: the reduction in uncertainty about Yₜ from knowing the past of X, beyond what the past of Y already provides.
I(X;Y) = H(X)+H(Y)-H(X,Y)
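This identity can be checked numerically. A minimal sketch, assuming a hypothetical 2×2 joint distribution chosen for illustration:

```python
import numpy as np

def entropy(p):
    """Shannon entropy in bits; zero-probability cells are ignored."""
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

# Hypothetical joint distribution over binary (X, Y), for illustration only.
p_xy = np.array([[0.4, 0.1],
                 [0.1, 0.4]])
p_x = p_xy.sum(axis=1)   # marginal of X
p_y = p_xy.sum(axis=0)   # marginal of Y

# I(X;Y) = H(X) + H(Y) - H(X,Y)
mi = entropy(p_x) + entropy(p_y) - entropy(p_xy.flatten())
print(round(mi, 4))  # prints 0.2781
```

Here H(X) = H(Y) = 1 bit, while the off-diagonal mass makes H(X,Y) ≈ 1.722 bits, so roughly 0.278 bits are shared.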
T_{X→Y} = H(Yₜ|Yₜ₋₁) - H(Yₜ|Yₜ₋₁,Xₜ₋₁)
T_{X→Y} ≠ T_{Y→X} in general
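The asymmetry can be seen with a plug-in estimate on synthetic data. A sketch, assuming a hypothetical coupled pair of binary series in which Yₜ copies Xₜ₋₁ with 10% noise (all names and parameters here are illustrative):

```python
import numpy as np
from collections import Counter

def entropy_from_counts(counts):
    """Plug-in Shannon entropy in bits from a Counter of observed symbols."""
    p = np.array(list(counts.values()), dtype=float)
    p /= p.sum()
    return -np.sum(p * np.log2(p))

def transfer_entropy(x, y):
    """First-order T_{X→Y} = H(Yₜ|Yₜ₋₁) - H(Yₜ|Yₜ₋₁,Xₜ₋₁), via H(A|B) = H(A,B) - H(B)."""
    yt, y1, x1 = y[1:], y[:-1], x[:-1]
    h_y_given_ypast = (entropy_from_counts(Counter(zip(yt, y1)))
                       - entropy_from_counts(Counter(y1)))
    h_y_given_both = (entropy_from_counts(Counter(zip(yt, y1, x1)))
                      - entropy_from_counts(Counter(zip(y1, x1))))
    return h_y_given_ypast - h_y_given_both

rng = np.random.default_rng(0)
n = 5000
x = rng.integers(0, 2, n)                    # X is i.i.d. fair coin flips
flip = rng.random(n) < 0.1                   # 10% noise
y = np.empty(n, dtype=int)
y[0] = 0
y[1:] = np.where(flip[1:], 1 - x[:-1], x[:-1])  # Yₜ mostly equals Xₜ₋₁

t_xy = transfer_entropy(x, y)
t_yx = transfer_entropy(y, x)
print(round(t_xy, 3), round(t_yx, 3))
```

Because X drives Y and not vice versa, T(X→Y) comes out near 1 − H(0.1) ≈ 0.53 bits while T(Y→X) stays near zero, which is the directional asymmetry stated above.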
[Interactive demo: Signal Settings and Causal Structure (X → Y) panels, with live readouts of MI(X;Y), H(X) in bits, T(X→Y), and T(Y→X).]