Rényi Entropy & Bipartite Mutual Information
Order-α entropy, entanglement measures & information geometry
H₁ = −Σᵢ pᵢ log pᵢ
Hα = (1/(1−α)) log Σᵢ pᵢ^α
I(A;B) = H(A) + H(B) − H(A,B)
H_max = log n
Rényi entropy Hα(X) = (1/(1−α)) log Σᵢ pᵢ^α.
α → 1 gives Shannon entropy; α = 2 is collision entropy; α → ∞ is min-entropy.
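A minimal NumPy sketch of the order-α entropy, handling the α = 1 (Shannon) and α = ∞ (min-entropy) limits explicitly; the function name and natural-log (nats) convention are this example's choices, not fixed by the text above.

```python
import numpy as np

def renyi_entropy(p, alpha):
    """H_alpha = (1/(1-alpha)) * log(sum_i p_i^alpha), in nats.

    alpha = 1 is treated as the Shannon limit, alpha = inf as min-entropy.
    """
    p = np.asarray(p, dtype=float)
    p = p[p > 0]                       # 0 * log 0 = 0 convention
    if alpha == 1:                     # alpha -> 1 limit: Shannon entropy
        return -np.sum(p * np.log(p))
    if np.isinf(alpha):                # alpha -> inf limit: -log max_i p_i
        return -np.log(p.max())
    return np.log(np.sum(p ** alpha)) / (1 - alpha)

p = [0.5, 0.25, 0.25]
print(renyi_entropy(p, 1))       # Shannon entropy, ~1.0397 nats
print(renyi_entropy(p, 2))       # collision entropy, ~0.9808 nats
print(renyi_entropy(p, np.inf))  # min-entropy, ln 2 ~ 0.6931 nats
```

Note that the three values decrease as α grows: Hα is non-increasing in α, so min-entropy is the most conservative of the family.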
Mutual information I(A;B) = H(A) + H(B) − H(A,B) quantifies bipartite correlations.
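The identity I(A;B) = H(A) + H(B) − H(A,B) can be evaluated directly from a joint distribution table by marginalizing; this sketch (function names are illustrative) checks it on two perfectly correlated bits, where I(A;B) = ln 2 nats.

```python
import numpy as np

def shannon(p):
    """Shannon entropy of a distribution (any shape), in nats."""
    p = np.asarray(p, dtype=float).ravel()
    p = p[p > 0]
    return -np.sum(p * np.log(p))

def mutual_information(p_ab):
    """I(A;B) = H(A) + H(B) - H(A,B) from a joint matrix p_ab[a, b]."""
    p_ab = np.asarray(p_ab, dtype=float)
    h_a = shannon(p_ab.sum(axis=1))   # marginal over B gives p(a)
    h_b = shannon(p_ab.sum(axis=0))   # marginal over A gives p(b)
    return h_a + h_b - shannon(p_ab)

# Perfectly correlated bits: each marginal is uniform, joint has 2 outcomes
joint = np.array([[0.5, 0.0],
                  [0.0, 0.5]])
print(mutual_information(joint))  # ln 2 ~ 0.6931 nats
```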
The entropy cone constrains the achievable triples (H_A, H_B, H_AB): subadditivity H_AB ≤ H_A + H_B (equivalently I(A;B) ≥ 0) and, classically, monotonicity H_AB ≥ max(H_A, H_B).