Rényi Entropy & Quantum Information Spectrum

Hα(X) = (1/(1−α)) log Σ pᵢᵅ — interpolating between min-entropy and max-entropy
The Rényi entropy Hα(X) = (1/(1−α)) log₂(Σ pᵢᵅ) unifies a family of information measures. α→0: max-entropy (Hartley entropy) H₀ = log₂|supp(X)|. α→1: Shannon entropy H₁ = −Σ pᵢ log₂ pᵢ. α=2: collision entropy H₂ = −log₂ Σ pᵢ². α→∞: min-entropy H∞ = −log₂ max pᵢ; Hα is nonincreasing in α. Rényi divergence: Dα(P‖Q) = (1/(α−1)) log₂ Σ pᵢᵅ qᵢ^(1−α), which recovers the KL divergence as α→1. Quantum case: replace the probabilities pᵢ with the eigenvalues of the density matrix ρ, so Hα(ρ) = (1/(1−α)) log₂ Tr(ρᵅ); the min-entropy governs single-shot decoupling.
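A minimal sketch of these definitions, handling the α→0, α→1, and α→∞ limits as special cases (function names and the example distribution are illustrative, not from the source):

```python
import math

def renyi_entropy(p, alpha):
    """Rényi entropy H_alpha(X) in bits for a probability vector p."""
    p = [x for x in p if x > 0]  # restrict to the support
    if alpha == 0:                      # max-entropy: log of support size
        return math.log2(len(p))
    if alpha == 1:                      # Shannon limit of (1/(1-a)) log2 sum p^a
        return -sum(x * math.log2(x) for x in p)
    if alpha == math.inf:               # min-entropy: -log2 of the largest mass
        return -math.log2(max(p))
    return math.log2(sum(x**alpha for x in p)) / (1 - alpha)

def renyi_divergence(p, q, alpha):
    """Rényi divergence D_alpha(P||Q) in bits; assumes supp(P) ⊆ supp(Q)."""
    if alpha == 1:                      # KL-divergence limit
        return sum(x * math.log2(x / y) for x, y in zip(p, q) if x > 0)
    s = sum(x**alpha * y**(1 - alpha) for x, y in zip(p, q) if x > 0)
    return math.log2(s) / (alpha - 1)

# Hα is nonincreasing in α: H0 ≥ H1 ≥ H2 ≥ H∞.
p = [0.5, 0.25, 0.25]
for a in (0, 1, 2, math.inf):
    print(f"H_{a}(p) = {renyi_entropy(p, a):.4f} bits")
```

For the example distribution this prints H₀ ≈ 1.585 (log₂ 3), H₁ = 1.5, H₂ ≈ 1.415, and H∞ = 1, illustrating the monotone interpolation from max-entropy down to min-entropy.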