Explore generalized entropy families. Drag the distribution sliders to shape the probability distribution, and watch how Rényi (order q) and Tsallis (index q) compare to Shannon entropy.
Rényi entropy: H_q = log(Σpᵢ^q)/(1−q). Tsallis entropy: S_q = (1 − Σpᵢ^q)/(q−1). Both converge to Shannon entropy as q→1. Rényi special cases: q=0 gives log(support size); q=2 gives collision entropy; q→∞ gives min-entropy. Used in multifractal analysis, anomalous diffusion (Tsallis), and quantum information.
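The formulas above can be checked numerically. A minimal sketch (in natural-log units; the function and variable names are illustrative, not part of the demo):

```python
import math

def shannon(p):
    # H = -Σ pᵢ log pᵢ, skipping zero-probability outcomes
    return -sum(x * math.log(x) for x in p if x > 0)

def renyi(p, q):
    # H_q = log(Σ pᵢ^q) / (1 - q); falls back to Shannon at q = 1
    if abs(q - 1) < 1e-12:
        return shannon(p)
    return math.log(sum(x**q for x in p if x > 0)) / (1 - q)

def tsallis(p, q):
    # S_q = (1 - Σ pᵢ^q) / (q - 1); falls back to Shannon at q = 1
    if abs(q - 1) < 1e-12:
        return shannon(p)
    return (1 - sum(x**q for x in p if x > 0)) / (q - 1)

p = [0.5, 0.25, 0.125, 0.125]
print(renyi(p, 0))       # log(4): log of the support size
print(renyi(p, 2))       # collision entropy, -log(Σ pᵢ²)
print(renyi(p, 1000))    # large q approximates min-entropy, -log(max pᵢ)
print(renyi(p, 1.0001))  # near q = 1, approaches Shannon entropy
```

Sliding q toward 1 makes both generalized entropies approach the Shannon value, matching the limit stated above.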