Shannon entropy explorer — adjust distributions, see entropy and mutual information
Shannon entropy H(X) = -Σ p(x)·log₂ p(x) measures the average information per symbol. The maximum, log₂(n) bits for n symbols, occurs at the uniform distribution. Drag bars to reshape the distribution.
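The formula above can be sketched in a few lines; this is a minimal illustration (the function name `shannon_entropy` is ours, not part of the explorer), showing that a skewed distribution carries less entropy than the uniform one, which attains log₂(n):

```python
import math

def shannon_entropy(probs):
    # H(X) = -Σ p·log₂(p); terms with p = 0 contribute nothing, so skip them.
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A biased 4-symbol distribution...
print(shannon_entropy([0.5, 0.25, 0.125, 0.125]))  # 1.75 bits

# ...versus the uniform distribution, which reaches the maximum log₂(4) = 2 bits.
print(shannon_entropy([0.25] * 4))                 # 2.0 bits
```

Dragging a bar in the explorer corresponds to changing one p(x) (and renormalizing), which moves H(X) between 0 (all mass on one symbol) and log₂(n).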