Information Entropy

Shannon entropy explorer — adjust distributions, see entropy and mutual information

[Interactive panels: Distribution, Channel Capacity, Entropy vs p; readouts: H(X) in bits, H_max in bits, Efficiency]

Shannon entropy H(X) = −Σ p(x)·log₂ p(x) measures the average information per symbol. The maximum, log₂(n) bits for an alphabet of n symbols, occurs at the uniform distribution. Drag the bars to reshape the distribution.
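
As a concrete check of the formula, here is a minimal Python sketch that computes the same quantities the explorer reports: H(X), the log₂(n) maximum, and the efficiency ratio. The function names and the sample distribution are illustrative, not taken from the explorer's source.

```python
import math

def shannon_entropy(probs):
    """H(X) = -sum p(x) * log2 p(x); terms with p = 0 contribute 0."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Hypothetical distribution (in the explorer, you would drag these bars)
probs = [0.5, 0.25, 0.125, 0.125]
assert abs(sum(probs) - 1.0) < 1e-9  # probabilities must sum to 1

h = shannon_entropy(probs)
h_max = math.log2(len(probs))  # uniform distribution maximizes entropy
efficiency = h / h_max         # fraction of the maximum achieved

print(f"H(X)       = {h:.3f} bits")       # 1.750 bits
print(f"H_max      = {h_max:.3f} bits")   # 2.000 bits
print(f"Efficiency = {efficiency:.1%}")   # 87.5%
```

For this skewed distribution, H(X) = 1.75 bits falls short of the 2-bit maximum; dragging all four bars to 0.25 would raise the efficiency to 100%.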