KL Divergence Explorer

Adjust two discrete probability distributions P and Q using the sliders. Watch D_KL(P||Q) and D_KL(Q||P) update live, and see why the asymmetry matters.

Live readouts:
- D_KL(P||Q) in bits ("P from Q")
- D_KL(Q||P) in bits ("Q from P")
- JS divergence (symmetric)
D_KL(P||Q) = Σ P(x) log₂(P(x)/Q(x))
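The formula above can be sketched directly in Python. This is a minimal illustration (function and variable names are my own, not from the widget), using the standard conventions that 0·log(0/q) = 0 and that the divergence is infinite when Q assigns zero probability where P does not:

```python
import math

def kl_divergence(p, q):
    """D_KL(p||q) in bits for two discrete distributions given as lists.

    Convention: terms with p[i] == 0 contribute 0; if q[i] == 0 while
    p[i] > 0, the divergence is infinite.
    """
    total = 0.0
    for pi, qi in zip(p, q):
        if pi == 0:
            continue            # 0 * log2(0/q) is taken as 0
        if qi == 0:
            return math.inf     # P puts mass where Q has none
        total += pi * math.log2(pi / qi)
    return total

p = [0.5, 0.5]
q = [0.9, 0.1]
print(kl_divergence(p, q))  # ≈ 0.737 bits
print(kl_divergence(q, p))  # ≈ 0.531 bits — the two directions differ
```

Running both directions on the same pair of distributions makes the asymmetry concrete: D_KL(P||Q) and D_KL(Q||P) are generally different numbers.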

Distribution P (blue)

Distribution Q (orange)

When Q(x)=0 but P(x)>0, D_KL(P||Q)=∞ (displayed here as a very large value). The Jensen-Shannon divergence, by contrast, is always finite and symmetric.
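To see why JS divergence stays finite, note that it compares each distribution to the mixture M = (P+Q)/2, which is nonzero wherever either P or Q is. A short sketch (names are my own, not from the widget):

```python
import math

def kl_bits(p, q):
    # KL divergence in bits, with the 0 * log2(0/q) = 0 convention.
    return sum(pi * math.log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def js_divergence(p, q):
    """Jensen-Shannon divergence in bits: symmetric and always finite."""
    m = [(pi + qi) / 2 for pi, qi in zip(p, q)]  # mixture M = (P+Q)/2
    # M > 0 wherever P or Q is, so neither KL term can blow up.
    return 0.5 * kl_bits(p, m) + 0.5 * kl_bits(q, m)

# Even with fully disjoint supports, where both KL directions are infinite,
# JS divergence is finite: it reaches its maximum of 1 bit.
print(js_divergence([1.0, 0.0], [0.0, 1.0]))  # 1.0
```

The disjoint-support case is exactly where the widget's KL readouts saturate, while the JS readout stays at 1 bit.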