Mutual Information & Gaussian Channel Capacity

Shannon's information theory asks: how much information can a noisy channel reliably convey?

Parameters:
- Signal power P (SNR numerator): 5.0
- Noise power N: 1.0
- Bandwidth W: 5 Hz

Channel statistics (computed from the parameters above): SNR, SNR in dB, capacity C, mutual information I(X;Y), input entropy H(X), and output entropy H(Y).

Shannon Limit

C = W·log₂(1+P/N)
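As a sketch of the Shannon limit above, plugging the demo's parameter values (P = 5.0, N = 1.0, W = 5 Hz, as set in the sliders) into the Shannon-Hartley formula gives:

```python
import math

# Parameter values from the demo above
P, N, W = 5.0, 1.0, 5.0

snr = P / N                       # linear SNR
snr_db = 10 * math.log10(snr)     # SNR in decibels
C = W * math.log2(1 + snr)        # Shannon-Hartley capacity, bits per second

print(f"SNR = {snr:.2f} ({snr_db:.2f} dB)")
print(f"C   = {C:.3f} bits/s")    # 5 * log2(6) ≈ 12.925 bits/s
```

With these values the channel can carry at most about 12.9 bits per second with vanishing error probability, no matter how clever the coding scheme.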

The Shannon-Hartley theorem gives the channel capacity C = W·log₂(1 + SNR) bits per second, the theoretical maximum rate of error-free communication over the channel. Mutual information I(X;Y) = H(Y) − H(Y|X) measures the information shared between input X and output Y; for a Gaussian channel, capacity is achieved when the input is itself Gaussian. In the plot, the green region shows the joint distribution p(x, y), blue the input marginal, and red the noise distribution.
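The identity I(X;Y) = H(Y) − H(Y|X) can be checked directly for the Gaussian case using closed-form differential entropies (a Gaussian with variance v has differential entropy ½·log₂(2πev) bits). The sketch below assumes the demo's values P = 5, N = 1 and a channel model Y = X + Z with X ~ N(0, P) and independent noise Z ~ N(0, N):

```python
import math

P, N = 5.0, 1.0  # signal and noise power, as in the demo

def h_gauss(var):
    """Differential entropy (bits) of a Gaussian with variance `var`."""
    return 0.5 * math.log2(2 * math.pi * math.e * var)

H_Y  = h_gauss(P + N)  # output variance is P + N (independent sum)
H_YX = h_gauss(N)      # given X, Y is just the noise, variance N

I = H_Y - H_YX                            # I(X;Y) = h(Y) - h(Y|X)
closed_form = 0.5 * math.log2(1 + P / N)  # ½·log₂(1 + SNR) bits per sample

print(f"I(X;Y)      = {I:.4f} bits/sample")
print(f"½log₂(1+SNR) = {closed_form:.4f} bits/sample")  # both ≈ 1.2925
```

The two quantities agree exactly, which is why a Gaussian input attains capacity: multiplying this per-sample mutual information by the 2W samples per second of a bandlimited channel recovers C = W·log₂(1 + SNR).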