MUTUAL INFORMATION & CHANNEL CAPACITY

How much information does the output Y reveal about the input X? Shannon's channel capacity is the maximum of the mutual information I(X;Y) over all input distributions p(x).

[Interactive demo controls: crossover probability ε = 0.10, input bias p(X=1) = 0.50, channel size 4]

I(X;Y) = H(Y) − H(Y|X) = Σ_x Σ_y p(x,y) log[p(x,y)/(p(x)p(y))]. For the binary symmetric channel with crossover probability ε, the capacity is C = 1 − H_b(ε) bits per channel use, where H_b is the binary entropy function. For parallel Gaussian channels, the capacity-achieving power allocation is water-filling: allocate more power to the cleaner (less noisy) channels. Shannon's noisy-channel coding theorem: arbitrarily reliable transmission is possible at any rate R < C, and impossible at any rate R > C.
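The definition of capacity as a maximum over input distributions can be checked numerically against the closed form C = 1 − H_b(ε). A minimal sketch (grid search over p(X=1); function names are my own):

```python
import math

def h_b(p):
    """Binary entropy in bits; H_b(0) = H_b(1) = 0 by convention."""
    if p <= 0.0 or p >= 1.0:
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_capacity_numeric(eps, steps=10_000):
    """Approximate C = max over P(X=1) of I(X;Y) for a BSC by grid search."""
    best = 0.0
    for i in range(steps + 1):
        p1 = i / steps
        q1 = (1 - p1) * eps + p1 * (1 - eps)   # P(Y=1)
        best = max(best, h_b(q1) - h_b(eps))   # I(X;Y) at this input bias
    return best

eps = 0.10
numeric = bsc_capacity_numeric(eps)
closed = 1 - h_b(eps)   # closed-form BSC capacity
print(numeric, closed)  # maximum occurs at the uniform input p = 0.5
```

At ε = 0.5 the output is independent of the input and the capacity is 0; at ε = 0 the channel is noiseless and C = 1 bit per use.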