Hidden Markov Entropy Rate

Blackwell measure, excess entropy, and the information in HMM outputs

[Interactive demo: displays the entropy rate h(X), the excess entropy E, the current belief p₀, and the emitted output symbol.]
Hidden Markov model: the states {0,1} are hidden; we observe only the emissions {A,B}. The Blackwell measure is the stationary distribution of the belief state (the posterior probability of being in state 0) induced by the HMM's observation process. The entropy rate is:
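The belief state evolves by Bayesian filtering: predict one hidden-state transition, then condition on the observed symbol. A minimal sketch, assuming hypothetical transition and emission matrices (the original does not specify the HMM's parameters):

```python
import numpy as np

# Hypothetical parameters for a 2-state HMM with emissions {A, B}.
T = np.array([[0.9, 0.1],    # transition probs: row = current hidden state
              [0.2, 0.8]])
E = np.array([[0.7, 0.3],    # emission probs: row = state, cols = (A, B)
              [0.1, 0.9]])

def belief_update(b, x):
    """One filtering step: new posterior over hidden states after one
    transition and observing symbol x (0 = A, 1 = B)."""
    joint = (b @ T) * E[:, x]       # unnormalized P(S_{t+1}, x | past)
    return joint / joint.sum()

b = np.array([0.5, 0.5])            # start from the uniform belief
for x in [0, 1, 1, 0]:              # example observation sequence A, B, B, A
    b = belief_update(b, x)
print(b)                            # posterior over hidden states; b[0] is p₀
```

The sequence of values b[0] produced by repeated updates is exactly the belief-state trajectory whose stationary distribution is the Blackwell measure.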
h(X) = -∫ μ(dp) Σₓ P(x|p) log P(x|p),  x ∈ {A, B}
where μ is the Blackwell measure (the stationary measure on the belief simplex) and P(x|p) is the probability of emitting symbol x given belief p. The excess entropy E measures how much the past tells you about the future: it equals the mutual information I(past; future). The histogram shows the empirical Blackwell measure over the belief p₀.
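The integral can be estimated by Monte Carlo: simulate the HMM, run the filter alongside it, and average the conditional symbol entropy H(P(·|p)) over the visited beliefs, which are distributed according to the Blackwell measure. A sketch, again assuming hypothetical HMM parameters, reporting h(X) in bits:

```python
import numpy as np

rng = np.random.default_rng(0)
T = np.array([[0.9, 0.1], [0.2, 0.8]])   # hypothetical transition matrix
E = np.array([[0.7, 0.3], [0.1, 0.9]])   # hypothetical emission matrix

def entropy(p):
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

s = 0                                    # hidden state
b = np.array([0.5, 0.5])                 # belief over hidden states
beliefs, h_terms = [], []
for _ in range(100_000):
    s = rng.choice(2, p=T[s])            # hidden transition
    x = rng.choice(2, p=E[s])            # emission (0 = A, 1 = B)
    p_next = (b @ T) @ E                 # P(next symbol | belief p)
    h_terms.append(entropy(p_next))      # -Σₓ P(x|p) log P(x|p)
    joint = (b @ T) * E[:, x]            # filter update on the observed x
    b = joint / joint.sum()
    beliefs.append(b[0])

print("estimated h(X) ≈", np.mean(h_terms), "bits/symbol")
# np.histogram(beliefs, bins=50) gives the empirical Blackwell measure over p₀
```

Averaging the per-step conditional entropies is the Monte Carlo counterpart of integrating against μ; the collected beliefs double as the data for the histogram mentioned above.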