Hopfield Network: Memory & Capacity

A Hopfield network stores binary patterns ξ^μ ∈ {−1, +1}^N as attractors of its dynamics via the Hebbian rule W = (1/N) Σ_μ ξ^μ (ξ^μ)ᵀ (with zero self-connections, W_ii = 0). Its storage capacity is roughly 0.138N patterns (Amit, Gutfreund, Sompolinsky); beyond that limit, crosstalk between stored patterns makes retrieval unreliable and memories become confused.
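A minimal numpy sketch of the Hebbian learning step described above; the sizes N and P are illustrative choices, picked so that the load P/N stays well below the ~0.138 capacity limit:

```python
import numpy as np

rng = np.random.default_rng(1)
N, P = 200, 20                      # illustrative sizes: P/N = 0.10 < 0.138
xi = rng.choice([-1, 1], size=(P, N))   # P random bipolar patterns

# Hebbian outer-product rule: W = (1/N) * sum_mu xi^mu (xi^mu)^T
W = (xi.T @ xi) / N
np.fill_diagonal(W, 0)              # no self-connections (W_ii = 0)

symmetric = np.allclose(W, W.T)     # Hebbian weights are symmetric
```

Symmetry of W is what guarantees an energy function exists, so the dynamics settle into fixed-point attractors.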

Stored patterns: W_ij = (1/N) Σ_μ ξ_i^μ ξ_j^μ

Recall (noisy probe): sᵢ(t+1) = sgn(Σⱼ Wᵢⱼ sⱼ(t))

After convergence: capacity C ≈ 0.138N
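The store-then-recall loop that the demo runs can be sketched end to end as follows; the network size, pattern count, and 10% noise level are assumptions chosen for illustration, not parameters from the demo itself:

```python
import numpy as np

rng = np.random.default_rng(0)
N, P = 100, 5                       # assumed sizes; P/N = 0.05, well under capacity
xi = rng.choice([-1, 1], size=(P, N))

# Store: Hebbian rule with zero diagonal
W = (xi.T @ xi) / N
np.fill_diagonal(W, 0)

def recall(s, max_steps=20):
    """Synchronous sign updates until a fixed point is reached."""
    for _ in range(max_steps):
        s_new = np.sign(W @ s)
        s_new[s_new == 0] = 1       # break sgn(0) ties
        if np.array_equal(s_new, s):
            break
        s = s_new
    return s

# Probe: pattern 0 with 10% of its bits flipped
probe = xi[0].copy()
flip = rng.choice(N, size=N // 10, replace=False)
probe[flip] *= -1

recovered = recall(probe)
overlap = (recovered @ xi[0]) / N   # overlap m = 1 means perfect retrieval
```

At this load the noisy probe falls inside the basin of attraction of the stored pattern, so the overlap m returns to (or very near) 1; pushing P past ≈ 0.138N is where this recall starts to fail.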
Click "New Patterns" to store, then "Recall" to test retrieval.