Hebbian Learning — Hopfield Network Memory

A Hopfield network stores binary (±1) patterns as attractors via Hebb's rule: W_ij = (1/N) Σ_μ ξ^μ_i ξ^μ_j, with W_ii = 0. Its capacity for reliable recall is ≈ 0.138N random patterns. Given a noisy or partial input, asynchronous updates descend the Lyapunov energy E = -½ Σ_ij W_ij s_i s_j, converging to a nearby stored memory.
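The storage and recall loop described above can be sketched in a few lines of NumPy. This is a minimal illustration, not the demo's actual implementation; the pattern size (64 units) and noise level (8 flipped bits) are arbitrary choices for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

def store(patterns):
    # Hebb's rule: W_ij = (1/N) sum_mu xi^mu_i xi^mu_j, zero diagonal
    P = np.array(patterns)           # shape (num_patterns, N), entries ±1
    N = P.shape[1]
    W = P.T @ P / N
    np.fill_diagonal(W, 0.0)
    return W

def energy(W, s):
    # Lyapunov energy E = -1/2 sum_ij W_ij s_i s_j
    return -0.5 * s @ W @ s

def recall(W, s, max_sweeps=20):
    # Asynchronous updates: visit units one at a time in random order,
    # setting each to the sign of its local field, until no unit changes
    s = s.copy()
    for _ in range(max_sweeps):
        changed = False
        for i in rng.permutation(len(s)):
            new = 1 if W[i] @ s >= 0 else -1
            if new != s[i]:
                s[i] = new
                changed = True
        if not changed:
            break
    return s

# Store one random pattern, corrupt a few bits, then recall it.
xi = rng.choice([-1, 1], size=64)
W = store([xi])
noisy = xi.copy()
noisy[:8] *= -1                      # flip 8 of 64 bits
recovered = recall(W, noisy)
print(np.mean(recovered == xi))      # fraction of bits matching the stored pattern
```

Each asynchronous flip can only lower (or keep) the energy, which is why the dynamics settle into a stored pattern rather than oscillating.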

Patterns stored: 0 / capacity: 0 | Energy: — | Overlap: —

Draw on the left grid to create patterns. Store them, add noise to the input, then watch the network recall the closest stored memory.