Hopfield Network
A Hopfield network is a recurrent neural network that acts as an associative memory. Train it on patterns using Hebb's rule, then present a corrupted or partial version and watch it converge to the nearest stored memory, descending the energy landscape into the closest attractor.
wᵢⱼ = (1/N) Σμ sμᵢ sμⱼ | sᵢ ← sgn(Σⱼ wᵢⱼ sⱼ) | E = −(1/2) Σᵢⱼ wᵢⱼ sᵢ sⱼ
Click to flip pixels. Draw a pattern, then store it.
Store up to 5 patterns. Click a stored memory thumbnail to load a corrupted version into the network state, then hit Recall to watch it converge.
Associative memory and energy landscapes
John Hopfield introduced this network in 1982 as a physical model of memory, inspired by spin glasses in statistical mechanics. Each neuron is a binary unit (±1). The synaptic weight between every pair of neurons is set by Hebb's rule: "neurons that fire together, wire together." For p stored patterns, the weight matrix is the sum of the patterns' outer products with themselves, with self-connections conventionally set to zero.
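A minimal NumPy sketch of this storage rule (the function name is illustrative; the 1/N normalization matches the formula above, and zeroing the diagonal removes self-connections):

```python
import numpy as np

def hebbian_weights(patterns):
    """Build the Hopfield weight matrix from Hebb's rule.

    patterns: array of shape (p, N) with entries in {-1, +1}.
    Returns a symmetric N x N matrix, w_ij = (1/N) * sum_mu s_mu_i * s_mu_j,
    with the diagonal zeroed (no self-connections).
    """
    p, N = patterns.shape
    W = patterns.T @ patterns / N   # sum of outer products, scaled by 1/N
    np.fill_diagonal(W, 0.0)
    return W
```

Storing a new pattern is incremental: adding one outer product updates all weights at once, with no iterative training.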
The energy function E = −½Σwᵢⱼsᵢsⱼ is a Lyapunov function for asynchronous updates: flipping any neuron to agree with its local field never increases E. The stored patterns are local minima (attractors) in this landscape. A corrupted input probe descends to the nearest attractor, effectively completing or denoising the memory.
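The asynchronous dynamics can be sketched as follows (a minimal illustration, not the lab's implementation; neurons are visited in random order, and a neuron only flips when its local field strictly opposes its state, so the energy is monotonically non-increasing):

```python
import numpy as np

def energy(W, s):
    """E = -1/2 * sum_ij w_ij s_i s_j for state s in {-1, +1}^N."""
    return -0.5 * s @ W @ s

def recall(W, s, max_sweeps=100, seed=None):
    """Asynchronous recall: update one neuron at a time until a fixed point.

    Each visited neuron is set to the sign of its local field
    h_i = sum_j w_ij s_j; ties (h_i == 0) leave the neuron unchanged.
    """
    rng = np.random.default_rng(seed)
    s = s.copy()
    for _ in range(max_sweeps):
        changed = False
        for i in rng.permutation(len(s)):
            h = W[i] @ s
            if h != 0 and np.sign(h) != s[i]:
                s[i] = np.sign(h)  # flip toward the field; strictly lowers E
                changed = True
        if not changed:   # a full sweep with no flips = fixed point reached
            break
    return s
```

Because every flip strictly lowers E and E is bounded below, the loop is guaranteed to terminate at a local minimum.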
The network's capacity is roughly 0.138N patterns for an N-neuron network before spurious attractors dominate recall. This lab uses a 20×10 = 200-neuron network, giving a theoretical capacity of about 27 patterns (0.138 × 200 ≈ 27.6).
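Well below that capacity, every stored pattern should be a fixed point of the update rule. A small NumPy check of this claim, using random patterns at the lab's size and a low load (the sizes and seed are illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)
N, p = 200, 5                       # lab-sized network, well under 0.138 * N
patterns = rng.choice([-1, 1], size=(p, N))

W = patterns.T @ patterns / N       # Hebb's rule
np.fill_diagonal(W, 0.0)

# A pattern is a fixed point if every neuron already agrees with its field.
fields = W @ patterns.T             # shape (N, p): field at each neuron
stable = np.all(np.sign(fields).T == patterns)
```

At p = 5 (load p/N = 0.025), crosstalk between patterns is far too weak to flip any bit; pushing p toward 0.138 × N is where bit errors and spurious mixture states begin to appear.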