Associative Memory — Energy Landscape & Basins of Attraction
Hopfield networks (Hopfield, 1982) are recurrent neural networks that function as associative memories. N binary neurons s_i ∈ {±1} store P patterns ξ^μ via the Hebbian rule W_ij = (1/N) Σ_μ ξ_i^μ ξ_j^μ, with no self-connections (W_ii = 0).
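The Hebbian storage rule can be sketched in a few lines of NumPy; the sizes N and P below are illustrative choices, not taken from the demo:

```python
import numpy as np

# Sketch of Hebbian storage: P random patterns in N binary neurons.
N, P = 64, 3
rng = np.random.default_rng(0)
xi = rng.choice([-1, 1], size=(P, N))   # P patterns of N units in {-1, +1}
W = xi.T @ xi / N                       # W_ij = (1/N) sum_mu xi_i^mu xi_j^mu
np.fill_diagonal(W, 0.0)                # no self-connections (W_ii = 0)
print(np.allclose(W, W.T))              # the outer-product rule gives symmetric weights
```

Symmetry of W is what makes the energy function below well defined and non-increasing under asynchronous updates.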
The network has a Lyapunov energy, E = −½ Σ_ij W_ij s_i s_j, which is non-increasing under asynchronous updates. Stored patterns sit at energy minima, so recall is descent on this landscape to the nearest minimum. Capacity: reliable recall holds for P ≲ 0.14N patterns, beyond which spurious attractors dominate (Amit, Gutfreund & Sompolinsky, 1985). The energy landscape is a rugged free-energy surface, a prototype for spin glasses. Click on the state grid to flip neurons, or add noise and watch the recall dynamics.
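The full store-corrupt-recall loop described above can be sketched as follows. This is a minimal illustration, not the demo's own implementation; the pattern count, network size, and noise level are arbitrary assumptions:

```python
import numpy as np

def store(patterns):
    """Hebbian weights W_ij = (1/N) sum_mu xi_i^mu xi_j^mu, zero diagonal."""
    N = patterns.shape[1]
    W = patterns.T @ patterns / N
    np.fill_diagonal(W, 0.0)
    return W

def energy(W, s):
    """Lyapunov energy E = -1/2 sum_ij W_ij s_i s_j."""
    return -0.5 * s @ W @ s

def recall(W, s, max_sweeps=100, rng=None):
    """Asynchronous updates in random order until a fixed point.

    With symmetric W and zero diagonal, each flip can only lower E,
    so the dynamics descend the energy landscape to a local minimum.
    """
    if rng is None:
        rng = np.random.default_rng(0)
    s = s.copy()
    for _ in range(max_sweeps):
        changed = False
        for i in rng.permutation(len(s)):
            new = 1 if W[i] @ s >= 0 else -1
            if new != s[i]:
                s[i] = new
                changed = True
        if not changed:          # a full sweep with no flips: fixed point
            break
    return s

# Store two patterns in N=64 neurons (P/N well below 0.14),
# corrupt one with 8 flipped neurons, then recall.
rng = np.random.default_rng(42)
patterns = rng.choice([-1, 1], size=(2, 64))
W = store(patterns)
noisy = patterns[0].copy()
noisy[rng.choice(64, size=8, replace=False)] *= -1
recovered = recall(W, noisy, rng=rng)
print(energy(W, noisy), energy(W, recovered))
```

At this low load the noisy state almost always falls back into the stored pattern's basin; the printed energies show that the update dynamics never increase E.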