Hopfield Network

Associative Memory — Energy Landscape & Basins of Attraction

Hopfield networks (Hopfield, 1982) are recurrent neural networks that act as associative (content-addressable) memories. N binary neurons s_i ∈ {±1} store P patterns ξ^μ via Hebbian weights:

W_ij = (1/N) Σ_μ ξ_i^μ ξ_j^μ (W_ii = 0)
s_i(t+1) = sign(Σ_j W_ij s_j(t))
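The storage rule and update dynamics above can be sketched in a few lines of NumPy. This is a minimal illustration, not the demo's actual implementation; the function names are made up here:

```python
import numpy as np

def train_hebbian(patterns):
    # Hebbian rule: W = (1/N) sum_mu outer(xi^mu, xi^mu), with zero diagonal
    patterns = np.asarray(patterns, dtype=float)  # shape (P, N), entries ±1
    N = patterns.shape[1]
    W = patterns.T @ patterns / N
    np.fill_diagonal(W, 0.0)  # enforce W_ii = 0
    return W

def update_async(W, s, rng):
    # One asynchronous sweep: update neurons one at a time in random order
    s = s.copy()
    for i in rng.permutation(len(s)):
        h = W[i] @ s          # local field on neuron i
        s[i] = 1.0 if h >= 0 else -1.0
    return s
```

With two orthogonal patterns (e.g. all-ones and alternating ±1), each stored pattern is an exact fixed point of `update_async`.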

The network has a Lyapunov (energy) function E = −½ Σ_{ij} W_ij s_i s_j, which is non-increasing under asynchronous updates (fully synchronous updates can instead fall into a two-state cycle). Stored patterns sit at or near energy minima, so recall is downhill descent on a discrete energy landscape toward the nearest minimum. Capacity: random patterns are recalled reliably only up to P ≈ 0.138N, beyond which spurious attractors dominate (Amit, Gutfreund & Sompolinsky, 1985). The rugged energy landscape makes the model a prototype spin glass. Click on the state grid to flip neurons, or add noise and watch the recall dynamics.
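The energy descent and recall described above can be sketched as follows, again assuming NumPy and illustrative names:

```python
import numpy as np

def energy(W, s):
    # Lyapunov energy E = -1/2 * sum_ij W_ij s_i s_j
    return -0.5 * (s @ W @ s)

def recall(W, s, max_sweeps=20, seed=0):
    # Repeat asynchronous sweeps until the state stops changing (a fixed point)
    rng = np.random.default_rng(seed)
    s = s.copy()
    for _ in range(max_sweeps):
        prev = s.copy()
        for i in rng.permutation(len(s)):
            s[i] = 1.0 if W[i] @ s >= 0 else -1.0
        if np.array_equal(s, prev):
            break  # local energy minimum reached
    return s
```

Because each accepted flip can only lower E (with W_ii = 0), the loop always halts at a local minimum: either a stored pattern or a spurious mixture state.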