Hopfield Network Memory

Associative memory via energy minimization
[Interactive demo: panels for Stored Patterns, Noisy Input (click to flip bits), Network Recall, and Energy]
A Hopfield network stores binary (±1) patterns as fixed points of an energy landscape. Weights are set by Hebbian learning: w_ij = (1/N) Σ_μ ξ^μ_i ξ^μ_j, with the diagonal zeroed (w_ii = 0). Given a noisy or partial pattern, the network updates neurons asynchronously: each neuron takes the sign of its weighted input, s_i ← sign(Σ_j w_ij s_j), and every such flip can only lower the total energy E = -½ Σ_i Σ_j w_ij s_i s_j. Since E is bounded below and never increases, the dynamics converge to a local minimum, ideally the stored memory nearest the input, though spurious minima (such as mixtures of stored patterns) also exist. Capacity limit: roughly 0.14N random patterns for reliable recall.
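The whole recall loop described above fits in a few lines of NumPy. The sketch below (illustrative helper names `train`, `energy`, and `recall` are my own) builds the Hebbian weight matrix, corrupts a stored pattern, and runs asynchronous sign updates until no neuron changes:

```python
import numpy as np

rng = np.random.default_rng(0)

def train(patterns):
    """Hebbian learning: W = (1/N) sum_mu xi^mu xi^mu^T, zero diagonal."""
    P = np.asarray(patterns, dtype=float)
    N = P.shape[1]
    W = P.T @ P / N
    np.fill_diagonal(W, 0.0)  # no self-connections
    return W

def energy(W, s):
    """E = -1/2 * sum_ij w_ij s_i s_j."""
    return -0.5 * s @ W @ s

def recall(W, s, max_sweeps=50):
    """Asynchronous updates: flip neurons one at a time until stable."""
    s = s.copy()
    for _ in range(max_sweeps):
        changed = False
        for i in rng.permutation(len(s)):  # random update order
            new_si = 1 if W[i] @ s >= 0 else -1
            if new_si != s[i]:
                s[i] = new_si  # each flip can only lower E
                changed = True
        if not changed:  # fixed point reached
            break
    return s

# Store two random +/-1 patterns in a 64-neuron network.
N = 64
patterns = rng.choice([-1, 1], size=(2, N))
W = train(patterns)

# Corrupt the first pattern by flipping 6 bits, then recall.
noisy = patterns[0].copy()
noisy[rng.choice(N, size=6, replace=False)] *= -1
out = recall(W, noisy)

print("energy before:", energy(W, noisy))
print("energy after: ", energy(W, out))
print("recovered:", np.array_equal(out, patterns[0]))
```

With only 2 patterns stored (far below the ~0.14N ≈ 9-pattern limit for N = 64) and a small amount of noise, recall almost always lands exactly on the stored pattern, and the energy printout shows the monotone descent.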