Hebbian Learning & Synaptic Plasticity (Oja Rule)

Neurons that fire together wire together — PCA via unsupervised learning
Oja rule: Δw = α·y·(x − y·w)  |  converges to 1st principal component  |  ‖w‖ = 1

[Interactive demo: live readouts of the weights w₁ and w₂, the angle between w and PC1, and the step count]
Hebb's rule (1949): "Cells that fire together, wire together." Mathematically: Δwᵢ = α·xᵢ·y, where y = w·x is the postsynaptic output. Because the update only ever reinforces correlated activity, pure Hebbian learning leads to unbounded weight growth.
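A minimal NumPy sketch of that instability; the 2-D correlated input distribution, learning rate, and step count are illustrative assumptions, not taken from the demo:

```python
import numpy as np

rng = np.random.default_rng(0)
# Illustrative correlated 2-D inputs (the mixing matrix is an assumption).
X = rng.normal(size=(500, 2)) @ np.array([[1.0, 0.9], [0.0, 0.44]])

w = np.array([1.0, 0.0])
alpha = 0.01
for x in X:
    y = w @ x            # postsynaptic output y = w·x
    w += alpha * y * x   # pure Hebb: Δw = α·y·x

# The weight norm grows without bound; after a few hundred steps
# it is already far above its initial value of 1.
print(np.linalg.norm(w))
```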

Oja's rule (1982): adds a normalizing term: Δw = α·y·(x − y·w). This self-stabilizing rule converges to the leading eigenvector of the input covariance matrix, implementing PCA without any explicit matrix computation. The weight norm converges to 1 automatically.
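The convergence claim can be checked numerically against an eigendecomposition; a sketch with the same assumed 2-D input distribution and illustrative hyperparameters:

```python
import numpy as np

rng = np.random.default_rng(1)
# Illustrative correlated 2-D inputs (the mixing matrix is an assumption).
X = rng.normal(size=(5000, 2)) @ np.array([[1.0, 0.9], [0.0, 0.44]])

w = np.array([1.0, 0.0])
alpha = 0.01
for x in X:
    y = w @ x
    w += alpha * y * (x - y * w)   # Oja: Hebbian term minus a y²·w decay

# Compare against the top eigenvector of the sample covariance matrix.
C = X.T @ X / len(X)
pc1 = np.linalg.eigh(C)[1][:, -1]  # eigh sorts eigenvalues ascending

print(np.linalg.norm(w))           # close to 1
print(abs(w @ pc1))                # close to 1: w aligns with PC1 up to sign
```

The stochastic update never forms C explicitly; it touches one sample at a time, which is the sense in which Oja's rule does "PCA without matrix computation".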

BCM rule: Bienenstock, Cooper & Munro (1982) add a sliding modification threshold θ_M: Δw = φ(y, θ_M)·x, where φ is negative for y < θ_M (depression) and positive for y > θ_M (potentiation), and θ_M itself tracks recent average activity. This accounts for the development of orientation selectivity in V1 and for the transition between LTP and LTD.
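The sign change at θ_M can be sketched with the commonly used quadratic choice φ(y, θ) = y·(y − θ); this specific φ and the numbers below are illustrative assumptions, since the original formulation leaves φ's exact shape open:

```python
import numpy as np

def bcm_update(w, x, theta, alpha=0.1):
    """One BCM step with the quadratic form φ(y, θ) = y·(y − θ)."""
    y = w @ x
    return w + alpha * y * (y - theta) * x

w = np.array([1.0, 0.0])
theta = 1.0                                          # threshold θ_M
weak = bcm_update(w, np.array([0.5, 0.0]), theta)    # y = 0.5 < θ_M
strong = bcm_update(w, np.array([2.0, 0.0]), theta)  # y = 2.0 > θ_M

print(weak[0] < w[0])    # True: φ < 0 below threshold → depression (LTD)
print(strong[0] > w[0])  # True: φ > 0 above threshold → potentiation (LTP)
```

In the full rule θ_M is not fixed: a common choice is to slide it toward a running average of y² (e.g. θ ← θ + τ·(y² − θ)), so sustained high activity raises the threshold and keeps the neuron from saturating.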