PCA — Geometry of Covariance Decomposition
Principal Component Analysis (PCA) finds the orthogonal axes of maximum variance in a dataset. For a mean-centered data matrix X, the covariance matrix Σ = (1/N)XᵀX is real and symmetric, so its eigendecomposition Σ = UΛUᵀ yields orthonormal eigenvectors (the principal components) and eigenvalues λ₁ ≥ λ₂ ≥ … equal to the variances along the corresponding PC directions.
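The decomposition above can be sketched in a few lines of NumPy. This is a minimal illustration, not the demo's implementation; the correlated Gaussian sample is a hypothetical dataset chosen so the PCs are easy to see.

```python
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical correlated 2-D sample for illustration
X = rng.multivariate_normal([0, 0], [[3.0, 1.2], [1.2, 1.0]], size=500)

# Center the data: Sigma = (1/N) X^T X assumes zero-mean columns
Xc = X - X.mean(axis=0)
Sigma = (Xc.T @ Xc) / len(Xc)

# eigh handles symmetric matrices and returns eigenvalues in ascending order,
# so reverse to get lambda_1 >= lambda_2
lam, U = np.linalg.eigh(Sigma)
lam, U = lam[::-1], U[:, ::-1]

print(lam)      # variances along PC1, PC2
print(U.T @ U)  # ~ identity: the eigenvectors are orthonormal
```

Using `eigh` rather than `eig` both exploits symmetry and guarantees real eigenvalues and orthogonal eigenvectors.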
The ellipse shown is the 2σ confidence ellipse of the fitted Gaussian, with semi-axes proportional to √λ₁ and √λ₂ and aligned with the eigenvectors. Projecting the data onto PC1 maximizes the variance explained by a single direction; the residual variance is captured by the remaining PCs. Drag the controls to watch the eigenvectors rotate in real time.
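The projection and the ellipse geometry can be checked numerically. A sketch, again on an assumed synthetic sample: the variance of the PC1 scores equals λ₁, and the 2σ ellipse semi-axes are 2√λᵢ.

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.multivariate_normal([0, 0], [[3.0, 1.2], [1.2, 1.0]], size=500)
Xc = X - X.mean(axis=0)

lam, U = np.linalg.eigh(Xc.T @ Xc / len(Xc))
lam, U = lam[::-1], U[:, ::-1]      # descending eigenvalue order

scores = Xc @ U                     # data coordinates in the PC basis
explained = lam / lam.sum()         # fraction of total variance per PC
semi_axes = 2 * np.sqrt(lam)        # 2-sigma ellipse semi-axis lengths

# Variance along PC1 is exactly lambda_1 for this sample
print(np.var(scores[:, 0]), lam[0])
print(explained)
```

Because the scores are just a rotation of the centered data, total variance is preserved: the `explained` fractions sum to 1.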
Tags: PCA, Covariance matrix, Eigendecomposition, Variance explained, Data geometry