Scatter data in 2D/3D — watch principal components found via power iteration. Click canvas to add points.
PCA finds orthogonal directions of maximum variance. Power iteration: start with a random vector, repeatedly multiply it by the covariance matrix and normalize — the result converges to the dominant eigenvector. Deflate (subtract the found component) and repeat to get successive components.
PCA (principal component analysis) finds the axes of greatest variance by solving the eigenvalue problem of the covariance matrix. Power iteration converges geometrically, at a rate governed by the ratio of the second-largest to the largest eigenvalue. The first k components span the rank-k linear subspace with minimal reconstruction error (Eckart–Young theorem).
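The procedure above — multiply by the covariance matrix, normalize, deflate, repeat — can be sketched in a few lines of NumPy. This is a minimal illustration, not the demo's actual source; the function name, iteration count, and seed are assumptions for the example.

```python
import numpy as np

def power_iteration_pca(X, k=2, iters=200, seed=0):
    """Top-k principal components via power iteration with deflation.
    iters and seed are illustrative choices, not tuned values."""
    rng = np.random.default_rng(seed)
    Xc = X - X.mean(axis=0)               # center the data
    C = Xc.T @ Xc / (len(X) - 1)          # sample covariance matrix
    components, eigenvalues = [], []
    for _ in range(k):
        v = rng.standard_normal(C.shape[0])
        v /= np.linalg.norm(v)
        for _ in range(iters):
            v = C @ v                     # multiply by the covariance...
            v /= np.linalg.norm(v)        # ...and renormalize
        lam = v @ C @ v                   # Rayleigh quotient = eigenvalue
        components.append(v)
        eigenvalues.append(lam)
        C = C - lam * np.outer(v, v)      # deflate: remove the found direction
    return np.array(components), np.array(eigenvalues)
```

Each deflation step subtracts the rank-one piece `lam * v vᵀ` from the covariance, so the next dominant eigenvector of the deflated matrix is the next principal component of the original data.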