Gaussian Mixture EM

Expectation-maximization fitting animated step by step

[Interactive demo: the Settings panel configures the fit; the EM Progress panel reports the current EM step, the log-likelihood, the per-step change ΔLL, and convergence status.]

About

E-step: compute soft assignments (responsibilities) r_ik = π_k N(x_i | μ_k, Σ_k) / Σ_j π_j N(x_i | μ_j, Σ_j). M-step: update the means μ_k = Σ_i r_ik x_i / Σ_i r_ik, the covariances Σ_k, and the mixing weights π_k = (1/N) Σ_i r_ik. Each EM step can only increase the log-likelihood (a consequence of Jensen's inequality); the fit is declared converged once the per-step change ΔLL drops below 1e-4.
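The E-step/M-step loop above can be sketched in NumPy. This is a minimal illustrative implementation, not the demo's actual code: the function name em_gmm, the random initialization, and the small covariance jitter for numerical stability are all assumptions. Responsibilities are computed in log space for stability, and the loop stops when ΔLL < tol, matching the convergence rule stated above.

```python
import numpy as np

def em_gmm(X, K, max_iter=200, tol=1e-4, seed=0):
    """Fit a K-component Gaussian mixture to X (N x D) by EM.

    Illustrative sketch: returns means, covariances, mixing weights,
    and the per-iteration log-likelihood trace.
    """
    rng = np.random.default_rng(seed)
    N, D = X.shape
    # Initialize: K random data points as means, shared empirical
    # covariance (plus a tiny jitter, an assumption for stability),
    # uniform mixing weights.
    mu = X[rng.choice(N, size=K, replace=False)].copy()
    Sigma = np.array([np.cov(X.T) + 1e-6 * np.eye(D) for _ in range(K)])
    pi = np.full(K, 1.0 / K)
    ll_trace = []

    for _ in range(max_iter):
        # E-step: log of pi_k * N(x_i | mu_k, Sigma_k) for every i, k.
        log_r = np.empty((N, K))
        for k in range(K):
            diff = X - mu[k]
            L = np.linalg.cholesky(Sigma[k])
            # Mahalanobis term via a triangular solve: ||L^{-1}(x - mu)||^2.
            sol = np.linalg.solve(L, diff.T)
            maha = np.sum(sol ** 2, axis=0)
            logdet = 2.0 * np.sum(np.log(np.diag(L)))
            log_r[:, k] = np.log(pi[k]) - 0.5 * (
                D * np.log(2.0 * np.pi) + logdet + maha
            )
        # Normalize in log space (stable log-sum-exp) to get r_ik.
        m = log_r.max(axis=1, keepdims=True)
        log_norm = m + np.log(np.exp(log_r - m).sum(axis=1, keepdims=True))
        r = np.exp(log_r - log_norm)

        # Log-likelihood; stop when the improvement falls below tol.
        ll_trace.append(log_norm.sum())
        if len(ll_trace) > 1 and ll_trace[-1] - ll_trace[-2] < tol:
            break

        # M-step: weighted means, covariances, and mixing weights.
        Nk = r.sum(axis=0)                      # effective counts
        mu = (r.T @ X) / Nk[:, None]            # mu_k = sum_i r_ik x_i / sum_i r_ik
        for k in range(K):
            diff = X - mu[k]
            Sigma[k] = (r[:, k, None] * diff).T @ diff / Nk[k] + 1e-6 * np.eye(D)
        pi = Nk / N                             # pi_k = (1/N) sum_i r_ik

    return mu, Sigma, pi, ll_trace

# Usage on synthetic two-cluster data: the log-likelihood trace should
# be (numerically) non-decreasing, as guaranteed by EM.
rng = np.random.default_rng(1)
X = np.vstack([
    rng.normal([-3.0, 0.0], 0.5, size=(100, 2)),
    rng.normal([3.0, 0.0], 0.5, size=(100, 2)),
])
mu, Sigma, pi, ll = em_gmm(X, K=2)
```

The Cholesky-based density evaluation and the log-sum-exp normalization are the standard tricks for keeping EM numerically stable when responsibilities get very peaked; a naive exp-then-normalize version underflows on well-separated clusters.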