Message passing (belief propagation) performs probabilistic inference on graphical models by passing local messages between variable and factor nodes. On a factor graph, variable-to-factor messages μ_{x→f}(x) and factor-to-variable messages μ_{f→x}(x) are updated iteratively until convergence: each variable sends the product of messages arriving from its other neighboring factors, μ_{x→f}(x) = ∏_{g∈N(x)\f} μ_{g→x}(x), and each factor sums its potential against incoming messages over all of its variables except the recipient, μ_{f→x}(x) = Σ_{x_{N(f)\x}} f(x_{N(f)}) ∏_{y∈N(f)\x} μ_{y→f}(y). The resulting marginals are exact on trees; on graphs with cycles, loopy BP applies the same updates as an approximation. Applications include LDPC decoding, SAT solving, and the cavity method in spin glasses. Here you see the belief vector on each variable node converging iteration by iteration, colored by its marginal probability.
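A minimal sketch of exact sum-product message passing on a tree, here a hypothetical three-variable binary chain x1—f12—x2—f23—x3 with made-up potentials; the marginal of x3 obtained from forward messages is checked against brute-force enumeration:

```python
import itertools
import numpy as np

# Hypothetical potentials for a binary chain x1 - f12 - x2 - f23 - x3
f1 = np.array([0.6, 0.4])                  # unary factor on x1
f12 = np.array([[0.9, 0.1], [0.2, 0.8]])   # pairwise factor f12(x1, x2)
f23 = np.array([[0.7, 0.3], [0.4, 0.6]])   # pairwise factor f23(x2, x3)

# Forward sweep along the chain (variable-to-factor and factor-to-variable
# messages collapsed into one matrix-vector product per hop):
m1_to_f12 = f1                             # mu_{x1->f12}(x1): only other neighbor is f1
m_f12_to_2 = m1_to_f12 @ f12               # mu_{f12->x2}(x2) = sum_x1 f12(x1,x2) mu(x1)
m2_to_f23 = m_f12_to_2                     # x2's only other incoming message
m_f23_to_3 = m2_to_f23 @ f23               # mu_{f23->x3}(x3) = sum_x2 f23(x2,x3) mu(x2)

# On a tree the normalized incoming message IS the exact marginal
p3 = m_f23_to_3 / m_f23_to_3.sum()

# Brute-force check: enumerate the full joint and marginalize
joint = np.zeros((2, 2, 2))
for x1, x2, x3 in itertools.product(range(2), repeat=3):
    joint[x1, x2, x3] = f1[x1] * f12[x1, x2] * f23[x2, x3]
p3_exact = joint.sum(axis=(0, 1))
p3_exact /= p3_exact.sum()

print(np.allclose(p3, p3_exact))  # True: BP is exact on this tree
```

On a chain each node has at most one "other" neighbor, so the product over incoming messages degenerates to a single message; on a general tree the same two update rules apply with the full products.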