Bayesian Neural Network

Weight uncertainty · variational inference · epistemic vs aleatoric · MC dropout

[Interactive panels: predictions with uncertainty bands (click to add data); weight posterior distributions]
BNNs place distributions over weights, w ~ N(μ, σ²), instead of point estimates. Predictive uncertainty decomposes into epistemic uncertainty (model uncertainty, reducible with more data) and aleatoric uncertainty (inherent data noise, irreducible). Variational inference fits an approximate posterior q(w|θ) ≈ p(w|D) by minimizing the KL divergence between them. MC Dropout approximates Bayesian inference by keeping dropout active at test time and averaging over stochastic forward passes. Epistemic uncertainty widens outside the training data, where the model must extrapolate.
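The variational-inference idea above can be sketched for a single weight: parameterize q(w) = N(μ, σ²), sample via the reparameterization trick, and evaluate the closed-form KL divergence to a standard-normal prior. This is a minimal illustrative sketch, not a full training loop; `mu`, `rho`, and the softplus parameterization of σ are assumptions, not from the source.

```python
import numpy as np

rng = np.random.default_rng(1)

# Variational parameters for one weight: q(w) = N(mu, sigma^2).
# rho is an unconstrained parameter; softplus keeps sigma positive.
mu, rho = 0.3, -1.0
sigma = np.log1p(np.exp(rho))

def sample_weight(n):
    """Reparameterization trick: w = mu + sigma * eps, eps ~ N(0, 1)."""
    eps = rng.standard_normal(n)
    return mu + sigma * eps

def kl_to_standard_normal(mu, sigma):
    """Closed-form KL( N(mu, sigma^2) || N(0, 1) ), the regularizer
    minimized (alongside the data fit) in variational inference."""
    return 0.5 * (sigma**2 + mu**2 - 1.0 - 2.0 * np.log(sigma))

w = sample_weight(100_000)
kl = kl_to_standard_normal(mu, sigma)
```

In a full BNN, gradients flow through `mu` and `rho` because the sampling noise `eps` is separated from the parameters; the KL term is summed over all weights.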
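MC Dropout can be sketched in a few lines of numpy: run many stochastic forward passes with fresh dropout masks and read epistemic uncertainty off the spread of the predictions. This is a hypothetical illustration, not the demo's implementation; the network is a tiny untrained MLP with random weights, and `p_drop` and `T` are arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative random (untrained) weights for a 1 -> 32 -> 1 MLP.
W1 = rng.normal(0, 1, (1, 32))
b1 = rng.normal(0, 1, 32)
W2 = rng.normal(0, 1, (32, 1))
b2 = np.zeros(1)

def mc_forward(x, p_drop=0.5, T=200):
    """T stochastic passes with dropout active; return predictive mean/std."""
    preds = []
    for _ in range(T):
        h = np.maximum(x @ W1 + b1, 0.0)       # ReLU hidden layer
        mask = rng.random(h.shape) > p_drop    # fresh dropout mask each pass
        h = h * mask / (1.0 - p_drop)          # inverted-dropout scaling
        preds.append(h @ W2 + b2)
    preds = np.stack(preds)                    # shape (T, n, 1)
    return preds.mean(axis=0), preds.std(axis=0)

x = np.linspace(-3, 3, 5).reshape(-1, 1)
mean, std = mc_forward(x)                      # std is the epistemic spread
```

The per-input standard deviation across passes is the MC Dropout uncertainty estimate; with a trained network it tends to grow on inputs far from the training data.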