# Probability Distribution Explorer
Seven distribution families with their PDF/PMF and CDF shown side by side. Drag parameters to reshape the curve, draw random samples to overlay a histogram on the theory, and watch key statistics update live. The dual-panel view reveals how the density function and its cumulative integral relate.
| | Mean | Variance | Skewness | Kurtosis |
|---|---|---|---|---|
| Theoretical | — | — | — | — |
| Sample | — | — | — | — |
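The theoretical and sample rows of the table can be computed as in this minimal sketch, which uses `scipy.stats` with a standard normal as the example (the explorer's own implementation and parameter defaults may differ):

```python
# Sketch: theoretical vs. sample statistics for one distribution
# (normal chosen for illustration; the explorer covers seven families).
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
dist = stats.norm(loc=0.0, scale=1.0)
samples = dist.rvs(size=10_000, random_state=rng)

# Theoretical moments: 'mvsk' asks for mean, variance, skewness,
# and (excess) kurtosis.
mean, var, skew, kurt = dist.stats(moments="mvsk")

print(f"{'':>12} {'Mean':>8} {'Variance':>9} {'Skewness':>9} {'Kurtosis':>9}")
print(f"{'Theoretical':>12} {float(mean):8.3f} {float(var):9.3f} "
      f"{float(skew):9.3f} {float(kurt):9.3f}")
print(f"{'Sample':>12} {samples.mean():8.3f} {samples.var(ddof=1):9.3f} "
      f"{stats.skew(samples):9.3f} {stats.kurtosis(samples):9.3f}")
```

With N = 10,000 the two rows agree to roughly two decimal places; the gap shrinks as N grows.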
## PDF and CDF
The probability density function (PDF) describes the relative likelihood of each value; for discrete distributions, the probability mass function (PMF) gives each value's probability directly. For continuous distributions, the probability of falling in an interval is the area under the PDF over that interval. The cumulative distribution function (CDF) is the running integral of the PDF: F(x) = P(X ≤ x). It climbs from 0 to 1 and tells you, for any threshold, what fraction of the distribution lies below it.
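The "CDF as running integral" relationship can be checked numerically. This sketch integrates a standard normal PDF with the trapezoid rule and compares the result to the library's own CDF (any continuous family behaves the same way):

```python
# Numerical check that the CDF is the running integral of the PDF.
import numpy as np
from scipy import stats
from scipy.integrate import cumulative_trapezoid

dist = stats.norm()
xs = np.linspace(-5, 5, 2001)

# Running integral of the PDF, plus F(-5), which is tiny but nonzero.
approx_cdf = cumulative_trapezoid(dist.pdf(xs), xs, initial=0) + dist.cdf(xs[0])

# The reconstructed CDF matches dist.cdf to within integration error.
print(np.max(np.abs(approx_cdf - dist.cdf(xs))))
```

The printed maximum discrepancy is tiny (well below 1e-4 on this grid), confirming that the two panels show the same information in differentiated and integrated form.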
## Sampling and convergence
Click **Sample** to draw random values and overlay a histogram on the theoretical PDF. With few samples the histogram is rough; increase N and it converges to the theoretical curve, the law of large numbers in action. The CDF panel shows the empirical CDF (a step function) converging to the theoretical curve.
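The convergence in the CDF panel can be quantified: the largest gap between the empirical and theoretical CDFs shrinks roughly like 1/√N. A sketch using an exponential distribution as the example (the specific family and seed are arbitrary):

```python
# Sketch: the Kolmogorov-Smirnov-style distance between the empirical CDF
# and the theoretical CDF shrinks as the sample size N grows.
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
dist = stats.expon()

for n in (10, 100, 1_000, 10_000):
    x = np.sort(dist.rvs(size=n, random_state=rng))
    ecdf = np.arange(1, n + 1) / n              # empirical CDF at sorted points
    gap = np.max(np.abs(ecdf - dist.cdf(x)))    # largest vertical gap
    print(f"N={n:>6}  max |F_n - F| = {gap:.4f}")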
## Why seven distributions?
- **Normal**: the universal attractor of averages (Central Limit Theorem).
- **Uniform**: maximum ignorance over a bounded interval.
- **Exponential**: waiting times between memoryless events.
- **Poisson**: counts of rare independent events.
- **Binomial**: number of successes in a fixed number of trials.
- **Beta**: distributions over probabilities, the Bayesian conjugate prior.
- **Gamma**: sums of exponentials, generalizing the chi-squared and Erlang distributions.
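All seven families are available in `scipy.stats`, and one loop serves them all, continuous or discrete. A sketch with illustrative parameters (the parameter values here are assumptions, not the explorer's defaults):

```python
# Sketch: the seven families as frozen scipy.stats distributions.
# Parameter values below are illustrative choices, not app defaults.
from scipy import stats

families = {
    "Normal":      stats.norm(loc=0, scale=1),
    "Uniform":     stats.uniform(loc=0, scale=1),   # support [0, 1]
    "Exponential": stats.expon(scale=1.0),          # rate 1
    "Poisson":     stats.poisson(mu=4),             # discrete: PMF, not PDF
    "Binomial":    stats.binom(n=20, p=0.3),        # discrete
    "Beta":        stats.beta(a=2, b=5),
    "Gamma":       stats.gamma(a=3, scale=1.0),     # shape k = 3
}

for name, dist in families.items():
    mean, var = dist.stats(moments="mv")
    print(f"{name:<12} mean={float(mean):6.3f}  variance={float(var):6.3f}")
```

Because every frozen distribution exposes the same `pdf`/`pmf`, `cdf`, `rvs`, and `stats` methods, a dual-panel explorer like this one can treat all seven families uniformly.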