[Interactive simulation: live readouts for elapsed time, total counts, CPM, mean and expected inter-arrival time (ms), and χ²/dof, with an inter-arrival time histogram; adjustable controls for source activity (5.0 decays/s) and detector sensitivity (80%).]

Radioactive decay is truly random

Unlike the pseudo-randomness of a shuffled deck or a roulette wheel, radioactive decay is genuinely random at the quantum level. Each unstable nucleus has a fixed probability of decaying in any given instant, independent of how long it has existed or what its neighbors are doing. This memoryless property is the defining feature of the exponential distribution: the probability of decaying in the next second is the same whether the atom has existed for a microsecond or a million years. No hidden clock ticks inside the nucleus. The decay simply happens, or it doesn’t, with odds that never change.
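The memoryless property can be checked numerically. The sketch below (a minimal illustration, not part of the simulation's code) draws exponential waiting times at a hypothetical rate λ = 5.0 per second and confirms that the survival probability P(T > s + t | T > s) matches the unconditional P(T > t):

```python
import math
import random

random.seed(42)
lam = 5.0  # hypothetical decay rate per second

# Draw many exponential waiting times
samples = [random.expovariate(lam) for _ in range(200_000)]

def survival(t, xs):
    """Empirical P(T > t)."""
    return sum(1 for x in xs if x > t) / len(xs)

# Memorylessness: P(T > s + t | T > s) should equal P(T > t)
s, t = 0.2, 0.1
survivors = [x - s for x in samples if x > s]  # atoms still undecayed after s
lhs = survival(t, survivors)  # conditional survival
rhs = survival(t, samples)    # unconditional survival
print(f"P(T > t | T > s) = {lhs:.3f}")
print(f"P(T > t)         = {rhs:.3f}  (theory: {math.exp(-lam * t):.3f})")
```

The two estimates agree to within sampling noise, both near e^(−λt): having already survived 200 ms changes nothing about the odds for the next 100 ms.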

The Poisson distribution

If decays occur at an average rate λ per unit time, and each decay is independent, then the number of decays in a fixed time interval follows the Poisson distribution. Writing μ = λt for the expected number of decays in an interval of length t, the probability of observing exactly k decays is P(k) = μ^k e^(−μ) / k!. This distribution emerges whenever you count rare, independent events in a fixed window — phone calls arriving at a switchboard, mutations in a strand of DNA, meteor impacts on the Moon. Poisson derived it in 1837; Ladislaus Bortkiewicz famously applied it to Prussian soldiers kicked to death by horses. Radioactive decay is its purest physical realization.
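As a sketch of the connection (assuming a hypothetical mean of 5 decays per one-second window), the Poisson formula can be computed directly and cross-checked by counting simulated exponential arrivals inside a unit window:

```python
import math
import random

lam = 5.0  # expected decays per one-second window (rate x window length)

def poisson_pmf(k, mu):
    """P(k) = mu^k e^(-mu) / k!"""
    return mu ** k * math.exp(-mu) / math.factorial(k)

# Cross-check by simulation: count exponential arrivals inside a unit window
random.seed(1)
def count_in_window(rate, window=1.0):
    t, n = 0.0, 0
    while True:
        t += random.expovariate(rate)
        if t > window:
            return n
        n += 1

counts = [count_in_window(lam) for _ in range(50_000)]
for k in range(3, 8):
    empirical = counts.count(k) / len(counts)
    print(f"k={k}: theory {poisson_pmf(k, lam):.4f}, simulated {empirical:.4f}")
```

The simulated frequencies converge on the formula's values, illustrating that independent exponential waiting times produce exactly Poisson-distributed counts.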

Inter-arrival times and the exponential

The time between successive detections follows an exponential distribution. If the detection rate is λ, the probability density of waiting time t is f(t) = λ e^(−λt). Short intervals are most common; long intervals are exponentially rare. The histogram in this simulation shows exactly this pattern — a rapid decay from the left, with an occasional long gap. The exponential and Poisson distributions are two sides of the same coin: Poisson counts events in fixed intervals, exponential measures intervals between events.
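A minimal sketch, assuming the parameters shown in the simulation panel (activity 5.0 decays/s, detector sensitivity 80%, so an effective detection rate of 4.0 per second), shows the expected mean interval of 1/λ = 250 ms and the lopsidedness of the exponential:

```python
import random
import statistics

activity = 5.0      # decays per second (assumed, from the simulation panel)
sensitivity = 0.8   # fraction of decays the detector registers (assumed)
rate = activity * sensitivity  # effective detection rate: 4.0 per second

random.seed(7)
intervals = [random.expovariate(rate) for _ in range(100_000)]

mean_ms = statistics.mean(intervals) * 1000
print(f"mean interval ~ {mean_ms:.1f} ms (expected {1000 / rate:.1f} ms)")

# Short intervals dominate: the fraction below the mean is 1 - 1/e, about 63%
below = sum(1 for x in intervals if x < 1 / rate) / len(intervals)
print(f"fraction shorter than the mean: {below:.3f}")
```

Roughly 63% of the gaps are shorter than the mean gap, which is why the histogram piles up on the left even though long silences occasionally appear.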

The chi-squared goodness of fit

The χ² statistic measures how well the observed histogram matches the theoretical exponential curve. For each histogram bin, we compute (observed − expected)² / expected, then sum over all bins. Dividing by the degrees of freedom (number of bins minus parameters estimated minus one) gives χ²/dof. A value near 1.0 means the data fit the theory well; a value much larger than 1 means the data deviate significantly from the model. As you collect more counts, watch this value settle toward 1 — the statistical law asserting itself through the noise.
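The computation above can be sketched end to end. This is an illustrative reimplementation, not the simulation's actual code; the rate, bin width, and sample size are assumptions. The rate is estimated from the data (one fitted parameter), so dof = bins − 1 − 1:

```python
import math
import random
import statistics

random.seed(3)
true_rate = 4.0  # detections per second (assumed effective rate)
intervals = [random.expovariate(true_rate) for _ in range(20_000)]

# Estimate the rate from the data, as the simulation must
rate_hat = 1.0 / statistics.mean(intervals)

# Bin the inter-arrival histogram: 20 bins of 50 ms, covering 0 to 1 s
n_bins, bin_w = 20, 0.05
observed = [0] * n_bins
for x in intervals:
    i = int(x / bin_w)
    if i < n_bins:
        observed[i] += 1

# Expected count per bin from the exponential CDF: N * (F(right) - F(left))
def cdf(t):
    return 1.0 - math.exp(-rate_hat * t)

total = len(intervals)
chi2 = 0.0
for i in range(n_bins):
    expected = total * (cdf((i + 1) * bin_w) - cdf(i * bin_w))
    chi2 += (observed[i] - expected) ** 2 / expected

dof = n_bins - 1 - 1  # bins minus one estimated parameter minus one
print(f"chi-squared / dof = {chi2 / dof:.2f}")
```

Because the data really are exponential here, χ²/dof lands near 1; feeding in intervals from any other distribution would push it well above 1.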

Order from randomness

This is the deep lesson of the Geiger counter. Each click is unpredictable — you cannot know when the next one will come. Yet the pattern of clicks, viewed in aggregate, follows a precise mathematical law. The histogram converges to a smooth exponential curve. The count rate stabilizes. The χ² value approaches 1. Individual events are random; collective behavior is deterministic. This is the foundation of statistical mechanics, of thermodynamics, of actuarial science — the insight that large numbers impose order on chaos, not by constraining individual outcomes, but by constraining their distribution.