Given known constraints (e.g., mean, variance, support), the least biased probability distribution consistent with them is the one that maximizes Shannon entropy (differential entropy in the continuous case). This is the Maximum Entropy (MaxEnt) principle.
The Gaussian maximizes entropy for fixed mean and variance; the exponential distribution maximizes entropy for fixed mean on [0, ∞); the uniform distribution maximizes entropy over a bounded support. Choosing any other distribution encodes information beyond the stated constraints, and if no such information is actually available, the deviation is unwarranted.
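As a quick numerical check of the first claim, the sketch below compares closed-form differential entropies (in nats) of three zero-mean distributions matched to the same variance: the Gaussian, the Laplace, and the uniform. The entropy formulas used are standard results; the variable names (`h_gauss`, etc.) are illustrative.

```python
import math

sigma2 = 1.0  # common variance shared by all three distributions

# Differential entropies in nats, each distribution scaled to variance sigma2:
# Gaussian:  h = 0.5 * ln(2*pi*e*sigma^2)
h_gauss = 0.5 * math.log(2 * math.pi * math.e * sigma2)
# Laplace with scale b, variance 2*b^2 = sigma2:  h = 1 + ln(2*b)
h_laplace = 1 + math.log(2 * math.sqrt(sigma2 / 2))
# Uniform on [-a, a], variance a^2/3 = sigma2:  h = ln(2*a)
h_uniform = math.log(2 * math.sqrt(3 * sigma2))

# The Gaussian should dominate any other distribution with the same variance.
assert h_gauss > h_laplace > h_uniform
print(round(h_gauss, 4), round(h_laplace, 4), round(h_uniform, 4))
```

Repeating the comparison at any other variance gives the same ordering, since each entropy shifts by the same additive constant 0.5·ln(σ²).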