On entropy and the arrow of time


Here is a strange fact: almost every fundamental law of physics is symmetric in time. Take the equations governing a particle collision — classical mechanics, electromagnetism, quantum mechanics, even general relativity — and run them backwards. What you get is still valid physics. A movie of two billiard balls colliding, played in reverse, looks entirely plausible. A movie of an egg unscrambling, yolk and white reassembling, shell reknitting itself on the counter — that looks like a magic trick. The physics permits it. It never happens.

The second law of thermodynamics is the reason. Entropy — loosely, the degree of disorder in a system — tends to increase. The egg scrambles because there are vastly more scrambled configurations than unscrambled ones. Probability alone drives the arrow. But this explanation, when you follow it far enough back, turns into one of the deepest puzzles in physics: why was entropy low to begin with?

Ludwig Boltzmann understood this in the 1870s. His H-theorem attempted to prove, from first principles, that entropy necessarily increases, that the second law followed from the microscopic physics of colliding molecules. But his contemporaries noticed something troubling. Loschmidt pointed out that if the microscopic laws are time-symmetric, then for every trajectory along which entropy increases there is a time-reversed trajectory along which it decreases. Zermelo later pressed a second objection: by Poincaré's recurrence theorem, an isolated system must eventually return arbitrarily close to its initial state, low entropy and all. So the H-theorem could not be a pure consequence of the dynamics. Boltzmann had quietly assumed that molecular velocities are uncorrelated before each collision, an assumption that in effect privileges special low-entropy initial conditions, without acknowledging it. These objections caused Boltzmann enormous distress and arguably contributed to his eventual suicide. He had proved something real, but the proof rested on a foundation he hadn't examined.

The honest statement of the second law is probabilistic: a system that starts in a low-entropy state will, with overwhelming probability, evolve toward higher entropy, because high-entropy states vastly outnumber low-entropy ones. This is not a deep dynamical law but a counting argument. The universe has an arrow of time not because the laws of physics prefer the future, but because the past — our past — happened to be a very unusual, very low-entropy state. The question "why does entropy increase?" turns into "why was entropy ever low?" And that question points all the way back to the Big Bang.
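The counting argument can be made concrete with a toy model of my own, not anything specific to the essay's sources: N particles, each landing independently in the left or right half of a box. A "macrostate" is just the count on the left; its multiplicity is a binomial coefficient, and the balanced, scrambled macrostates swamp the ordered ones.

```python
from math import comb

# Toy model: N gas particles, each independently in the left or right
# half of a box. A macrostate is the count k of particles on the left;
# its multiplicity (number of microstates) is C(N, k).
N = 100
total = 2 ** N  # all microstates, each equally likely

# The perfectly ordered state (all particles on the left) has exactly
# one microstate; the balanced state has about 1e29 of them.
ordered = comb(N, 0)        # 1
balanced = comb(N, N // 2)  # roughly 1e29

# Fraction of ALL microstates whose count lies within 10 of N/2:
near_half = sum(comb(N, k) for k in range(40, 61)) / total
print(near_half)  # about 0.96: a random microstate is almost surely "scrambled"
```

Nothing dynamical is happening here; the arrow emerges purely from counting, which is the essay's point. Starting anywhere and wandering at random, a system almost always wanders into the enormous balanced region and almost never back out.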

The physicist Roger Penrose has argued that the initial state of the universe must have been extraordinarily improbable, calibrated to a precision that defies ordinary description. He estimates the "specialness" of the Big Bang at one part in ten to the power of ten to the power of 123. That number is not a measurement; it is an estimate of how small a fraction of all possible initial conditions resemble ours, obtained by comparing the entropy of the early universe to the vastly larger entropy the same contents could carry. The arrow of time is real, but its source is not in the laws of physics; it lies in the boundary conditions at the universe's beginning. We live in the wake of that improbable beginning, and we call the direction it points "the future."

What makes this philosophically electric is what it implies about memory. Memory is low-entropy structure — records of the past encoded in brains, computers, fossils, light from distant stars. The reason we remember the past and not the future is not that time has a preferred direction at the fundamental level. It is that records can only form in the direction of increasing entropy, because record-formation requires irreversible processes. A photograph records a moment by absorbing photons in an irreversible way. A memory forms through synaptic changes in the brain that, once made, overwhelmingly tend not to reverse spontaneously. The past is the direction from which our current low-entropy state evolved; it is accessible because we are its residue. The future is the direction toward which entropy increases; it leaves no records because it hasn't happened yet.

This means that the arrow of time, memory, and causation are all aspects of the same thermodynamic asymmetry. We infer the past from its traces precisely because past states were lower-entropy and thus generated more distinctive configurations in the present. A broken egg encodes a lot of information about its past — there was a whole egg before — but the scramble tells you little about its specific future. High-entropy states are less informative about their origins. This is not metaphorical: the mathematics of inference and the mathematics of entropy are formally connected, through Shannon's information theory, in ways Boltzmann did not live to see.
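A glimpse of that formal connection, in miniature: Shannon's entropy measures how uncertain a distribution is, and for equally likely alternatives it reduces to the same logarithm-of-a-count that sits inside Boltzmann's formula. The two distributions below are invented for illustration.

```python
from math import log2

def shannon_entropy(probs):
    """Shannon entropy H = -sum(p * log2(p)), measured in bits."""
    return -sum(p * log2(p) for p in probs if p > 0)

# A sharply peaked distribution is a low-entropy state: nearly all the
# probability sits on one configuration, so it carries a distinctive,
# easily read record. A uniform distribution is maximally scrambled.
peaked = [0.97, 0.01, 0.01, 0.01]    # an unbroken egg of a distribution
uniform = [0.25, 0.25, 0.25, 0.25]   # the scramble

print(shannon_entropy(peaked))   # low: about 0.24 bits
print(shannon_entropy(uniform))  # maximal: log2(4) = 2 bits

# For W equally likely microstates, H = log2(W) -- the same count that
# appears in Boltzmann's S = k ln W, just in different units.
```

The inference reading follows directly: observing a system in a low-entropy, peaked state pins down almost everything about it, while a maximum-entropy state is consistent with the largest possible set of histories and so tells you the least about where it came from.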

I find myself thinking about what it would mean if entropy were decreasing — if we lived in a universe running the other way. The laws of physics would permit it. The beings in such a universe would call their low-entropy direction "the future" just as we do, would remember their low-entropy past, would find it as natural as we find ours. There is no view from outside time from which to call one direction forward. The arrow is not inscribed in the laws; it is inherited from a beginning. Whatever made the Big Bang so extraordinarily improbable, so precisely calibrated toward low entropy — that is the real source of time's direction. The second law is its shadow, stretching across the fourteen billion years since.
