Consider two strings. The first is a million zeros. The second is a million random bits. The first compresses beautifully — "print zero a million times" is a tiny program — but the second does not. No short program generates a genuinely random string, because if a short program generated it, it wouldn't be random. Kolmogorov complexity measures exactly this: the length of the shortest program that produces a given string. Simple strings have low Kolmogorov complexity. Random strings have high Kolmogorov complexity.
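The contrast can be made concrete with an ordinary compressor. Kolmogorov complexity itself is uncomputable, but a general-purpose compressor like zlib gives a rough upper bound: the compressed size is the length of one particular "program" (decompressor plus data) that reproduces the string. A minimal sketch:

```python
import zlib
import os

# A million zero bytes vs. a million random bytes.
zeros = b"\x00" * 1_000_000
random_bytes = os.urandom(1_000_000)

# Compressed size upper-bounds Kolmogorov complexity (up to a constant).
compressed_zeros = zlib.compress(zeros, level=9)
compressed_random = zlib.compress(random_bytes, level=9)

# The zeros collapse to roughly a kilobyte -- the run-length analogue of
# "print zero a million times". The random bytes barely shrink at all:
# no regularity means no shorter description for the compressor to find.
print(len(compressed_zeros))
print(len(compressed_random))
```

The exact sizes depend on the compressor, but the gap is always stark: three orders of magnitude for the zeros, essentially nothing for the random bytes.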
Here is the puzzle. A living cell has high Kolmogorov complexity. So does a random string of the same length. But a living cell and a random string are not the same kind of thing. The random string is complex in a way that is, in some sense, trivial — any sufficiently long random string is incompressible, and there is nothing interesting about any particular one of them. The cell is complex in a way that took billions of years of evolution to produce. It contains organized, functional structure that does something. Kolmogorov complexity cannot distinguish between these two kinds of complexity, and this is a real gap in the theory.
Charles Bennett noticed the gap and proposed a remedy he called logical depth. The key insight is that while the length of the shortest program for a string is one measure of its complexity, the running time of that program is a different measure, and it tracks something the length doesn't. A random string has only a trivial description — a listing of its bits — and executing that description takes no work. A cell's genome has a description that, when executed (in the right environment, in the right way, over the right timescale), produces something genuinely intricate. The execution time is the depth. Deep objects are objects that took a long time to compute, in the sense that any short program that produces them must run for a long time to do so.
What logical depth is capturing is something like the difference between a compressed file and a developed photograph. Both are "complex" in the sense of containing a lot of information. But the photograph represents a long history — of photons, of chemistry, of whatever was in front of the lens. It bears the traces of a process. The compressed file is just a string of bits with no particular history. Depth measures how much irreducible history a thing encodes: how much computation had to happen, in sequence, to reach this state from a simple starting point.
This connects to something I find genuinely mysterious about emergence. The standard question in emergence studies is whether high-level properties are reducible to low-level properties — whether biology is just chemistry, whether chemistry is just physics. But the interesting question might not be about reducibility. It might be about depth. A biological structure is not mysterious because it violates physics. It is mysterious because it took a very long computational path through physics to arrive at this particular configuration. The mystery is not ontological; it is temporal. What makes life strange is not that it is made of something other than atoms, but that the specific arrangement of those atoms could only have been reached through a process of extraordinary length and specificity. Life is deep in Bennett's sense. Rocks are shallow.
There is an epistemological implication here that I find underappreciated. If logical depth is a real property of objects, then there are things whose nature cannot be understood quickly. Not because they are complicated in a way that requires more information, but because they are deep in a way that requires more time. Understanding a deep object means tracing the computation that produced it, and that trace is, by definition, long. You cannot shortcut it without losing something. The evolutionary history of an organism is not a collection of facts you could in principle list; it is a process you can only follow. This might be why some understanding feels irreducibly slow — why you cannot be told what a concept means but have to work through it, why certain insights cannot be transmitted but only achieved.
The concept also reshapes what we should mean by complexity in human artifacts and institutions. A legal system, a market, a language — these are deep in Bennett's sense. They were produced by long processes of selection, adaptation, negotiation, and revision, and they encode the traces of those processes in ways that are not visible from a snapshot. When someone proposes to redesign an institution from scratch, the objection is often made that it is more complex than it looks. Logical depth gives this objection a formal shape: the institution's depth — the computational history it encodes — cannot be reconstructed quickly, and a replacement built without that history will be shallow in ways that may not be immediately apparent but will eventually matter.
What I keep returning to is the relationship between depth and surprise. A shallow object — a random string, a clean theoretical construct — does not surprise you once you understand it. Its complexity is fully present in the description. A deep object surprises you because it keeps revealing structure that the description did not predict. Every layer you peel back discloses another layer. The genome keeps showing new regulatory interactions. The city keeps revealing new spatial patterns. The mathematical object keeps connecting to unexpected domains. This is what depth feels like from the outside: the sense that there is more here than could have been put here directly, that the object is the result of a long computation you can only partially retrace.
Kolmogorov gave us a way to measure how much information a thing contains. Bennett gave us a way to ask how much history it encodes. Both are real. But depth might be the more interesting measure, because it is the one that tracks not just what a thing is, but what it took to become it.