What Does Entropy Measure? More Than Just Disorder

Entropy measures how spread out energy is within a system. The more dispersed the energy, the higher the entropy. This single idea connects thermodynamics, chemistry, information theory, and even cosmology, though it shows up in slightly different forms in each field.

Energy Dispersal, Not “Disorder”

You’ve probably heard entropy described as a measure of disorder. That definition is everywhere in textbooks, but it’s misleading. A 2016 analysis in Communicative & Integrative Biology traced this confusion back to Ludwig Boltzmann, who imagined an idealized gas that could spread out without any change in energy. That thought experiment worked mathematically but detached entropy from what actually drives physical change: the flow and dispersal of energy. The “disorder” framing makes it sound like entropy is about messiness, when it’s really about how energy distributes itself among the available places it can go.

Think of a hot cup of coffee in a cool room. The heat energy is concentrated in the coffee. Over time, that energy spreads into the surrounding air until the coffee and room reach the same temperature. The total energy hasn’t changed, but it’s now dispersed across a much larger space. That dispersal is what entropy tracks. The coffee cooling down isn’t becoming “disordered.” It’s reaching a state where its energy is more evenly spread out.

The Thermodynamic Definition

In classical thermodynamics, the change in entropy is defined as the heat transferred in a reversible process divided by the absolute temperature at which that transfer happens. When ice melts, for example, the entropy change equals the energy absorbed during melting divided by the temperature. A process that absorbs more heat, or occurs at a lower temperature, produces a larger entropy change.
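
If you want to put numbers on that, here is a minimal Python sketch of the ice example, using the commonly tabulated enthalpy of fusion for water (about 6.01 kJ/mol) and the melting point of 273.15 K; the exact figures are illustrative, not part of the original text:

```python
# Entropy change for a reversible transfer of heat: delta_S = q_rev / T
# Standard textbook values for melting one mole of ice.
q_rev = 6010.0   # heat absorbed on melting, in joules per mole (enthalpy of fusion)
T = 273.15       # absolute temperature of the phase change, in kelvin

delta_S = q_rev / T
print(f"Entropy of fusion: {delta_S:.1f} J/(mol*K)")  # roughly 22 J/(mol*K)
```

The same division shows why a given amount of heat "counts for more" at low temperature: the smaller the denominator, the larger the entropy change.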

This is why phase changes are such clear illustrations. Ice absorbing heat at 0°C gains entropy because its molecules shift from a rigid crystal into a liquid where energy is distributed across many more possible motions. Boiling water into steam is an even larger entropy increase, since gas molecules can move freely in all directions, spreading energy across far more configurations than a liquid allows.

Counting Microstates

At the molecular level, entropy has a precise statistical meaning. Boltzmann’s famous equation, S = k ln(W), says that entropy (S) equals a constant (k) multiplied by the natural logarithm of W, the number of microstates available to a system. A microstate is one specific arrangement of all the particles and their energies. W counts how many such arrangements are consistent with what you observe at the large scale (temperature, pressure, volume).

A block of ice has relatively few microstates because its molecules are locked in place. Melt that ice, and the number of possible arrangements explodes. The molecules can move, rotate, and vibrate in many more ways, so W skyrockets and entropy increases. The Boltzmann constant (k) that connects the microscopic count to measurable entropy has a value of 1.380649 × 10⁻²³ joules per kelvin, an extraordinarily small number reflecting the enormous number of molecules involved in everyday systems.
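
A toy counting model makes the formula concrete. The snippet below is not the ice system itself; it is the standard textbook exercise of sharing identical energy quanta among a few oscillators (an Einstein-solid-style illustration, introduced here only as an assumption for the example):

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, in joules per kelvin

def microstates(quanta: int, oscillators: int) -> int:
    """Number of ways to distribute indistinguishable energy quanta
    among distinguishable oscillators (a 'stars and bars' count)."""
    return math.comb(quanta + oscillators - 1, quanta)

def boltzmann_entropy(W: int) -> float:
    """S = k ln(W)"""
    return k_B * math.log(W)

W_low = microstates(quanta=3, oscillators=4)    # little energy to spread: 20 microstates
W_high = microstates(quanta=30, oscillators=4)  # same oscillators, more energy: 5456 microstates

print(W_low, boltzmann_entropy(W_low))
print(W_high, boltzmann_entropy(W_high))        # more microstates, higher entropy
```

Giving the same system more energy to spread around multiplies the number of possible arrangements, and the logarithm of that count is what entropy records.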

Entropy in Information Theory

In 1948, Claude Shannon borrowed the concept for an entirely different purpose: measuring uncertainty in a message. Shannon entropy quantifies how much “surprise” a source of information contains. If you know exactly what someone is about to say, there’s no uncertainty and the entropy is zero. If every possible message is equally likely, the entropy is at its maximum.

Shannon’s formula looks remarkably similar to Boltzmann’s. For a set of possible outcomes with probabilities p₁, p₂, and so on, the entropy H equals the negative sum of each probability multiplied by its logarithm. When the logarithm is base 2, the result is measured in bits. A fair coin flip has an entropy of 1 bit, meaning each flip gives you exactly one bit of information. A loaded coin that lands heads 99% of the time has much lower entropy, because the outcome is rarely surprising.
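
You can check both coins with a few lines of Python; this is just a direct transcription of the formula described above, not anything beyond it:

```python
import math

def shannon_entropy(probabilities):
    """H = -sum(p * log2(p)) over outcomes with nonzero probability, in bits."""
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

fair_coin = [0.5, 0.5]
loaded_coin = [0.99, 0.01]

print(shannon_entropy(fair_coin))    # 1.0 bit per flip
print(shannon_entropy(loaded_coin))  # about 0.08 bits: the outcome is rarely surprising
```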

This isn’t just an analogy. Shannon himself noted that his formula has the same mathematical form as entropy in statistical mechanics. Both measure the number of ways something can turn out. In physics, it’s the number of ways energy can be arranged among particles. In information theory, it’s the number of ways a message can unfold.

Why Entropy Determines What Can Happen

The Second Law of Thermodynamics states that the total entropy of the universe never decreases. For any process that happens on its own, the combined entropy of the system and its surroundings must increase (or, in an idealized reversible process, stay exactly the same). This is why heat flows from hot to cold and never the reverse, why gas expands to fill a room, and why you can’t unscramble an egg.

In chemistry, entropy directly determines whether a reaction will happen spontaneously. The Gibbs free energy equation combines the entropy change with the energy change: ΔG = ΔH − TΔS, where ΔH is the change in enthalpy (the heat absorbed or released at constant pressure), T is the absolute temperature, and ΔS is the entropy change. A reaction is spontaneous when ΔG is negative. This means a reaction that increases entropy can proceed even if it absorbs heat, as long as the temperature is high enough for the entropy term (TΔS) to outweigh the energy cost. That’s exactly why some substances dissolve in hot water but not cold.

A reaction that releases heat and increases entropy is spontaneous at every temperature. One that absorbs heat and decreases entropy will never be spontaneous on its own. The interesting cases fall in between, where temperature tips the balance.
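
Melting ice is a tidy example of that in-between case. Using the same approximate values as before (ΔH ≈ +6.01 kJ/mol, ΔS ≈ +22 J/(mol·K), rounded illustrative figures), a short sketch shows ΔG flipping sign right around 273 K, which is why melting happens spontaneously above 0°C but not below:

```python
# delta_G = delta_H - T * delta_S
# Approximate values for melting ice: absorbs heat, increases entropy.
delta_H = 6010.0   # J/mol
delta_S = 22.0     # J/(mol*K)

for T in (263.15, 273.20, 283.15):   # about -10 C, 0 C, and +10 C
    delta_G = delta_H - T * delta_S
    print(f"T = {T:.2f} K: delta_G = {delta_G:8.1f} J/mol, spontaneous: {delta_G < 0}")
```

The crossover temperature is simply ΔH divided by ΔS, which with these numbers lands almost exactly at the melting point.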

Entropy and the Arrow of Time

Nearly all the fundamental laws of physics work the same whether time runs forward or backward. Gravity, electromagnetism, and quantum mechanics don’t have a built-in direction. Entropy is the exception. The physicist Arthur Eddington coined the phrase “time’s arrow” to describe this: if you follow the direction in which entropy increases, you’re moving toward the future.

This connection rests on the observation that the early universe started in an extraordinarily low-entropy state, with energy concentrated in dense, hot matter. Every process since then has been spreading that energy out. Stars radiate light into cold space. Hot cores cool. Structures eventually decay. The Second Law doesn’t explain why the universe began with low entropy, but given that starting point, the steady increase in entropy defines the direction we experience as “forward in time.”

Taken to its logical conclusion, entropy predicts a far-future state sometimes called the heat death of the universe. If the universe continues expanding and entropy keeps rising, energy will eventually be distributed so uniformly that no temperature differences remain. Without gradients to drive any process, no work can be done, no life can exist, and no meaningful change occurs. Estimates for this timescale are staggering. The decay time for a supermassive black hole alone is roughly 10¹⁰⁰ years, and heat death would follow only after even those remnants have dissipated.

What Entropy Actually Tells You

Across all its uses, entropy answers the same core question: how many ways can things be arranged? In a physical system, it counts the ways energy can spread among particles. In a message, it counts the ways a sequence of symbols can play out. A high-entropy state has many possible configurations. A low-entropy state has few. Nature overwhelmingly favors high-entropy states for the simple reason that there are vastly more of them, making them statistically almost certain.
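
A quick counting exercise shows how lopsided those numbers get, even for a small system; the coin flips below are purely illustrative:

```python
import math

n = 100  # 100 coin flips, or 100 particles that can each be in one of two states

all_heads = math.comb(n, n)            # exactly 1 arrangement
even_split = math.comb(n, n // 2)      # about 1e29 arrangements

print(all_heads)    # 1
print(even_split)   # 100891344545564193334812497256
```

With only 100 two-state particles, the evenly mixed outcome already outnumbers the perfectly ordered one by a factor of about 10²⁹. Scale that up to the roughly 10²³ molecules in a glass of water, and "statistically almost certain" becomes an understatement.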

That’s the deepest insight entropy offers. It’s not a force. It’s not a substance. It’s a count of possibilities, and the universe relentlessly moves toward the arrangements that are most probable.