When Does Entropy Increase: Causes and Examples

Entropy increases whenever a system moves from an ordered state toward a more disordered one, and it does so in every irreversible process in nature. The second law of thermodynamics guarantees this: in an isolated system, entropy grows until the system reaches equilibrium, then stops. But the details of when, why, and how much entropy increases depend on what’s happening, whether it’s ice melting, gas expanding, or a living cell burning fuel.

The Second Law: The Core Rule

The second law of thermodynamics is the governing principle. In an isolated system (one that exchanges no energy or matter with its surroundings), entropy always increases over time until the system reaches a stable equilibrium state. At that point, entropy hits its maximum value and stays constant. The rate of entropy generation is positive whenever the system is out of equilibrium and drops to zero once equilibrium is reached.

This means entropy increase isn’t something that happens under special circumstances. It’s the default behavior of every isolated system that hasn’t yet settled into its most probable arrangement. Heat flowing from hot to cold, gases spreading to fill a container, friction converting motion into warmth: these are all expressions of the same underlying tendency.

The Statistical Picture

At the microscopic level, entropy is defined by the Boltzmann equation: S = k ln W, where k is Boltzmann’s constant and W is the number of microstates, the distinct microscopic arrangements that correspond to a given macroscopic state. Systems naturally drift toward the arrangement with the greatest number of microstates because those arrangements are overwhelmingly more probable. Entropy increases because the system is simply moving toward its most likely configuration.

Think of it this way. If you shuffle a sorted deck of cards, there are astronomically more disordered arrangements than ordered ones, so any random shuffle almost certainly increases the disorder. The same logic applies to atoms and molecules, just with numbers so large they make the tendency essentially irreversible.
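The card analogy can be made quantitative with the Boltzmann equation. The sketch below (assuming nothing beyond S = k ln W) computes the entropy of a shuffled 52-card deck, treating each ordering as one microstate. Even with 52! possible arrangements, the resulting entropy is minuscule, which shows why molecular systems, where ln W is on the order of 10²³, dominate thermodynamics:

```python
import math

# Boltzmann's constant in joules per kelvin
k_B = 1.380649e-23

def boltzmann_entropy(ln_W):
    """Entropy S = k ln W, taking ln(W) directly to avoid overflow."""
    return k_B * ln_W

# A shuffled 52-card deck has 52! possible orderings (microstates).
ln_W_deck = math.lgamma(53)          # ln(52!) via the log-gamma function
S_deck = boltzmann_entropy(ln_W_deck)

print(f"ln(52!) = {ln_W_deck:.1f}")    # about 156.4
print(f"S_deck  = {S_deck:.3e} J/K")   # about 2.2e-21 J/K -- tiny
```

Despite 52! being astronomically large (about 8 × 10⁶⁷), its logarithm is only ~156, so the deck's "entropy" is twenty orders of magnitude below a joule per kelvin. For a mole of gas, ln W is roughly 10²³ times larger, which is why molecular disorder is effectively irreversible.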

Everyday Irreversible Processes

Several common processes reliably increase entropy:

  • Gas expanding into a vacuum. When a gas is released into a larger volume, the molecules spread out. More volume means more possible positions for each molecule, so the number of microstates skyrockets and entropy rises.
  • Mixing. When two different gases or liquids mix, each substance's molecules gain access to the full combined volume, multiplying the number of possible arrangements. This is why mixing is spontaneous: the entropy gain from combining the substances drives the process forward. The same principle applies to dissolving a solute in a solvent.
  • Friction. When you rub your hands together, the organized kinetic energy of your moving palms converts into disorganized thermal energy spread across billions of molecules. That’s an entropy increase. You can’t reverse it by warming your hands and expecting them to start sliding on their own.
  • Heat flowing from hot to cold. A hot cup of coffee in a cool room will always lose heat to the surrounding air, never the reverse. The thermal energy spreads from a concentrated region to a diffuse one, increasing the total number of microstates.
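Two of the processes above can be checked with quick numbers. The sketch below (the heat quantity and temperatures are illustrative choices, not from the text) uses the standard ideal-gas result ΔS = nR ln(V₂/V₁) for free expansion, and ΔS = Q/T_cold − Q/T_hot for heat flowing between two reservoirs:

```python
import math

R = 8.314  # gas constant, J/(mol*K)

# 1. Free expansion: 1 mol of ideal gas doubles its volume.
#    dS = n R ln(V2 / V1) -- positive whenever the gas expands.
n = 1.0
dS_expansion = n * R * math.log(2.0)      # about +5.76 J/K

# 2. Heat flow: Q joules leave a hot body at T_hot and enter a
#    cold one at T_cold. Total entropy change of the pair:
#    dS = Q/T_cold - Q/T_hot, positive whenever T_hot > T_cold.
Q, T_hot, T_cold = 1000.0, 350.0, 293.0   # hot coffee vs. room air, roughly
dS_heat = Q / T_cold - Q / T_hot          # about +0.56 J/K
```

In both cases the sign is forced: the logarithm is positive whenever the volume grows, and the heat-flow expression is positive whenever heat moves from hot to cold. Running either process in reverse would require a negative total entropy change, which the second law forbids.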

Phase Changes and Chemical Reactions

Entropy increases during phase transitions that add molecular freedom. When a solid melts, molecules break free of their rigid lattice and can move around, increasing the number of accessible microstates. When a liquid boils, the jump is even larger because gas molecules occupy far more volume and move far more freely than liquid molecules. Freezing and condensation, by contrast, decrease the entropy of the substance itself (though the entropy of the surroundings increases enough to compensate if the process is spontaneous).
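The size of these phase-change jumps follows from ΔS = ΔH/T for a reversible transition at constant temperature. A short sketch using standard handbook values for water makes the solid-to-liquid versus liquid-to-gas comparison concrete:

```python
# Entropy jump at a phase transition: dS = dH / T (reversible, constant T).
# Values below are standard handbook figures for water.

dH_fus, T_fus = 6010.0, 273.15      # J/mol and K, melting
dH_vap, T_vap = 40650.0, 373.15     # J/mol and K, boiling

dS_fus = dH_fus / T_fus             # about 22 J/(mol*K)
dS_vap = dH_vap / T_vap             # about 109 J/(mol*K)

# Boiling produces roughly five times the entropy jump of melting,
# reflecting the much larger gain in volume and freedom in the gas phase.
ratio = dS_vap / dS_fus
```

The factor-of-five ratio quantifies the claim above: escaping the lattice adds some freedom, but escaping the liquid entirely adds far more.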

In chemical reactions, the key factor is often the number of gas molecules produced. A reaction that breaks one molecule into two or more gas molecules generally increases entropy because there are more independent particles moving in more possible arrangements. For example, at temperatures above about 4,000 K, hydrogen molecules dissociate into individual hydrogen atoms: at such high temperatures the entropy gained by the separated atoms outweighs the energy cost of breaking the bond, so dissociation becomes favorable. Conversely, reactions that reduce the number of gas molecules, like the synthesis of ammonia from nitrogen and hydrogen (four reactant molecules becoming two product molecules), tend to decrease entropy.
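The ammonia example can be verified from tabulated standard molar entropies. The sketch below uses approximate 298 K handbook values (the exact figures vary slightly by source):

```python
# Standard-state entropy change for N2 + 3 H2 -> 2 NH3,
# using approximate 298 K handbook values in J/(mol*K).
S_N2, S_H2, S_NH3 = 191.6, 130.7, 192.5

# Products minus reactants, weighted by stoichiometric coefficients.
dS_rxn = 2 * S_NH3 - (S_N2 + 3 * S_H2)   # about -199 J/(mol*K)

# Negative, as expected: four gas molecules become two,
# so the number of independent gas-phase particles drops.
```

The result, roughly −199 J/(mol·K), confirms the counting argument: halving the number of gas molecules costs the system a substantial amount of entropy, which is part of why ammonia synthesis requires carefully chosen conditions.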

Living Systems: A Local Exception, Not a Violation

Living organisms appear to defy the second law by maintaining highly ordered internal structures. But they don’t violate it. They simply aren’t isolated systems. Organisms are open systems that constantly exchange energy and matter with their surroundings.

Here is how it works: your body takes in organized, low-entropy nutrients (food) and breaks them down through metabolism. This process generates heat and waste products that carry high entropy. By exporting that entropy as heat and chemical waste, your body keeps its own internal entropy low. The human body produces roughly 480 J per kelvin per liter per day of entropy that must be rapidly exported as heat. The net effect is that you stay ordered while the entropy of your environment increases by more than enough to satisfy the second law. Living tissue maintains its low entropy through two key mechanisms: keeping metabolic reactions far from equilibrium (which accounts for about 40 to 50 J per kelvin per liter of entropy reduction) and compartmentalizing molecules into specific cellular regions.

The Entropy Floor: Absolute Zero

Entropy has a lower bound. The third law of thermodynamics states that a pure, perfect crystalline substance at absolute zero (0 K, or about minus 273°C) has an entropy of exactly zero. At that point, every atom is locked in place in a single, perfectly ordered arrangement, so there is only one microstate. The Boltzmann equation gives S = k ln(1) = 0.

This means that at any temperature above absolute zero, every substance has positive entropy. As you warm a material from absolute zero, each bit of added heat creates new molecular motion, new accessible microstates, and a corresponding entropy increase. The absolute entropy of any substance at a given temperature is the sum of all the entropy it acquired while warming from zero to that temperature, including any jumps from phase changes along the way.
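The warming-from-zero sum described above can be sketched numerically. This is a toy calculation for a hypothetical substance: the heat capacities, Debye cutoff, and melting data below are made-up illustrative values, and each phase is assumed to have a constant heat capacity so that the integral of Cp/T reduces to Cp · ln(T₂/T₁):

```python
import math

# Absolute entropy as accumulated Cp/T integrals plus phase-change jumps.
# All numbers below are illustrative, not real material data.
Cp_solid, Cp_liquid = 25.0, 75.0    # heat capacities, J/(mol*K)
T_debye = 15.0                      # below this, assume Cp ~ a*T^3 (Debye law)
T_melt, dH_melt = 273.0, 6000.0     # melting point (K) and enthalpy (J/mol)
T_final = 298.0                     # target temperature, K

# Debye region: with Cp = a*T^3, the integral of Cp/T from 0 to T0
# converges to Cp(T0)/3 -- no divergence at absolute zero.
S = Cp_solid / 3.0

# Solid phase from the cutoff to the melting point: Cp * ln(T2/T1).
S += Cp_solid * math.log(T_melt / T_debye)

# Entropy jump at melting: dS = dH / T.
S += dH_melt / T_melt

# Liquid phase up to the final temperature.
S += Cp_liquid * math.log(T_final / T_melt)
```

Notice how the third law makes the starting point well defined: because Cp vanishes as T³ near absolute zero, the integral converges and the running total starts from S = 0 rather than diverging.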

Entropy on a Cosmic Scale

The expansion of the universe itself drives entropy increase on the largest possible scale. As the cosmos expands, information available to any local observer becomes more diluted. A local observer enclosed within a cosmological horizon sees fewer bits of information per unit of cosmic volume as time passes. This dilution of information corresponds directly to an increase in perceived thermodynamic entropy, linking the second law to the geometry of spacetime itself.

Black holes represent the most extreme concentrations of entropy in the known universe. A black hole’s entropy is proportional to the area of its event horizon, not its volume, a relationship captured by the Bekenstein-Hawking formula. Because the area of a black hole never decreases in classical physics (it can only grow as matter falls in), black hole entropy is a monotonically increasing quantity. When two black holes merge, the surface area of the resulting black hole exceeds the combined areas of the original two, producing a net entropy increase.
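The Bekenstein-Hawking result can be evaluated directly. For a Schwarzschild black hole the horizon area is A = 16πG²M²/c⁴, and substituting into S = k_B c³A/(4Għ) gives S = 4πG k_B M²/(ħc). The sketch below computes this for a solar-mass black hole and illustrates the area theorem's entropy increase in a merger (ignoring energy radiated as gravitational waves, a simplification):

```python
import math

# Physical constants (SI)
G     = 6.674e-11    # gravitational constant, m^3/(kg*s^2)
hbar  = 1.055e-34    # reduced Planck constant, J*s
c     = 2.998e8      # speed of light, m/s
k_B   = 1.381e-23    # Boltzmann constant, J/K
M_sun = 1.989e30     # solar mass, kg

def bh_entropy(M):
    """Bekenstein-Hawking entropy (J/K) of a Schwarzschild black hole."""
    return 4 * math.pi * G * k_B * M**2 / (hbar * c)

S_sun = bh_entropy(M_sun)   # about 1.4e54 J/K, i.e. roughly 1e77 k_B

# Area theorem in miniature: horizon area scales as M^2, so two merging
# equal-mass holes (final mass ~2M, radiation neglected) end with
# area ~4M^2 > 2M^2 -- a net entropy increase.
assert bh_entropy(2 * M_sun) > 2 * bh_entropy(M_sun)
```

The scale is striking: a single solar-mass black hole carries around 10⁷⁷ k_B of entropy, vastly more than the star that formed it, which is why black holes dominate the entropy budget of the universe.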

In the current era, the thermodynamic entropy detectable by local observers is higher than in earlier periods of cosmic history. The universe started in an extraordinarily low-entropy state (smooth, nearly uniform density with tiny fluctuations), and it has been increasing in entropy ever since as those small perturbations grew into galaxies, stars, and the large-scale structure we observe today.