The second law of thermodynamics says that energy in any system naturally spreads out over time and never spontaneously concentrates itself again. A hot cup of coffee always cools down to room temperature; it never absorbs heat from the room and gets hotter on its own. That one-way tendency governs everything from why engines waste fuel to why time seems to move in only one direction.
Energy Spreads Out, and That’s Entropy
The word you’ll hear most in any discussion of this law is “entropy.” Entropy is a measure of how dispersed the energy in a system is. When energy is concentrated in one place, like the heat in a fresh cup of coffee, entropy is relatively low. As that heat spreads into the surrounding air, entropy increases. The second law states that in any process left to itself, total entropy either stays the same or increases. It never decreases on its own.
Older textbooks often define entropy as “disorder,” but that framing has drawn significant criticism from physicists and chemistry educators. Disorder is vague and hard to measure, while energy dispersal is something you can actually calculate. A more precise way to think about it: entropy tracks how many different microscopic arrangements of atoms and energy are possible in a given state. A system with more possible arrangements has higher entropy. Ice has relatively few arrangements because its molecules are locked in a crystal. Liquid water has far more, which is why melting ice is an entropy-increasing process.
Heat Only Flows One Way Without Help
One of the most intuitive ways to state the second law comes from the 19th-century physicist Rudolf Clausius: heat can never pass from a colder body to a warmer one without something else happening at the same time. You see this every day. A warm drink loses heat to the cooler room around it. Your hand touching a cold railing transfers heat from your skin to the metal, not the other way around.
This doesn’t mean you can’t move heat from cold to hot. Your refrigerator does exactly that, pulling heat from cold food and pushing it into your warmer kitchen. But it can only do so by consuming electricity. That electrical energy is the “something else” Clausius described. Without an external energy input, the natural direction of heat flow is always from hot to cold, because that’s the direction that increases overall entropy.
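The entropy bookkeeping behind this can be sketched in a few lines. The numbers below are illustrative (a 350 K cup losing 100 J to a 293 K room), not measurements from the article: when heat Q leaves a body at temperature T, that body's entropy drops by Q/T, and the body receiving it gains Q/T at its own temperature.

```python
def total_entropy_change(q_joules, t_source_k, t_sink_k):
    """Total entropy change when q_joules flows from t_source_k to t_sink_k.

    The source loses q/T_source of entropy; the sink gains q/T_sink.
    Temperatures are in kelvins.
    """
    return q_joules / t_sink_k - q_joules / t_source_k

# Hot -> cold: total entropy increases, so this happens spontaneously.
forward = total_entropy_change(100, t_source_k=350, t_sink_k=293)

# Cold -> hot: the same arithmetic gives a negative number,
# which is why it never happens without outside help.
reverse = total_entropy_change(100, t_source_k=293, t_sink_k=350)

print(f"hot -> cold: {forward:+.4f} J/K")  # positive
print(f"cold -> hot: {reverse:+.4f} J/K")  # negative
```

Because the colder body gains more entropy per joule than the hotter body loses, the hot-to-cold direction is the only one that raises the total.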
Why No Engine Can Be 100% Efficient
There’s a second classic way to state this law, focused on engines. It says that no device can take in heat and convert all of it into useful work. Some energy always has to be dumped as waste heat into the surroundings. A typical steam power plant, for example, converts only around a third of its heat input into electricity. The rest escapes into the atmosphere.
This isn’t an engineering failure that better technology can fix. It’s a fundamental limit built into the physics. The theoretical maximum efficiency of any heat engine depends on the temperature difference between its hot source and cold exhaust. That ceiling is called the Carnot efficiency: you subtract the ratio of the cold temperature to the hot temperature, both measured on an absolute scale such as kelvins, from 1. The only way to reach 100% would be to have an exhaust at absolute zero, which is physically impossible. Every real engine falls below even this theoretical cap because of friction, heat leaks, and other practical losses.
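The Carnot ceiling is a one-line formula. Here is a minimal sketch, with made-up but plausible plant temperatures (steam at roughly 800 K exhausting to air at roughly 300 K):

```python
def carnot_efficiency(t_hot_k, t_cold_k):
    """Maximum fraction of heat input any engine can turn into work.

    Implements eta_max = 1 - T_cold / T_hot, temperatures in kelvins.
    """
    if t_cold_k <= 0 or t_hot_k <= t_cold_k:
        raise ValueError("need 0 < t_cold_k < t_hot_k (in kelvins)")
    return 1 - t_cold_k / t_hot_k

ceiling = carnot_efficiency(800, 300)
print(f"Carnot ceiling: {ceiling:.1%}")  # 62.5% -- real plants fall well below this
```

Note that reaching 100% would require `t_cold_k` to be zero (absolute zero), which is exactly the case the guard clause rejects.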
This matters far beyond power plants. Your car engine, your body’s metabolism, and every industrial process that converts heat into motion all operate under this same constraint. Some waste is unavoidable.
The Arrow of Time
Most laws of physics work the same whether time runs forward or backward. If you watch a video of two billiard balls colliding and reverse it, both versions look perfectly plausible. The second law is different. It’s what gives time a direction.
An egg falls off a counter and splatters. You never see a splattered egg reassemble itself and leap back up. A drop of ink disperses through a glass of water. It never spontaneously gathers itself back into a droplet. These processes increase entropy, and the second law says entropy in an isolated system only goes up. That one-way increase is what physicists call the “arrow of time.” It’s the reason the past looks different from the future and the reason you can always tell when a video is playing in reverse.
The Microscopic Picture
At the level of individual atoms, the second law is really about probability. Ludwig Boltzmann, a 19th-century physicist, showed that entropy is directly tied to the number of microscopic arrangements (called microstates) that could produce a given large-scale state. His famous equation, carved on his tombstone, relates entropy to the natural logarithm of that number of arrangements.
Think of it this way. If you dump a box of puzzle pieces on the floor, there are millions of ways for those pieces to land in a scattered pile and only one way for them to land as a perfectly assembled puzzle. The scattered states vastly outnumber the ordered one, so a random shake will virtually always produce more scattering, not less. That’s entropy increasing, not because of any mysterious force, but because there are overwhelmingly more disordered arrangements than ordered ones. The second law is, at bottom, a statement about what’s statistically near-certain to happen when you’re dealing with trillions of trillions of particles.
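The counting in the puzzle analogy is easy to make concrete. A minimal sketch with coin flips standing in for particles: for 100 coins there is exactly one "all heads" arrangement, while the "half heads, half tails" state can be realized in about 10^29 ways. Boltzmann's tombstone formula, S = k ln W, then converts those counts into entropies.

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K (exact SI value)

def microstates_half_heads(n):
    """Number of ways n coins can land exactly half heads: C(n, n // 2)."""
    return math.comb(n, n // 2)

def boltzmann_entropy(w):
    """Boltzmann's formula S = k_B * ln(W) for W microstates."""
    return k_B * math.log(w)

n = 100
w_ordered = 1                        # all heads: a single arrangement
w_mixed = microstates_half_heads(n)  # roughly 1e29 arrangements

print(f"mixed arrangements outnumber the ordered one by {w_mixed:.2e} to 1")
print(f"entropy of ordered state: {boltzmann_entropy(w_ordered):.2e} J/K")
print(f"entropy of mixed state:   {boltzmann_entropy(w_mixed):.2e} J/K")
```

The ordered state's entropy is exactly zero (ln 1 = 0), and a random shake of 100 coins is overwhelmingly likely to land near half-and-half, which is the statistical content of the second law in miniature.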
Can Anything Decrease Entropy Locally?
Yes, and this is a common point of confusion. Living things decrease entropy all the time. Your body builds highly ordered proteins from simpler molecules. A plant assembles complex sugars from carbon dioxide and water. A freezer turns liquid water into neatly structured ice crystals. None of this violates the second law, because the law applies to the total entropy of a system plus its surroundings. Your body stays organized by consuming food and releasing heat, increasing entropy in the environment by more than it decreases entropy internally. The freezer runs on electricity, and the power plant generating that electricity produces waste heat. Every local decrease in entropy is paid for by a larger increase somewhere else.
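The freezer example above can be sketched numerically. The figures are illustrative assumptions, not real appliance data: freezing 1 kg of water at 273 K releases about 334 kJ (its latent heat), the kitchen sits at 293 K, and the compressor is assumed to consume 40 kJ of electrical work.

```python
Q_FREEZE = 334_000.0  # J removed from the water as it freezes (latent heat, ~1 kg)
T_WATER = 273.0       # K, freezing point of water
T_ROOM = 293.0        # K, kitchen temperature
WORK_IN = 40_000.0    # J of electrical work (assumed compressor input)

# The water becomes more ordered: its entropy drops.
ds_water = -Q_FREEZE / T_WATER

# The room receives the extracted heat plus the compressor's work.
ds_room = (Q_FREEZE + WORK_IN) / T_ROOM

ds_total = ds_water + ds_room
print(f"water: {ds_water:+.1f} J/K (local decrease)")
print(f"room:  {ds_room:+.1f} J/K")
print(f"total: {ds_total:+.1f} J/K (net increase, as the law requires)")
```

The local decrease inside the freezer is real, but the heat dumped into the kitchen, inflated by the compressor's work, raises the room's entropy by more, so the total still goes up.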
A famous thought experiment called Maxwell’s Demon imagined a tiny creature sorting fast and slow gas molecules to create a temperature difference without spending energy, seemingly cheating the second law. The resolution, worked out over more than a century, is that the demon itself is a physical system. It has to store and eventually erase information about each molecule, and erasing information carries an unavoidable minimum energy cost, a result known as Landauer’s principle. Once you account for the demon’s own thermodynamic costs, the second law holds. Work in quantum information theory points the same way: quantum scenarios that appear to challenge the law in narrow conditions survive a full accounting of information and energy with the law intact.
Where This Law Ultimately Leads
If the universe is an isolated system, and entropy always increases, then the universe is heading toward a state where energy is completely spread out and everything reaches the same temperature. Physicists call this the “heat death” of the universe. At that point, no temperature differences would exist to drive any process, no work could be extracted from any source, and nothing meaningful would change. This isn’t expected for an almost incomprehensibly long time, but it’s the logical endpoint of the second law applied to the cosmos as a whole.
In the meantime, the second law shapes everything from the efficiency of solar panels to the reason you can’t unscramble an egg. It’s less a restriction than a description of reality’s deepest tendency: energy spreads, time moves forward, and the universe, on the whole, always trends toward its most probable state.

