What Is Entropy in Simple Terms, With Examples

Entropy is a measure of how spread out energy is in a system. The more spread out and dispersed the energy becomes, the higher the entropy. You can think of it as nature’s tendency to move from concentrated, organized states toward dispersed, evened-out ones, and this process only goes in one direction on its own.

The Core Idea Behind Entropy

Imagine you place a hot mug of coffee on your kitchen counter. Without you doing anything, the heat from the coffee spreads into the surrounding air until the coffee and the room are the same temperature. The energy that was concentrated in the hot liquid has dispersed. That dispersal is entropy increasing. The reverse never happens spontaneously: a room-temperature cup of coffee doesn’t suddenly get hot by pulling heat from the air around it.

This is what physicists mean by the second law of thermodynamics. In any isolated system (one that exchanges neither energy nor matter with its surroundings), entropy either increases or stays the same. It never decreases on its own. The system moves toward equilibrium, a state where energy is as evenly spread as it can be, and once it gets there, nothing further changes without outside input.
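The coffee example can be written down as simple bookkeeping. A standard thermodynamic result: when a small amount of heat Q leaves the coffee at temperature T_coffee and enters the room at temperature T_room, the coffee loses entropy Q/T_coffee while the room gains Q/T_room, and dividing the same Q by the lower temperature always gives the bigger number:

$$\Delta S_{\text{total}} = \frac{Q}{T_{\text{room}}} - \frac{Q}{T_{\text{coffee}}} > 0 \qquad \text{whenever } T_{\text{coffee}} > T_{\text{room}}$$

The net change is positive for every parcel of heat that flows, and it drops to zero exactly when the two temperatures match. That is equilibrium.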

Why “Disorder” Is Misleading

You’ll often hear entropy described as “disorder,” and that shorthand has been around since the 1800s. But physicists and chemists have increasingly moved away from it because it creates confusion. A shuffled deck of cards looks disordered, but nothing about its energy has changed. Real entropy is about energy, not about whether something looks messy.

The confusion traces back to a simplification of the original statistical approach. The idea was that a neatly organized collection of gas molecules has fewer possible arrangements than a widely scattered one, so scattering equals “disorder.” But that reasoning imagined particles dispersing without any change in energy, which doesn’t match how real physical systems behave. What actually drives a system from one state to another is the energy difference between the system and its surroundings, not some abstract measure of tidiness. The disorder or order we observe is a consequence of energy dispersal, not the cause of it.

Counting Possible Arrangements

There’s a more precise way to understand entropy at the microscopic level. Any physical system, whether it’s a gas in a container or ice in a glass, has a certain number of ways its atoms and molecules can be arranged while still looking the same from the outside. Each of those specific arrangements is called a microstate. A system with more possible microstates has higher entropy than one with fewer.
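This counting picture is captured by a famous one-line formula, Boltzmann’s definition of entropy, where W is the number of microstates and k_B is Boltzmann’s constant (about 1.38 × 10⁻²³ joules per kelvin):

$$S = k_B \ln W$$

Because the logarithm grows so slowly, even astronomically large jumps in the number of arrangements show up as modest, measurable changes in entropy.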

Consider a handful of coins. If they’re all heads-up, there’s only one way that can happen. But if roughly half are heads and half are tails, there are many different combinations that produce that outcome. Nature overwhelmingly favors the states with the most possible arrangements, simply because those states are statistically more likely. This is why energy spreads out: the spread-out configurations vastly outnumber the concentrated ones.
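A short script makes the coin counting concrete. This is just an illustrative tally of head/tail sequences, not a physics simulation:

```python
from math import comb

# For n coins, the macrostate "k heads" can be realized by
# comb(n, k) distinct head/tail sequences (its microstates).
n = 10
for k in range(n + 1):
    print(f"{k:2d} heads: {comb(n, k):3d} arrangements")

# All heads: exactly 1 arrangement. Half and half: comb(10, 5) = 252.
# The gap explodes as the number of coins grows:
print(comb(100, 50))  # ~1.01e29 arrangements for a 50/50 split of 100 coins
```

If every sequence is equally likely, a randomly shuffled system is almost always found near the half-and-half macrostate, for the same statistical reason energy is almost always found spread out.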

At the coldest possible temperature, absolute zero (minus 273.15 degrees Celsius), a perfect crystal has zero entropy. Every atom sits in exactly one arrangement with no thermal motion, so there’s only a single microstate. As soon as you add any heat, the atoms start vibrating, the number of possible arrangements climbs, and entropy rises.
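Plugging a single possible arrangement into Boltzmann’s formula gives this result directly:

$$S = k_B \ln 1 = 0$$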

Entropy and the Direction of Time

Most laws of physics work equally well if you run time forward or backward. A video of two billiard balls colliding looks perfectly normal played in reverse. But a video of a shattered glass reassembling itself and jumping back onto a table would look obviously wrong. The difference is entropy.

The physicist Arthur Eddington coined the phrase “time’s arrow” to describe this one-way property. His reasoning was straightforward: if you follow the arrow and find the world becoming more random and energy more dispersed, the arrow points toward the future. If randomness decreases, it points toward the past. Entropy is the only quantity in fundamental physics that distinguishes past from future, which is why it’s often called the basis for our experience of time moving in one direction.

How Living Things Stay Organized

If entropy always increases, how do living organisms maintain their complex, highly organized structures? This question puzzled the physicist Erwin Schrödinger, who framed it memorably in the 1940s: how does a living thing delay its decay into thermodynamic equilibrium, which is essentially death?

The answer is that living systems aren’t isolated. Your body constantly takes in concentrated energy (food, or sunlight in the case of plants) and releases dispersed energy (heat, waste products) into the environment. Cells channel the release of energy through highly constrained pathways, doing useful work like building proteins or contracting muscles before that energy escapes as heat. This keeps entropy from piling up inside the organism by exporting it to the surroundings. You stay organized, but the total entropy of you plus your environment still increases, keeping the second law intact.
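In second-law terms, the bookkeeping is simple: the organism’s own entropy can fall, as long as the entropy of its surroundings rises by at least as much, so the total never decreases:

$$\Delta S_{\text{organism}} + \Delta S_{\text{surroundings}} \geq 0$$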

Entropy in Everyday Situations

Phase changes are one of the clearest places to see entropy at work. When ice melts, the water molecules go from a rigid crystal lattice where they vibrate in place to a liquid where they can slide past each other freely. The number of possible molecular arrangements skyrockets, so entropy increases. The heat absorbed during melting drives that transition. Freezing is the reverse: heat leaves the water, the molecules lock into fewer arrangements, and the entropy of the water drops (while the entropy of the surroundings increases by at least as much).
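You can put a number on melting. For a phase change at constant temperature, the entropy change is the heat absorbed divided by the absolute temperature; using water’s standard heat of fusion of about 6.01 kJ/mol at 273.15 K:

$$\Delta S_{\text{fus}} = \frac{\Delta H_{\text{fus}}}{T} = \frac{6010\ \text{J/mol}}{273.15\ \text{K}} \approx 22\ \text{J/(mol·K)}$$

Every mole of ice that melts gains about 22 joules per kelvin of entropy; because the air supplying the heat is warmer than the ice, it loses less than that, so the total still rises.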

Mixing is another everyday example. Drop a spoonful of cream into black coffee and it swirls outward until the color is uniform. The combined system of coffee and cream has far more possible molecular arrangements in the mixed state than in the separated state. You’d never expect the cream to spontaneously un-mix and gather back into a blob, because that would mean moving to a state with vastly fewer microstates. It’s not impossible in the strictest sense, just so overwhelmingly unlikely that it will never happen in the lifetime of the universe.
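As a rough sketch of why the mixed state wins, here is the ideal-solution formula for the entropy of mixing. Treating coffee and cream as an ideal two-component mixture is purely an illustrative assumption, and the 10/90 split is a made-up ratio:

```python
from math import log

R = 8.314  # gas constant, J/(mol*K)

def mixing_entropy(x1: float) -> float:
    """Ideal entropy of mixing per mole of a two-component mixture
    with mole fractions x1 and 1 - x1 (idealized model, illustration only)."""
    x2 = 1.0 - x1
    return -R * (x1 * log(x1) + x2 * log(x2))

# Hypothetical 10% cream / 90% coffee mixture:
print(f"{mixing_entropy(0.10):.2f} J/(mol*K)")  # ~2.70, positive
```

The value comes out positive for every split strictly between 0 and 1, which is the formula’s way of saying that mixing always climbs toward the state with more microstates.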

Where Entropy Leads on a Cosmic Scale

Taken to its logical extreme, the second law implies that the universe itself is heading toward a state of maximum entropy, sometimes called heat death. In this scenario, all energy becomes evenly distributed, no temperature differences remain, and no further work can be done. Stars burn out, matter decays, and even black holes eventually evaporate.

The timeline for this is almost incomprehensibly long. Star formation is expected to cease around 10 trillion years from now (the universe is currently about 13.8 billion years old). Black holes, the last major energy concentrations, would take roughly 10^100 years to fully evaporate. After that, the universe settles into a state with no usable energy differences left, and nothing remains to drive any process. Whether this is truly the final chapter or whether something else happens remains speculative, but the broad trajectory is a direct consequence of entropy doing what it always does: spreading energy out until there’s nowhere left for it to go.