Entropy increases whenever energy spreads out, matter becomes more dispersed, or a system moves from an ordered arrangement toward a more probable, disordered one. At the most fundamental level, any process that increases the number of possible configurations (called microstates) that particles can occupy will raise entropy. This happens constantly in nature, from ice melting in your drink to stars burning out over billions of years.
Heat Transfer
The most straightforward way to increase entropy is to add heat to a system. When thermal energy flows into a substance, its molecules move faster and occupy a wider range of energy states, creating more possible arrangements. The relationship is simple: the entropy gained equals the heat added divided by the temperature at which it’s absorbed, ΔS = Q/T. This means a given amount of heat raises entropy more at low temperatures than at high ones. Warming something that’s already very cold creates a bigger jump in disorder than adding the same energy to something already hot.
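A quick sketch makes the temperature dependence concrete. The helper below (a minimal Python illustration; the heat and temperature values are arbitrary examples, not figures from the text) applies ΔS = Q/T to the same 1,000 J of heat absorbed at two different temperatures:

```python
def entropy_change(heat_joules: float, temp_kelvin: float) -> float:
    """Entropy gained (J/K) when heat is absorbed at a fixed temperature."""
    return heat_joules / temp_kelvin

Q = 1000.0  # the same parcel of heat, in joules

# The identical heat input produces a much larger entropy jump in a cold system.
print(entropy_change(Q, 100.0))  # 10.0 J/K at 100 K (very cold)
print(entropy_change(Q, 500.0))  # 2.0 J/K at 500 K (already hot)
```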
Heat also increases entropy whenever it flows spontaneously from a hot object to a cold one. The cold object gains more entropy than the hot object loses, so the total entropy of the two objects combined always goes up. This is why a cup of coffee cooling on your desk is a one-way process. The heat disperses into the surrounding air, and the universe becomes slightly more disordered as a result.
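The same formula shows why spontaneous hot-to-cold flow always raises total entropy. In this sketch (example temperatures roughly matching fresh coffee and a room; both bodies are treated as large enough that their temperatures stay constant during the transfer), the cold side’s gain outweighs the hot side’s loss:

```python
Q = 500.0       # joules of heat leaving the coffee (example value)
T_hot = 350.0   # coffee temperature, K
T_cold = 295.0  # room temperature, K

dS_hot = -Q / T_hot    # entropy lost by the hot coffee
dS_cold = Q / T_cold   # entropy gained by the cooler room
print(dS_hot + dS_cold)  # ~0.27 J/K: positive, so the flow is one-way
```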
Phase Changes
Melting, boiling, and sublimation all increase entropy because they transform matter from more structured states into less structured ones. In a solid, molecules are locked in fixed positions. When ice melts, those molecules break free to slide past one another. When water boils, they scatter into a gas with vastly more freedom of movement.
The numbers reflect how dramatic these transitions are. Freezing 50 grams of water locks its molecules into a crystal lattice and decreases entropy by about 61 J/K. Running the process in reverse (melting that ice) increases entropy by the same amount. Vaporization is even more significant: converting liquid benzene to vapor, for instance, produces an entropy increase of roughly 112 J/K per 100 grams, because gas molecules can spread across a much larger volume and move in far more ways than molecules in a liquid.
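Both figures follow from dividing the latent heat of the transition by the temperature at which it happens. A sketch, using standard textbook latent-heat values (about 334 J/g for melting ice at 273 K, and about 394 J/g for vaporizing benzene at its 353 K boiling point):

```python
def phase_entropy(mass_g: float, latent_heat_J_per_g: float, temp_K: float) -> float:
    """Entropy change (J/K) for a phase transition at constant temperature."""
    return mass_g * latent_heat_J_per_g / temp_K

print(phase_entropy(50, 334, 273.15))   # ~61 J/K for melting 50 g of ice
print(phase_entropy(100, 394, 353.25))  # ~112 J/K for vaporizing 100 g of benzene
```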
Mixing and Diffusion
When two different gases share a container, they spontaneously mix until they’re evenly distributed. This always increases entropy. Before mixing, each gas is confined to its own region. Afterward, every molecule has access to the entire volume, which multiplies the number of possible arrangements enormously. The entropy of mixing is always positive because each component’s share of the total mixture (its mole fraction) is less than one, and the logarithm of a number less than one is negative, which forces the mixing formula to give a net gain in disorder.
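For ideal gases this is captured by the mixing formula ΔS = −nR Σ xᵢ ln xᵢ. A minimal sketch (assuming ideal behavior; the one-mole-each amounts are arbitrary example values):

```python
import math

R = 8.314  # gas constant, J/(mol*K)

def entropy_of_mixing(moles: list[float]) -> float:
    """Ideal entropy of mixing: -n_total * R * sum(x_i * ln(x_i))."""
    n_total = sum(moles)
    return -n_total * R * sum((n / n_total) * math.log(n / n_total) for n in moles)

# Mixing 1 mol each of two different ideal gases:
print(entropy_of_mixing([1.0, 1.0]))  # ~11.5 J/K, and always positive
```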
The same logic applies to a drop of food coloring in water, sugar dissolving in coffee, or perfume spreading through a room. Molecules bounce off one another and off container walls, naturally filling any newly available space. They do this not because of any driving force pulling them outward, but because a spread-out arrangement is overwhelmingly more probable than a concentrated one.
Chemical Reactions That Produce More Gas
Chemical reactions can either increase or decrease entropy, and the key factor for gaseous reactions is whether the total number of gas molecules goes up or down. When a single molecule breaks apart into two or more gas molecules, the products have more ways to arrange themselves, so entropy rises. For example, when dinitrogen tetroxide splits into two molecules of nitrogen dioxide (N₂O₄ → 2 NO₂), the entropy change is positive because one particle has become two.
Conversely, reactions that combine gas molecules into fewer products decrease entropy. Burning carbon monoxide with oxygen to form carbon dioxide (2 CO + O₂ → 2 CO₂) reduces the gas from three total moles of reactant to two moles of product, and the entropy change is negative. As a general rule: more gas molecules out than in means higher entropy, fewer means lower.
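Both examples can be checked against tabulated standard molar entropies. The values below are common textbook figures at 298 K (quoted here as assumptions, not taken from the text above):

```python
# Standard molar entropies in J/(mol*K) at 298 K (common textbook values)
S_std = {"N2O4": 304.4, "NO2": 240.1, "CO": 197.7, "O2": 205.2, "CO2": 213.8}

# N2O4 -> 2 NO2: one gas molecule becomes two, so entropy rises
print(2 * S_std["NO2"] - S_std["N2O4"])                    # ~ +176 J/K

# 2 CO + O2 -> 2 CO2: three moles of gas become two, so entropy falls
print(2 * S_std["CO2"] - (2 * S_std["CO"] + S_std["O2"]))  # ~ -173 J/K
```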
Irreversible Processes
Every real-world process is at least slightly irreversible, and irreversibility is what generates new entropy. Friction is the most familiar example. When you slide a book across a table, kinetic energy converts to heat through friction, warming the book and table surfaces. That thermal energy disperses into the surrounding environment and cannot be fully recovered to push the book back. The total entropy of the universe increases.
Other common irreversible processes include turbulence in flowing fluids, electrical resistance heating a wire, and the rapid expansion of a gas into a vacuum. In each case, useful, concentrated energy degrades into diffuse heat. The entropy produced by these internal dissipative processes, such as heat conduction, diffusion, and chemical reactions, accumulates and can never be undone without increasing entropy somewhere else even more.
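Free expansion is the cleanest of these cases to quantify. When an ideal gas doubles its volume by rushing into a vacuum, no heat flows and no work is done, yet entropy still rises by nR ln(V₂/V₁), all of it generated by the irreversibility itself. A sketch for one mole:

```python
import math

R = 8.314  # gas constant, J/(mol*K)
n = 1.0    # moles of ideal gas (example value)

# The gas expands freely into an evacuated space of equal size, doubling its volume.
dS = n * R * math.log(2.0)
print(dS)  # ~5.76 J/K, created entirely by the irreversible expansion
```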
The Microstate Explanation
At a deeper level, entropy is a count of possibilities. The Boltzmann equation, S = kB ln W, defines entropy (S) as proportional to the natural logarithm of the number of microstates (W) available to a system. A microstate is one specific arrangement of all the particles in a system: their positions, speeds, and energy levels. The Boltzmann constant (kB = 1.38 × 10⁻²³ J/K) converts this pure number into physical units.
Imagine four particles that can sit in two bins. Putting all four in one bin gives only one possible arrangement (W = 1). Splitting them evenly, two in each bin, gives six possible arrangements (W = 6). The even split has higher entropy simply because there are more ways to achieve it. Nature doesn’t “prefer” disorder for any mysterious reason. Systems evolve toward higher entropy because states with more microstates are overwhelmingly more likely to occur by random chance.
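This counting is easy to reproduce. The sketch below uses the binomial coefficient to count the arrangements of four particles across two bins, then converts each count to entropy with S = kB ln W:

```python
import math

kB = 1.38e-23  # Boltzmann constant, J/K

def boltzmann_entropy(n_left: int, n_total: int = 4) -> float:
    """S = kB * ln(W), where W counts which particles occupy the left bin."""
    W = math.comb(n_total, n_left)
    return kB * math.log(W)

print(math.comb(4, 0))       # W = 1: all four particles in one bin
print(math.comb(4, 2))       # W = 6: the even split
print(boltzmann_entropy(0))  # 0 J/K: a single arrangement has zero entropy
print(boltzmann_entropy(2))  # ~2.5e-23 J/K: more microstates, more entropy
```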
Anything that increases the number of accessible microstates, such as adding energy, expanding volume, breaking bonds, or mixing substances, increases entropy.
How Living Things Fit In
Living organisms appear to defy entropy by maintaining highly organized internal structures, but they actually increase the entropy of their surroundings by a wide margin. Organisms are open systems that take in food (low-entropy, energy-rich molecules), extract useful energy, and export heat and waste products (high-entropy outputs) to the environment.
An adult human body produces roughly 10 million joules of heat per day, exporting entropy at a rate of about 480 J/K per liter of body volume per day. That exported entropy is about 20 times larger than the small entropy deficit the body maintains internally through compartmentalized cells and non-equilibrium chemistry. In other words, your body stays organized by making the rest of the universe considerably more disordered. This heat must be dissipated quickly, too. Without it, proteins would break down and cell membranes would fall apart.
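The arithmetic behind those figures is straightforward. This sketch assumes a body temperature of 310 K and a body volume of about 67 liters (both assumed typical values, not given above):

```python
Q_daily = 1.0e7   # heat exported per day, J
T_body = 310.0    # body temperature, K
volume_L = 67.0   # assumed adult body volume, liters

dS_daily = Q_daily / T_body  # ~32,000 J/K of entropy exported per day
print(dS_daily / volume_L)   # ~480 J/K per liter of body volume per day
```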
Entropy on a Cosmic Scale
Zoom out far enough and entropy applies to the entire universe. Every star that burns, every planet that radiates heat into space, and every black hole that forms adds to the universe’s total entropy. The concept of “heat death” describes a theoretical endpoint where entropy reaches its maximum: all energy is evenly distributed, no temperature differences remain, and no further work can be done. Everything has degraded into uniform, diffuse heat spread across an ever-expanding cosmos.
This doesn’t mean the universe will become hot. It means the opposite. As space expands and energy disperses, temperatures approach a uniform, barely-above-zero background. The defining feature of heat death isn’t high temperature but the absence of any energy gradients. Without gradients, no processes can occur, no stars can form, and no life can exist. Whether the universe will truly reach this state or settle into some other long-term configuration remains an open question in physics, but the relentless increase of entropy is the driving force behind the scenario.
The Information Connection
Entropy also appears in information theory, where it measures uncertainty rather than physical disorder. Shannon entropy, named deliberately after thermodynamic entropy, quantifies how much you don’t know about the state of a system. A fair coin flip has higher Shannon entropy than a weighted coin because the outcome is less predictable.
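The coin comparison is a two-line calculation. A sketch using the Shannon formula H = −Σ p log₂ p (the 90/10 weighting is an arbitrary example):

```python
import math

def shannon_entropy(probs: list[float]) -> float:
    """Shannon entropy in bits: -sum(p * log2(p)) over nonzero probabilities."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(shannon_entropy([0.5, 0.5]))  # 1.00 bit: a fair coin, maximally uncertain
print(shannon_entropy([0.9, 0.1]))  # ~0.47 bits: a weighted coin, more predictable
```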
The connection to physics is more than just an analogy. If you compress a gas into half its volume, you’ve halved the number of possible states a particle could occupy, which means you need one fewer bit of information to describe its location. Thermodynamic entropy describes how energy is distributed among particles. Shannon entropy describes how information is distributed among possible messages. Both are maximized when the distribution is as spread out and uniform as possible, and both increase when constraints are removed.
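The bridge between the two pictures shows up in the constants. Halving a particle’s accessible volume removes a factor of two from its microstate count W, so its thermodynamic entropy drops by kB ln 2, which is exactly one bit’s worth:

```python
import math

kB = 1.38e-23  # Boltzmann constant, J/K

# Entropy change per particle when its accessible volume is halved (one bit):
print(kB * math.log(2))  # ~9.6e-24 J/K per bit of information gained
```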

