What Is Pressure in Chemistry? Definition and Units

Pressure in chemistry is the force that particles exert on the walls of their container, measured as force per unit area. It plays a central role in how gases behave, when liquids boil, and whether chemical reactions favor products or reactants. The basic formula is simple: pressure equals force divided by area. But in chemistry, pressure shows up in gas laws, phase changes, and industrial processes, making it one of the most practical concepts to understand.

How Gas Particles Create Pressure

At the molecular level, pressure comes from trillions of tiny collisions. Gas particles move constantly and randomly, and every time one strikes the wall of its container, it pushes against that wall with a small force. Add up all those collisions happening every second across the entire surface, and you get a measurable pressure.

This picture of pressure comes from kinetic molecular theory, and it explains several things intuitively. Pump more gas into a rigid container and you increase the number of particles hitting the walls per second, which raises the pressure. Heat the gas and the particles move faster, hitting the walls harder and more often, which also raises the pressure. Let the container expand and those same collisions spread over a larger area, so the pressure drops.

Units of Pressure

Chemistry uses several pressure units depending on the context, which can be confusing at first. The SI unit is the pascal (Pa), defined as one newton of force per square meter. Because a single pascal is tiny, you’ll often see kilopascals (kPa) in textbook problems.

Other common units include:

  • Atmosphere (atm): based on average air pressure at sea level. 1 atm = 101,325 Pa.
  • Bar: slightly less than one atmosphere. 1 bar = 100,000 Pa. IUPAC now defines standard pressure as exactly 1 bar rather than 1 atm.
  • Torr (or mmHg): originally tied to the height of mercury in a barometer. 1 atm = 760 torr.
  • Pounds per square inch (psi): common in engineering. 1 atm ≈ 14.70 psi.

When solving gas law problems, the unit of pressure you choose determines which value of the gas constant R you plug in: 0.08206 L·atm/(mol·K) for atmospheres, 8.3145 L·kPa/(mol·K) for kilopascals, or 62.364 L·torr/(mol·K) for torr. Mixing units is one of the most common mistakes in general chemistry courses.
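Because the conversion factors above are exact definitions, they are easy to encode. Here is a minimal sketch in Python (the constant and function names are my own, not a standard library API):

```python
# Exact unit definitions from the list above.
PA_PER_ATM = 101_325.0           # 1 atm in Pa (exact)
PA_PER_BAR = 100_000.0           # 1 bar in Pa (exact)
PA_PER_TORR = PA_PER_ATM / 760.0 # 1 torr in Pa (since 1 atm = 760 torr)

def atm_to_kpa(p_atm):
    """Convert a pressure in atmospheres to kilopascals."""
    return p_atm * PA_PER_ATM / 1000.0

def torr_to_atm(p_torr):
    """Convert a pressure in torr to atmospheres."""
    return p_torr / 760.0

print(atm_to_kpa(1.0))   # 101.325
print(torr_to_atm(380))  # 0.5
```

Converting everything to one unit before plugging into a gas law equation sidesteps the unit-mixing mistake mentioned above.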

Standard Temperature and Pressure

You’ll frequently see “STP” in gas calculations. The current IUPAC definition sets standard temperature at 273.15 K (0 °C) and standard pressure at 100,000 Pa (1 bar). Older textbooks sometimes use 1 atm as the standard, which is about 1.3% higher. If your course material still uses 1 atm for STP, stick with that for consistency, but know that the official benchmark shifted to 1 bar.
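The practical consequence of the two STP conventions is a different molar volume for an ideal gas. A quick check using PV = nRT (introduced formally in the next section):

```python
R = 8.3145      # gas constant in L·kPa/(mol·K)
T_STP = 273.15  # standard temperature in K

# Molar volume of 1 mol of ideal gas: V = RT / P
v_at_1_bar = R * T_STP / 100.0    # IUPAC STP: 100 kPa (1 bar)
v_at_1_atm = R * T_STP / 101.325  # older convention: 101.325 kPa (1 atm)

print(round(v_at_1_bar, 2))  # 22.71 L
print(round(v_at_1_atm, 2))  # 22.41 L
```

This is why newer textbooks quote 22.7 L/mol at STP while older ones quote the familiar 22.4 L/mol: the ~1.3% gap comes entirely from the change in standard pressure.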

The Ideal Gas Law

The single most important equation connecting pressure to other gas properties is the ideal gas law: PV = nRT. Here P is pressure, V is volume, n is the number of moles of gas, R is the gas constant, and T is absolute temperature in kelvins. This equation lets you calculate any one of those four variables if you know the other three.
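Rearranging for any one variable is straightforward. A small sketch solving for pressure (the function name and sample numbers are illustrative, not from a standard library):

```python
def pressure_atm(n_mol, volume_l, temp_k, r=0.08206):
    """Ideal gas law solved for pressure: P = nRT / V.

    With R in L·atm/(mol·K), volume in L, and temperature in K,
    the result comes out in atm.
    """
    return n_mol * r * temp_k / volume_l

# 1.00 mol of gas in 22.4 L at 273.15 K should give roughly 1 atm
print(round(pressure_atm(1.0, 22.4, 273.15), 2))  # 1.0
```

Note that T must be in kelvins; plugging in Celsius temperatures is a classic source of wrong answers.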

The ideal gas law is really three older laws rolled into one. Boyle’s law says that at constant temperature and amount of gas, pressure and volume are inversely proportional: squeeze the volume in half and the pressure doubles. Charles’s law says that at constant pressure, volume is proportional to temperature. Avogadro’s law says that at constant temperature and pressure, volume is proportional to the number of moles. Each of these is just a special case where two of the four variables are held fixed.
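Boyle's law in particular reduces to a one-line calculation, since P₁V₁ = P₂V₂ at constant temperature and moles (function name is my own):

```python
def boyle_final_pressure(p1, v1, v2):
    """Boyle's law at constant T and n: P1 * V1 = P2 * V2, solved for P2."""
    return p1 * v1 / v2

# Squeeze the volume in half and the pressure doubles
print(boyle_final_pressure(1.0, 10.0, 5.0))  # 2.0
```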

Real gases follow this equation well under everyday conditions. At extremely high pressures or very low temperatures, gas particles interact with each other more strongly and the ideal gas law becomes less accurate, but for most chemistry coursework it works reliably.

Partial Pressures in Gas Mixtures

When multiple gases share the same container, each one contributes its own pressure independently. The total pressure is simply the sum of each gas’s individual (partial) pressure. This is Dalton’s law of partial pressures: P(total) = P₁ + P₂ + P₃, and so on for however many gases are present.

Each partial pressure depends on how many moles of that particular gas are in the mixture. If nitrogen makes up 78% of the molecules in a container at 1 atm total pressure, nitrogen’s partial pressure is about 0.78 atm. This concept shows up constantly in chemistry, from calculating the yield of a gas collected over water (where water vapor contributes its own partial pressure) to understanding how oxygen is delivered in the lungs.
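The mole-fraction arithmetic behind Dalton's law can be sketched directly (the mole amounts below are illustrative round numbers for dry air, not precise composition data):

```python
def partial_pressures(moles, p_total):
    """Dalton's law: each gas's partial pressure equals its mole
    fraction (n_i / n_total) times the total pressure."""
    n_total = sum(moles.values())
    return {gas: n / n_total * p_total for gas, n in moles.items()}

# Rough dry-air mixture at 1 atm total pressure
air = {"N2": 0.78, "O2": 0.21, "Ar": 0.01}
pp = partial_pressures(air, 1.0)
print(pp["N2"])  # nitrogen's partial pressure, ≈ 0.78 atm
```

The partial pressures always sum back to the total, which is a handy sanity check in homework problems.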

Pressure and Boiling Points

Pressure doesn’t just matter for gases: it controls when liquids boil. A liquid boils at the temperature where its vapor pressure (the pressure exerted by molecules escaping the liquid’s surface) equals the external pressure pushing down on it.

At 1 atm, water boils at 100 °C because that’s the temperature where water’s vapor pressure reaches 1 atm. Reduce the external pressure and the liquid boils at a lower temperature. This is why water boils below 100 °C at high altitudes, where atmospheric pressure is lower. Increase the external pressure and you need a higher temperature to reach boiling, which is the principle behind pressure cookers.
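The altitude effect can be estimated with the Clausius–Clapeyron relation, a standard equation not covered in this article; the enthalpy of vaporization below is an assumed textbook value for water near 100 °C:

```python
import math

R = 8.3145         # gas constant, J/(mol·K)
DH_VAP = 40_660.0  # assumed enthalpy of vaporization of water, J/mol

def boiling_point_k(p_atm, t1=373.15, p1=1.0):
    """Clausius-Clapeyron estimate of the boiling point at pressure p_atm,
    using water's normal boiling point (373.15 K at 1 atm) as reference:
    ln(p/p1) = -(dH/R) * (1/T - 1/T1), solved for T."""
    inv_t = 1.0 / t1 - R * math.log(p_atm / p1) / DH_VAP
    return 1.0 / inv_t

# At roughly 0.70 atm (around 3,000 m altitude), water boils near 90 C
print(round(boiling_point_k(0.70) - 273.15, 1))
```

This is a rough sketch: it treats ΔH_vap as constant over the temperature range, which is only approximately true.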

Pressure and Chemical Equilibrium

For reactions involving gases, changing the pressure can shift the balance between products and reactants. Le Chatelier’s principle predicts that if you increase the pressure on a system at equilibrium, the reaction shifts toward whichever side has fewer total moles of gas, because fewer gas molecules in the same volume exert less pressure, partially offsetting the change.

The industrial production of ammonia (the Haber process) is a classic example. The reaction combines nitrogen and hydrogen gas to produce ammonia, going from four total moles of gas on the reactant side to two moles on the product side. Running the reaction at high pressure pushes the equilibrium toward ammonia. In practice, Haber process plants operate at 150 to 300 bar, with a typical setting around 200 bar. That’s roughly 200 times atmospheric pressure, and it’s one of the reasons this process requires massive, thick-walled reactors.

How Pressure Is Measured

Two classic instruments measure pressure in different ways. A barometer measures atmospheric pressure using a column of mercury in a sealed tube. The atmosphere pushes down on an open reservoir of mercury, forcing it up the tube. The height of that mercury column, typically about 760 mm at sea level, directly reflects atmospheric pressure. This is where the unit “mmHg” comes from.

A manometer measures the pressure of a gas sample, usually by comparing it to atmospheric pressure. In its simplest form, it’s a U-shaped tube partially filled with mercury. The gas pushes down on one side, the atmosphere pushes down on the other, and the difference in mercury height between the two sides tells you how much the gas pressure exceeds (or falls below) atmospheric pressure. Both instruments rely on the same physics: the pressure at the bottom of a fluid column equals the height of the column times the fluid’s density times gravitational acceleration.
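That fluid-column relationship (P = ρgh) is easy to verify numerically. The density and gravity values below are the conventional ones behind the mmHg unit:

```python
def column_pressure_pa(height_m, density=13_595.1, g=9.80665):
    """Pressure at the base of a fluid column: P = density * g * height.

    Defaults use the conventional mercury density (kg/m^3 at 0 C) and
    standard gravity (m/s^2) that define the mmHg unit.
    """
    return density * g * height_m

# A 760 mm (0.760 m) mercury column reproduces 1 atm = 101,325 Pa
print(round(column_pressure_pa(0.760)))  # 101325
```

The same formula explains why a water barometer would need a column over 10 m tall: water is about 13.6 times less dense than mercury.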