How to Measure Specific Heat Capacity Accurately

Specific heat capacity is measured by tracking how much energy it takes to raise the temperature of a known mass of material by a known amount. The core formula is simple: divide the energy transferred by the mass of the substance multiplied by its temperature change (c = Q / (m × ΔT)). The real skill is in how you measure those three quantities accurately, and there are several practical methods depending on what equipment you have and how precise you need to be.

The Core Formula

Every method for measuring specific heat capacity comes back to the same relationship. Heat capacity (C) is the energy transferred per unit of temperature change: C = Q / ΔT. Specific heat capacity (c) takes this one step further by dividing by mass, giving you a value that’s characteristic of the material itself rather than a particular object:

c = Q / (m × ΔT)

Here, Q is the energy transferred (in joules), m is the mass of the substance (in kilograms), and ΔT is the change in temperature (in degrees Celsius or kelvins, since the size of one degree is the same on both scales). The standard SI unit for specific heat capacity is J/kg·K. In chemistry, you’ll often see it expressed per gram (J/g·°C) or per mole (J/mol·°C), but the underlying measurement is identical.

To put those numbers in perspective, water has a specific heat capacity of 4,184 J/kg·K, meaning it takes over four thousand joules to raise one kilogram of water by a single degree. Aluminum sits at 890 J/kg·K, iron at 450, and copper at just 385. This is why a metal pan heats up so much faster than the water inside it.
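The relationship runs both ways: multiply out c, m, and ΔT to predict an energy requirement, or divide to recover c from measurements. Here is a minimal sketch using water's value from above; the mass and temperature change are made-up example numbers.

```python
# Hypothetical scenario: energy needed to warm 0.5 kg of water by 20 K,
# using c_water = 4184 J/(kg·K) as quoted in the text.
c_water = 4184.0   # J/(kg·K)
mass = 0.5         # kg
delta_t = 20.0     # K (same size as one degree Celsius)

q = c_water * mass * delta_t   # Q = c × m × ΔT
print(q)                       # 41840.0 J

# Going the other way: recover c from measured Q, m, and ΔT.
c_recovered = q / (mass * delta_t)
print(c_recovered)             # 4184.0 J/(kg·K)
```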

Method of Mixtures (Calorimetry)

The most common hands-on approach, and the one you’ll likely encounter in a school or university lab, is the method of mixtures. The idea is straightforward: heat a sample to a known temperature, drop it into cooler water, and measure what happens. Energy lost by the hot object equals energy gained by the water, and since you already know water’s specific heat capacity, you can solve for the unknown.

Here’s how it works step by step. Measure the mass of your sample (around 30 g of metal is typical for a lab exercise). Place it in a test tube or directly into a boiling water bath and keep it there for at least five minutes so you can safely assume it has reached 100.0°C. While the sample heats, measure out a known volume of room-temperature water, usually about 50 mL, and pour it into a calorimeter. A simple calorimeter can be two nested Styrofoam cups with a lid and a thermometer inserted through the top. Record the water’s starting temperature (Tc).

Quickly transfer the hot sample into the calorimeter and seal the lid. Watch the thermometer and record the highest temperature the water reaches (Tmax). Speed matters here, because every second the hot sample spends in the air, it’s losing heat to the room instead of to the water.

Now run the numbers. The temperature change of the water is Tmax minus Tc. The temperature change of the metal is 100.0°C minus Tmax. Set the energy equations equal to each other:

m_water × c_water × (Tmax − Tc) = m_metal × c_metal × (100.0 − Tmax)

Rearrange to solve for c_metal, and you have your answer. If your metal sample is aluminum, you should land somewhere near 890 J/kg·K. Copper should come in around 385 J/kg·K.
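The rearranged calculation can be sketched in a few lines. All the readings below are invented for illustration; only water's specific heat capacity and the masses suggested earlier come from the text.

```python
# Method-of-mixtures calculation with made-up readings:
# a 30 g metal sample heated to 100.0 °C, dropped into 50 g of water.
c_water = 4184.0   # J/(kg·K), known reference value
m_water = 0.050    # kg (about 50 mL of water)
m_metal = 0.030    # kg
t_cold = 22.0      # °C, water's starting temperature (Tc)
t_max = 26.1       # °C, highest temperature the water reaches (Tmax)
t_hot = 100.0      # °C, sample temperature from the boiling bath

# Energy gained by the water equals energy lost by the metal;
# solve m_water·c_water·(Tmax − Tc) = m_metal·c_metal·(100.0 − Tmax) for c_metal.
q_water = m_water * c_water * (t_max - t_cold)
c_metal = q_water / (m_metal * (t_hot - t_max))
print(round(c_metal))   # 387, close to copper's accepted 385 J/(kg·K)
```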

The Electrical Heating Method

The electrical method gives you more control over exactly how much energy enters the system. Instead of relying on a boiling water bath, you use an immersion heater connected to a power supply and measure the electrical energy directly.

Start with a solid block of the material you’re testing, one that has a hole drilled for the heater and a second, smaller hole for a thermometer. Place a few drops of oil into the thermometer hole so the bulb makes good thermal contact with the block. Wrap the entire block loosely in cotton wool or another insulating material to reduce heat escaping to the surroundings.

Record the block’s starting temperature, then turn on the heater. Note the ammeter reading (current in amps) and the voltmeter reading (voltage in volts). Run the heater for a set time, typically ten minutes. The electrical energy transferred is calculated as:

E = V × I × t

where V is voltage, I is current, and t is time in seconds. After you switch off the heater, keep watching the thermometer. The temperature will continue to rise briefly as residual heat spreads through the block before it starts to cool. Record the highest temperature it reaches and use that as your final temperature.

Plug everything into the specific heat capacity formula: c = E / (m × ΔT). This method works for liquids too. Place the immersion heater in a known mass of liquid inside an insulated container and follow the same process.
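The whole electrical calculation fits in a few lines. The meter readings and block mass below are assumptions chosen for illustration, not values from the text.

```python
# Electrical heating method with assumed readings:
# a 1 kg metal block heated for ten minutes.
voltage = 12.0    # V, voltmeter reading
current = 3.5     # A, ammeter reading
time_s = 600.0    # s, ten minutes
mass = 1.0        # kg
t_start = 21.0    # °C, before switching on
t_final = 48.5    # °C, highest temperature after switch-off

energy = voltage * current * time_s         # E = V × I × t
c = energy / (mass * (t_final - t_start))   # c = E / (m × ΔT)
print(round(c))   # 916 J/(kg·K)
```

Note that for an aluminum block this hypothetical result would land above the accepted 890 J/kg·K, which is the direction you would expect if some heat escaped to the surroundings.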

Getting Accurate Results

Both methods are simple in principle but surprisingly easy to get wrong. The biggest source of error in any calorimetry experiment is heat loss to the surroundings. Every surface that touches the air is leaking energy, and that leaked energy won't show up in your temperature readings. The direction of the error depends on the method: in the method of mixtures, the water warms less than it should, so your calculated specific heat capacity comes out too low; in the electrical method, the block warms less for the same measured energy input, so the calculated value comes out too high.

Insulation is your first line of defense. Styrofoam cups work well for the mixing method because they conduct heat poorly. For the electrical method, wrapping the block in cotton wool makes a noticeable difference. In research-grade calorimeters, the jacket surrounding the sample is kept at a temperature close to the reaction vessel specifically to minimize heat flow in either direction.

Stirring matters more than most people realize. In liquid calorimetry, inadequate stirring creates temperature gradients, meaning the thermometer reads the temperature of the water nearest to it rather than the true average temperature of the whole system. Research at the National Bureau of Standards identified inadequate stirring as a primary source of systematic error in solution calorimetry. If you’re using the mixing method, gently swirl the calorimeter after adding the hot sample.

Transfer speed is critical in the mixing method. The moment you pull a metal sample out of boiling water, it begins cooling in the air. A delay of even a few seconds can noticeably lower your result. Use tongs and move quickly, but safely.

Condensation and evaporation also introduce hidden errors. Water evaporating from the surface of a calorimeter carries away energy, and water vapor condensing on cooler surfaces releases it in the wrong place. Keeping a lid on the calorimeter reduces both effects.

Differential Scanning Calorimetry

For professional materials science, biology, or pharmaceutical work, specific heat capacity is typically measured with an instrument called a differential scanning calorimeter (DSC). This device heats two small pans at the same rate: one containing your sample, and one empty (or filled with a reference buffer). The instrument measures the difference in energy required to keep both pans at the same temperature as they warm up.

There are two main designs. In a heat flux DSC, both pans sit on a single heating platform and the instrument measures the temperature difference between them, then converts that to a heat flow using the thermal equivalent of Ohm’s law. In a power-compensated DSC, each pan has its own independent heater, and the instrument directly measures how much extra power the sample pan needs.

DSC is the standard tool for measuring heat capacity in materials that are expensive, available only in tiny quantities, or need to be characterized across a wide temperature range. A single scan can map how a material’s heat capacity changes from well below room temperature to several hundred degrees. The tradeoff is cost: a DSC instrument is a significant investment, far beyond what’s needed for a classroom experiment.

Checking Your Results

The easiest way to know whether your measurement technique is working is to test a material with a well-known specific heat capacity and see how close you get. Water at 4,184 J/kg·K is the classic benchmark. Other useful reference materials and their accepted values:

  • Aluminum: 890 J/kg·K
  • Iron: 450 J/kg·K
  • Copper: 385 J/kg·K
  • Lead: 129 J/kg·K

If your measured value for copper comes out at 350 or 420 instead of 385, that’s a reasonable result for a simple lab setup. If you’re off by a factor of two, something went wrong, most likely significant heat loss or an error in mass or temperature measurement. Note that water exists in three common states with very different heat capacities: ice at 0°C has a specific heat of 2,090 J/kg·K, liquid water sits at 4,184 J/kg·K, and steam at 100°C drops to 2,030 J/kg·K. Make sure you’re comparing to the right value for the right phase.
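One way to make "how close you get" concrete is to compute the percent error against the accepted value. This is a small sketch using the reference table above; the measured values are hypothetical.

```python
# Accepted reference values from the table above, in J/(kg·K).
accepted = {"aluminum": 890.0, "iron": 450.0, "copper": 385.0, "lead": 129.0}

def percent_error(measured, material):
    """Percent difference between a measured value and the accepted one."""
    ref = accepted[material]
    return 100.0 * abs(measured - ref) / ref

# A hypothetical copper result of 350 is about 9% off: reasonable for a simple setup.
print(round(percent_error(350.0, "copper"), 1))   # 9.1

# A result of 770 is off by a factor of two: something went wrong.
print(round(percent_error(770.0, "copper"), 1))   # 100.0
```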