The harmonic mean is the right choice whenever you’re averaging rates, ratios, or quantities that are defined relative to something else. If your data points are “something per something” (miles per hour, price per dollar of earnings, beats per minute), the harmonic mean usually gives you the correct average while the arithmetic mean gives you a misleading one. The core principle: use the harmonic mean when equal weight should be given to each unit in the denominator, not each unit in the numerator.
What the Harmonic Mean Actually Does
The harmonic mean is the reciprocal of the arithmetic mean of the reciprocals. In practice, you flip each value, average those flipped values, then flip the result back. For a set of numbers, that looks like: divide the count of values by the sum of their reciprocals. So the harmonic mean of 1, 4, and 4 is 3 divided by (1/1 + 1/4 + 1/4), which equals 3 divided by 1.5, which equals 2. The arithmetic mean of those same numbers is 3.
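That recipe translates directly into a few lines of Python. This is an illustrative sketch; the standard library's `statistics.harmonic_mean` does the same job in production code.

```python
def harmonic_mean(values):
    # Flip each value, average the flips, then flip the result back:
    # count / sum of reciprocals.
    return len(values) / sum(1 / v for v in values)

print(harmonic_mean([1, 4, 4]))  # 2.0, versus an arithmetic mean of 3
```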
That gap matters. The harmonic mean is always less than or equal to the arithmetic mean for positive numbers. This is part of a fundamental mathematical relationship: the arithmetic mean is always the largest, the geometric mean sits in the middle, and the harmonic mean is the smallest. The more spread out your values are, the wider the gap between these three averages. This isn’t a flaw. It reflects the harmonic mean’s sensitivity to small values, which is exactly why it works well for rates and ratios where small values carry disproportionate real-world weight.
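The ordering is easy to check numerically with Python's `statistics` module (note that `geometric_mean` requires Python 3.8 or later); the sample values here are arbitrary.

```python
import statistics

data = [2, 8, 32]
am = statistics.mean(data)            # 14.0
gm = statistics.geometric_mean(data)  # 8.0 (cube root of 2 * 8 * 32)
hm = statistics.harmonic_mean(data)   # about 4.57

# The inequality HM <= GM <= AM holds for any positive data,
# with equality only when all the values are identical.
assert hm <= gm <= am
```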
Averaging Speeds Over Equal Distances
The classic example is average speed. If you drive 60 miles at 30 mph and then 60 miles at 60 mph, your average speed is not 45 mph (the arithmetic mean). The first leg took 2 hours and the second took 1 hour, so you covered 120 miles in 3 hours: an average of 40 mph. That’s the harmonic mean of 30 and 60.
The arithmetic mean fails here because it treats each speed as equally important, but the slower speed dominates your total travel time. You spent twice as long driving at 30 mph, so it should pull the average down more. The harmonic mean automatically accounts for this by giving more weight to smaller values. Any time you’re averaging speeds (or rates of any kind) across equal distances or equal units of output, the harmonic mean is the correct tool. Research comparing all three means on constant-distance speed data confirms that only the harmonic mean recovers the true average speed.
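The two-leg trip above can be verified from first principles: the harmonic mean of the speeds should match total distance divided by total time.

```python
import math
import statistics

# Two 60-mile legs: one at 30 mph, one at 60 mph.
speeds = [30, 60]
naive = statistics.mean(speeds)              # 45 mph -- misleading
true_avg = statistics.harmonic_mean(speeds)  # 40 mph

# Cross-check: total distance / total time.
total_hours = 60 / 30 + 60 / 60              # 2 h + 1 h = 3 h
assert math.isclose(true_avg, 120 / total_hours)
```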
Financial Ratios Like P/E
In finance, the harmonic mean solves a similar problem with price-to-earnings (P/E) ratios. If you want the average P/E ratio for a group of stocks, the arithmetic mean is biased upward. A single company with a very high P/E (say, 200) can drag the average far above what’s representative of the group. This happens because the arithmetic mean effectively weights each company’s earnings equally, which makes no economic sense when companies have vastly different earnings levels.
The harmonic mean of P/E ratios is equivalent to dividing the total price by total earnings across the group, giving you an equal-dollar-weighted average. Research from the University of Maine advocates using the harmonic mean for this purpose, noting that the arithmetic mean “is biased upwards and cannot be numerically justified” for non-price-normalized ratios. The same logic applies to any financial ratio where the denominator varies across observations: price-to-book, price-to-sales, or cost-per-click in advertising.
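The equivalence is easy to demonstrate: put an equal dollar amount into each stock, and the portfolio's total price divided by total earnings comes out to the harmonic mean of the individual P/E ratios. The ratios below are hypothetical.

```python
import math
import statistics

pes = [10, 25, 200]                 # hypothetical P/E ratios
am = statistics.mean(pes)           # 78.3 -- dragged up by the outlier
hm = statistics.harmonic_mean(pes)  # about 20.7

# Equal-dollar view: invest $1 per stock, sum the earnings each dollar buys.
total_price = float(len(pes))
total_earnings = sum(1.0 / pe for pe in pes)
assert math.isclose(hm, total_price / total_earnings)
```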
The F1 Score in Machine Learning
If you work with classification models, you’ve already used the harmonic mean without thinking about it. The F1 score is the harmonic mean of precision and recall. Precision measures how many of your positive predictions were correct. Recall measures how many actual positives you caught. The F1 score reaches its best value at 1 and its worst at 0.
Why not just average precision and recall with the arithmetic mean? Because the harmonic mean penalizes imbalance. If your model has 95% precision but only 10% recall (it’s very accurate when it makes a prediction, but it misses almost everything), the arithmetic mean would be a deceptively comfortable 52.5%. The harmonic mean gives you about 18.1%, which better reflects how poorly the model is actually performing. The harmonic mean forces both metrics to be reasonably high before the combined score looks good.
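The standard two-value shortcut for the harmonic mean, 2pr / (p + r), makes the penalty easy to see on the numbers above:

```python
def f1(precision, recall):
    # Harmonic mean of precision and recall:
    # 2 / (1/p + 1/r) simplifies to 2*p*r / (p + r).
    return 2 * precision * recall / (precision + recall)

p, r = 0.95, 0.10
arith = (p + r) / 2   # 0.525 -- deceptively comfortable
score = f1(p, r)      # about 0.181
assert score < arith
```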
Population Genetics and Biology
In genetics, the “effective population size” of a species is calculated as the harmonic mean of its population size across generations. This matters because a species that numbered 10,000 for nine generations but crashed to 100 for one generation does not behave genetically like a population of roughly 9,000 (the arithmetic mean). The bottleneck generation has an outsized effect on genetic diversity, and the harmonic mean captures this. For that example, the effective population size works out to around 900, far below the arithmetic average.
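Running the bottleneck scenario through both means makes the gap concrete:

```python
import statistics

# Nine generations at 10,000 individuals, then a one-generation crash to 100.
sizes = [10_000] * 9 + [100]
arithmetic = statistics.mean(sizes)          # 9,010 -- overstates genetic health
effective = statistics.harmonic_mean(sizes)  # about 917

# The single crash generation dominates the effective population size.
assert effective < 1_000
```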
Nature Education notes that the harmonic mean is “always lower than the arithmetic mean (often considerably lower), and it is especially sensitive to the lowest values.” This sensitivity is a feature, not a bug. It correctly reflects the biological reality that a single population crash can permanently reduce genetic variation, regardless of how large the population was before or after.
Parallel Circuits and Combined Capacities
When resistors are wired in parallel, the total resistance follows the same reciprocal pattern as the harmonic mean. For two resistors of equal value, the combined resistance is half that value. For resistors of unequal value, the combined resistance is always less than the smallest individual resistor. The formula is identical in structure: take the reciprocal of the sum of reciprocals.
This principle extends to any system where multiple paths or resources operate in parallel. Processing speeds across parallel servers, flow rates through parallel pipes, and heat transfer through layered materials all follow harmonic-mean logic. The common thread is that the bottleneck (the smallest or slowest component) has a disproportionate effect on the combined result.
A Quick Decision Checklist
Use the harmonic mean when your data meets one or more of these conditions:
- You’re averaging rates or ratios. Speed, productivity (units per hour), price-per-unit ratios, or any “X per Y” measurement where Y varies across observations.
- The denominator should carry equal weight. For example, when averaging fuel efficiency across trips of equal distance, each mile should count equally, not each trip.
- Small values matter more than large ones. The harmonic mean naturally downweights large outliers and is sensitive to small values. If a single low value has real consequences (a population bottleneck, a slow server in a cluster), the harmonic mean reflects that.
- You’re combining reciprocal quantities. Parallel resistance, combined lens focal lengths, or any physics problem where the governing equation uses reciprocals.
If your data points are simple measurements (heights, weights, test scores) where each observation carries equal importance on its own, the arithmetic mean is fine. If your data is multiplicative or spans several orders of magnitude (bacterial growth, investment returns over time), the geometric mean is the better fit. The harmonic mean occupies a specific and important niche: it’s the correct average when the structure of your data involves rates, reciprocals, or ratios where the denominator isn’t constant.

