Engine torque is measured by applying a braking force to the engine’s output shaft and recording how much rotational force it produces. The most common tool for this is a dynamometer, which comes in several types depending on whether you’re testing a bare engine on a bench or a complete vehicle on rollers. There are also indirect methods, from OBD-II scanners that read the engine computer’s torque estimate to mathematical calculations based on horsepower and RPM.
The Core Formula Behind Torque
Before getting into measurement tools, it helps to understand the math that ties torque to horsepower. The relationship is: Horsepower = Torque × RPM ÷ 5,252. The 5,252 constant comes from the definition of one horsepower as 33,000 foot-pounds of work per minute: divide 33,000 by 2π (the radians in one revolution) and you get 5,252, the factor that makes pound-feet and RPM come out in horsepower. Rearranged, Torque = Horsepower × 5,252 ÷ RPM. This means if you already know an engine’s horsepower at a given RPM (from a dyno sheet, for example), you can calculate torque without measuring it directly. It also means torque and horsepower always cross at exactly 5,252 RPM on any dyno chart, which is a useful sanity check when reading results.
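In code, the relationship and its rearrangement are a pair of one-liners (a Python sketch; the function names are just illustrative):

```python
def horsepower(torque_lbft: float, rpm: float) -> float:
    """Horsepower from torque (lb-ft) and engine speed (RPM)."""
    return torque_lbft * rpm / 5252.0

def torque(hp: float, rpm: float) -> float:
    """Torque (lb-ft) from horsepower and engine speed (RPM)."""
    return hp * 5252.0 / rpm

# At exactly 5,252 RPM the two curves cross: the numbers are equal.
print(horsepower(400.0, 5252))  # 400.0
```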
Engine Dynamometers
An engine dynamometer tests the engine by itself, removed from the vehicle and bolted to a test stand. The engine’s crankshaft connects directly to an absorber, a device that resists the engine’s rotation in a controlled way. A load cell mounted on the absorber housing measures the reaction force, and since the distance from the crankshaft centerline to the load cell is known, the dyno calculates torque directly.
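The arithmetic the dyno performs is simply force times arm length (an illustrative Python sketch; the numbers are hypothetical, not from any particular machine):

```python
def brake_torque_lbft(load_cell_lbf: float, arm_ft: float) -> float:
    # Torque = reaction force measured at the load cell x the known
    # distance (torque arm) from the crankshaft centerline to the cell.
    return load_cell_lbf * arm_ft

# e.g. 350 lbf of reaction force on a 1.25 ft torque arm:
print(brake_torque_lbft(350.0, 1.25))  # 437.5 lb-ft
```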
The two most common absorber types are water brakes and eddy current units. A water brake churns water inside its housing, transferring energy through momentum and shear. The more water flowing through the housing, the greater the braking force. These are versatile and cover a huge range, from small engines up to 10,000 horsepower or more. They work well for sweep tests where the engine accelerates through its RPM range, though they can be tricky when you need to hold a precise, steady load at one specific RPM.
Eddy current absorbers use a different principle. Rotating metallic discs spin inside a magnetic field, which induces electrical currents that resist motion. By adjusting the strength of the magnetic field, the operator can set an exact load at any RPM. This makes eddy current dynos better for detailed fuel injection mapping and steady-state testing where you need the engine locked at a specific speed under a specific load.
Because engine dynos measure output at the crankshaft with no transmission, driveshaft, or differential in the way, they give you “brake torque,” the figure manufacturers use in spec sheets. This is the most accurate way to measure what an engine actually produces.
Chassis Dynamometers
A chassis dyno measures torque at the driven wheels. You drive the vehicle onto large rollers (typically one pair for two-wheel drive, two pairs for all-wheel drive), strap the car down, and run it through its gears while the rollers resist the wheels’ rotation. The result is “wheel torque” or “wheel horsepower,” which is always lower than the engine’s true output because the drivetrain absorbs some energy along the way.
How much energy the drivetrain absorbs depends on the vehicle’s layout. In a front-wheel-drive car, where the torque path is short and uses efficient helical gears, total drivetrain losses can be roughly half those of a rear-wheel-drive car. Rear-wheel-drive vehicles lose 6 to 10 percent of power in the differential alone, because hypoid gearsets (which turn the torque path 90 degrees) create significant friction. Driveshafts and prop shafts add another 0.5 to 1 percent. One exception: when testing in the transmission’s direct-drive gear (the 1:1 ratio, usually fourth in a manual), the power goes straight through the main shaft with minimal loss, and total at-the-wheels loss can drop to as little as 1.5 to 2 percent.
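If you only have a wheel figure, you can back out a rough crank estimate by dividing by the fraction of power that survives the drivetrain (a sketch; the loss fraction is an assumption you supply, not something the math can tell you):

```python
def crank_torque_from_wheel(wheel_torque: float, loss_fraction: float) -> float:
    """Approximate crank torque from a wheel measurement, given an
    assumed total drivetrain loss fraction (e.g. 0.15 for 15%)."""
    return wheel_torque / (1.0 - loss_fraction)

# 340 lb-ft at the wheels with an assumed 15% loss implies ~400 at the crank.
```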
Chassis dynos come in two styles. An inertia dyno uses only the mass of its heavy rollers to resist the engine. You floor it, and the rate at which the engine accelerates those rollers reveals its torque curve. This is simple and repeatable, but you can’t simulate real driving conditions like holding a steady speed on a hill. An eddy current or “loaded” chassis dyno adds a magnetic or hydraulic brake to the rollers, letting the operator apply any load at any speed. This is far more useful for tuning fuel maps and diagnosing drivability issues.
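The inertia-dyno calculation falls out of Newton's second law for rotation, τ = I × α: the software knows the rollers' moment of inertia and watches how fast they speed up. A minimal sketch of that math:

```python
import math

def roller_torque_nm(inertia_kgm2: float, rpm_start: float,
                     rpm_end: float, dt_s: float) -> float:
    """Torque needed to spin the rollers up: tau = I * alpha, where alpha
    is the angular acceleration over one sampling interval."""
    alpha = (rpm_end - rpm_start) * 2.0 * math.pi / 60.0 / dt_s  # rad/s^2
    return inertia_kgm2 * alpha

# Rollers with I = 3 kg*m^2 gaining 60 RPM in 0.1 s: roughly 188.5 N*m.
```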
Beyond pure measurement, chassis dynos are valuable for catching mechanical problems. Overheating, drivetrain leaks, failing U-joints, tire failures, and electrical gremlins all tend to reveal themselves under sustained load on rollers, making it a useful final check before a built engine hits the road.
Inline Torque Transducers
For direct, real-time measurement of torque on a rotating shaft, engineers use inline torque transducers. These are precision sensors that bolt directly into the driveline, between the engine and gearbox or between the gearbox and wheels, and measure the tiny twist (strain) in the shaft as torque passes through it. Modern versions are compact, lightweight, and often wireless, eliminating the need for cables running to a spinning component. They’re common in motor test stands, component testing rigs, and R&D settings where engineers need continuous torque data during actual operation rather than a single peak number. For most enthusiasts this is overkill, but it’s how automakers validate drivetrain components during development.
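The physics behind those sensors can be sketched with the torsion formula for a solid circular shaft, T = θGJ/L (a simplification: production transducers measure strain with bonded gauges and are calibrated against known torques, rather than relying on raw geometry; the steel shear modulus here is a typical textbook value):

```python
import math

def shaft_torque_nm(twist_rad: float, length_m: float, diameter_m: float,
                    shear_modulus_pa: float = 79.3e9) -> float:
    """Torque inferred from the elastic twist of a solid steel shaft:
    T = theta * G * J / L, with polar moment J = pi * d^4 / 32."""
    J = math.pi * diameter_m**4 / 32.0
    return twist_rad * shear_modulus_pa * J / length_m

# A 30 mm shaft, 0.5 m gauge length, twisting 0.001 rad: ~12.6 N*m.
```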
OBD-II Torque Estimation
Your engine’s computer already estimates torque in real time, and you can read it with an inexpensive OBD-II scanner. Two standard diagnostic parameters are directly relevant: “demanded engine percent torque” (what the computer is targeting) and “actual engine percent torque” (what it calculates is happening right now). Both are expressed as a percentage of the engine’s maximum rated torque.
The ECU arrives at these estimates using inputs it already monitors: mass airflow (how much air is entering the engine, measured in grams per second), ignition timing advance (in degrees), throttle position, and fuel delivery. It cross-references these against internal calibration tables to produce a torque figure. This isn’t a direct physical measurement like a dyno, so it won’t catch every nuance, but it’s surprisingly useful for spotting drops in performance, verifying that a tune is producing the expected gains, or diagnosing issues like a failing sensor that’s causing the ECU to pull timing and reduce torque.
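If you read those parameters as raw bytes, the decode is trivial. A sketch assuming the standard SAE J1979 scaling, where percent torque comes back as a single byte offset by 125 so negative values (engine braking) fit in an unsigned byte:

```python
def percent_torque(data_byte: int) -> int:
    """Decode the J1979 percent-torque PIDs (demanded 0x61, actual 0x62):
    one data byte A, scaled as A - 125, giving a range of -125% to +130%."""
    return data_byte - 125

def torque_nm(percent: int, reference_torque_nm: float) -> float:
    # Percent values are relative to the engine's reference torque,
    # itself readable via PID 0x63 (256*A + B, in N*m).
    return percent / 100.0 * reference_torque_nm

print(percent_torque(0xAF))  # 50 (percent of reference torque)
```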
Brake Torque vs. Indicated Torque
When reading torque specs, it matters where in the engine the measurement applies. “Indicated torque” is the theoretical torque generated inside the cylinders by combustion pressure pushing down on the pistons. It’s calculated from cylinder pressure data and represents the engine’s gross thermodynamic output. “Brake torque” is what’s left after internal friction, the oil pump, water pump, valve train, and other parasitic loads take their share. Brake torque is measured at the crankshaft and is the number you see on a manufacturer’s spec sheet.
One layer further out, “wheel torque” is brake torque minus everything the transmission, driveshaft, and differential consume. So the same engine will always show the highest number as indicated torque, a lower number as brake torque, and the lowest as wheel torque. If someone quotes a torque figure without specifying which type, it’s almost certainly brake torque.
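The ordering is easy to see with made-up numbers for a hypothetical engine (all figures here are illustrative, not from any real spec sheet):

```python
# Hypothetical engine, showing how each figure nests inside the last.
indicated = 450.0        # gross combustion torque in the cylinders, lb-ft
friction_losses = 50.0   # oil pump, water pump, valvetrain, ring friction
brake = indicated - friction_losses    # 400.0, the spec-sheet number
wheel = brake * (1.0 - 0.15)           # ~340.0, after a 15% drivetrain loss

assert indicated > brake > wheel
```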
SAE Testing Standards
Manufacturer torque ratings in the U.S. follow one of two SAE standards. SAE J1349 measures “net” power and torque with all production accessories attached and running: alternator, power steering pump, water pump, air filter, exhaust system. This reflects what the engine actually delivers as installed in a vehicle. SAE J1995 measures “gross” output with accessories removed or disconnected, which produces a higher number. Since 1972, U.S. automakers have been required to advertise net figures, so any modern spec sheet uses J1349. If you’re comparing a vintage car’s rated torque to a modern one, keep in mind the older number is likely gross and could be 15 to 20 percent higher than a net test would show.
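As a rough way to put a vintage gross rating alongside a modern net one, you can discount it by the 15 to 20 percent spread mentioned above (purely illustrative; this is a ballpark adjustment, not a standardized conversion):

```python
def approx_net_from_gross(gross: float, gross_premium: float = 0.175) -> float:
    """Rough net estimate from a vintage gross rating, assuming gross runs
    ~15-20% higher than net (midpoint of 17.5% used by default)."""
    return gross / (1.0 + gross_premium)

# A 380 lb-ft gross rating works out to roughly 323 lb-ft net.
```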
Keeping Measurements Accurate
Dyno results are only as good as the calibration of the load cell that reads the force. Professional dynamometer facilities calibrate their load cells by hanging certified weights from a lever arm at a known distance from the measurement point. The weights must meet strict tolerance standards, typically within 0.01 to 0.016 percent of their stated mass, and must be traceable to a national measurement institute. If the dyno uses a lever to multiply force (common on high-torque setups), the lever ratio itself becomes part of the calibration equation.
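The expected reading during such a calibration is just weight times arm length, scaled by any lever ratio (a sketch with hypothetical values; real procedures also account for the arm's own mass and local gravity):

```python
G = 9.80665  # standard gravity, m/s^2

def calibration_torque_nm(mass_kg: float, arm_m: float,
                          lever_ratio: float = 1.0) -> float:
    """Torque a certified weight should produce when hung from the
    calibration arm; the load cell reading must match within tolerance."""
    return mass_kg * G * arm_m * lever_ratio

# A certified 20 kg weight on a 1.0 m arm: ~196.13 N*m expected.
```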
For consistent results across different sessions or different dyno shops, environmental conditions matter too. Air temperature, barometric pressure, and humidity all affect how much oxygen the engine takes in, which directly changes its output. SAE correction factors adjust the raw numbers to a standard set of atmospheric conditions, making it possible to compare a dyno pull in Denver (5,000 feet above sea level, thin air) to one in Miami (sea level, dense air). When comparing dyno sheets, always check whether the numbers are corrected or uncorrected, and whether the test was done on a chassis dyno (wheel figures) or an engine dyno (crank figures). Mixing the two without accounting for drivetrain losses is the most common source of confusion in torque comparisons.
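One commonly cited form of the SAE J1349 correction factor, referenced to 990 hPa of dry air at 25 °C, looks like this (a sketch from the published formula; real dyno software also handles humidity when deriving the dry pressure, which is omitted here):

```python
import math

def sae_j1349_cf(dry_pressure_hpa: float, inlet_temp_c: float) -> float:
    """SAE J1349 atmospheric correction factor (common published form).
    Corrected output = observed output * CF. Reference: 990 hPa dry, 25 C."""
    return (1.180 * (990.0 / dry_pressure_hpa)
            * math.sqrt((inlet_temp_c + 273.0) / 298.0) - 0.180)

# At reference conditions the factor is 1.0; in Denver's thinner air
# (lower dry pressure), the factor rises above 1.0 to compensate.
```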

