Measuring torque on a rotating shaft typically involves either attaching a sensor directly to the shaft or calculating torque indirectly from other measurable quantities like power, speed, or pressure. The right method depends on your accuracy requirements, budget, whether the shaft is accessible, and whether you need continuous monitoring or a one-time measurement.
Strain Gauge Torque Sensors
Strain gauges are the most widely used method for direct torque measurement on rotating shafts. The principle is straightforward: when a shaft transmits torque, it twists slightly, creating shear stress on the surface. Strain gauges bonded to the shaft detect that microscopic deformation and convert it into an electrical signal proportional to torque.
The gauges are arranged in a Wheatstone bridge circuit, typically oriented at 45 degrees to the shaft axis. This configuration is specifically chosen because it responds to torsional strain while rejecting signals from axial loads (pulling or pushing on the shaft) and bending forces. More sophisticated sensors use multiple bridge circuits to further cancel out the effects of temperature changes and other unwanted loads.
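To make the signal chain concrete, here is a minimal sketch of converting a full-bridge output to torque for a solid round shaft. It assumes an ideal four-gauge bridge at ±45 degrees, where the bridge ratio equals gauge factor × strain; the gauge factor, shaft diameter, and shear modulus below are illustrative values, not figures from any particular sensor.

```python
import math

def torque_from_bridge(v_out_per_v_ex, gauge_factor, shaft_dia_m, shear_modulus_pa):
    """Estimate torque (N·m) from an ideal full bridge of four gauges at ±45°.

    For this configuration the bridge ratio is gauge_factor × ε45, where ε45
    is the principal strain at 45° to the axis. Surface shear stress is then
    2·G·ε45, and torque = shear stress × J / r for a solid round shaft.
    """
    eps_45 = v_out_per_v_ex / gauge_factor          # principal strain at 45°
    shear_stress = 2.0 * shear_modulus_pa * eps_45  # Pa, at the shaft surface
    polar_moment = math.pi * shaft_dia_m**4 / 32.0  # J of a solid round shaft, m^4
    radius = shaft_dia_m / 2.0
    return shear_stress * polar_moment / radius

# Example (assumed values): 0.5 mV/V bridge output, 25 mm steel shaft, G ≈ 79.3 GPa
torque = torque_from_bridge(0.5e-3, 2.0, 0.025, 79.3e9)  # ≈ 122 N·m
```

The same arithmetic run in reverse is how such sensors are sized: pick a shaft diameter so that full-scale torque produces a strain the bridge can resolve comfortably.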
There are two main form factors. An in-line torque transducer is a self-contained unit installed between two sections of a drivetrain, connected by couplings on each side. These are factory-calibrated and highly accurate but require you to break the drivetrain to install them. Alternatively, you can bond strain gauges directly onto an existing shaft, which avoids modifying the mechanical setup but requires more skill to install and calibrate.
Getting Data Off a Spinning Shaft
The fundamental challenge with strain gauges on rotating shafts is getting the electrical signal from the spinning surface to your stationary data acquisition system. Three approaches exist.
Slip rings use physical brushes or contacts riding on rotating rings to maintain an electrical connection. They’re simple and reliable at lower speeds but introduce electrical noise and wear over time. Wireless telemetry systems mount a small transmitter on the shaft that broadcasts the strain signal to a nearby receiver. Modern versions don’t even need batteries: an inductive power supply delivers energy through a stationary loop antenna to a rotating antenna embedded in a collar assembly on the shaft, allowing continuous measurement without battery changes or cable connections.
For short-duration testing, battery-powered telemetry modules can be attached directly to the shaft. These are quick to set up but limited by battery life and the added rotating mass.
Calculating Torque From Twist Angle
If you know the material properties of your shaft, you can calculate torque by measuring how much the shaft twists under load. The governing relationship is:
Torque = (Angle of twist × Polar moment of inertia × Shear modulus) / Length
with the angle of twist in radians and the length being the distance over which the twist is measured.
In practice, you mount two encoders or optical sensors at a known distance apart along the shaft. As the shaft transmits torque, the section between the two sensors twists by a tiny angle. Measuring that angular difference, combined with the shaft’s diameter and material stiffness, gives you torque. This method works well on long shaft sections where the twist is large enough to measure reliably, such as propeller shafts on ships or long industrial drive shafts. On short, stiff shafts, the twist angle may be too small to detect accurately.
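The calculation above can be sketched in a few lines. The numbers here are illustrative, loosely in the range of a ship propeller shaft: a 300 mm solid shaft with encoders 2 m apart and 0.002 rad (about 0.11 degrees) of measured twist.

```python
import math

def torque_from_twist(twist_rad, shaft_dia_m, gauge_length_m, shear_modulus_pa):
    """Torque (N·m) from measured twist over a gauge length of solid round shaft."""
    polar_moment = math.pi * shaft_dia_m**4 / 32.0  # m^4
    return twist_rad * shear_modulus_pa * polar_moment / gauge_length_m

# Example (assumed values): 300 mm steel shaft (G ≈ 79 GPa), sensors 2 m apart,
# 0.002 rad of twist measured between them
t = torque_from_twist(0.002, 0.300, 2.0, 79e9)  # ≈ 62.8 kN·m
```

Run the same numbers on a short, stiff shaft (say 50 mm diameter over 0.2 m) and the twist for a comparable stress level drops to a fraction of this, which is why the method suits long shaft sections.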
Indirect Methods: Power and Speed
When you can’t attach anything to the shaft itself, torque can be derived from other measurements. The most common indirect method uses the relationship between power, torque, and rotational speed:
Torque = Power / (2π × Rotational speed)
Here power is the mechanical power in watts and rotational speed is in revolutions per second, giving torque in newton-metres. (If speed is in RPM, divide by 60 first; forgetting this conversion is a common source of error.)
If you’re measuring a motor, for instance, you can use the motor’s electrical power input (corrected for efficiency) and its RPM to estimate shaft torque. This is less accurate than direct measurement because it depends on knowing the efficiency, which itself varies with load and speed.
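As a quick worked example, with assumed values for the motor (the efficiency figure in particular would come from the motor’s datasheet or an efficiency map, since it varies with load and speed):

```python
import math

def shaft_torque_from_power(electrical_power_w, efficiency, speed_rpm):
    """Estimate shaft torque (N·m) from electrical input power and speed.

    Mechanical power = electrical power × efficiency; speed is converted
    from RPM to revolutions per second before applying T = P / (2π·n).
    """
    mech_power_w = electrical_power_w * efficiency
    speed_rps = speed_rpm / 60.0
    return mech_power_w / (2.0 * math.pi * speed_rps)

# Example (assumed values): 15 kW electrical input, 90% efficiency, 1450 rpm
t = shaft_torque_from_power(15_000, 0.90, 1450)  # ≈ 88.9 N·m
```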
For hydraulic systems, torque can be calculated from the pressure differential across a pump or motor and its displacement volume. The relationship is: Torque = (Pressure × Geometric displacement) / (2π). You measure pressure with standard transducers and look up the pump’s displacement from its datasheet. This approach is common in mobile equipment and hydraulic test stands where installing a torque sensor on the shaft would be impractical.
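The hydraulic version is the same one-liner with different inputs; the pressure and displacement below are illustrative, and the result is the theoretical torque before mechanical losses in the pump or motor:

```python
import math

def hydraulic_torque(delta_p_pa, displacement_m3_per_rev):
    """Theoretical torque (N·m) of a hydraulic pump/motor from ΔP and displacement."""
    return delta_p_pa * displacement_m3_per_rev / (2.0 * math.pi)

# Example (assumed values): 200 bar (2e7 Pa) across a 50 cm³/rev motor
t = hydraulic_torque(200e5, 50e-6)  # ≈ 159 N·m
```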
Non-Contact Magnetostrictive Sensors
Magnetostrictive torque sensors offer a genuinely non-contact alternative. They exploit a property of ferromagnetic materials: when steel is stressed (as it is during torque transmission), its magnetic permeability changes. A sensor coil positioned near the shaft surface detects this change and outputs a signal proportional to torque, without ever touching the shaft.
The appeal is obvious: no bonding strain gauges, no slip rings, no telemetry. Since the method works with common structural steels, it doesn’t require special shaft materials. However, this approach comes with significant practical requirements. The shaft material must be thoroughly characterized for its magnetic and mechanical properties before the sensor can be calibrated. The sensor is sensitive to misalignment with the shaft axis, so precise positioning hardware is essential. Testing has also revealed hysteretic behavior, meaning the sensor may give slightly different readings during loading versus unloading. For applications where these limitations are manageable, magnetostrictive sensors provide an economical, easy-to-install option that avoids the complexity of rotating electronics.
Alignment and Installation Tolerances
If you’re installing an in-line torque transducer, alignment between the transducer and the connected shafts is critical. Poor alignment introduces bending moments and side loads that corrupt your torque reading and can damage the transducer’s bearings. Research on transducer misalignment has established that for optimum results, parallel offset should be less than 0.1 mm and angular misalignment should stay below 0.003 degrees.
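A trivial pre-installation check against those tolerances might look like the following; the threshold values are the figures cited above, and the function itself is just an illustration, not part of any standard:

```python
def alignment_ok(parallel_offset_mm, angular_misalignment_deg):
    """Check measured misalignment against the cited tolerances:
    parallel offset < 0.1 mm, angular misalignment < 0.003 degrees."""
    return parallel_offset_mm < 0.1 and angular_misalignment_deg < 0.003

# A laser alignment tool reading 0.05 mm offset and 0.001° angle would pass
```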
Flexible couplings on either side of the transducer help accommodate small misalignments, but they aren’t a substitute for careful setup. Use dial indicators or laser alignment tools during installation. Also ensure the couplings are torsionally stiff enough that they don’t absorb part of the torque signal, which would cause the measured torque to read low.
Mount the transducer so it’s free from axial thrust loads where possible. If the drivetrain has thermal expansion during operation, allow for axial float on one side to prevent the transducer from becoming a structural load path it wasn’t designed for.
Sampling Rate for Dynamic Torque
Steady-state torque, like a constant load on a conveyor drive, is easy to capture at low sampling rates. Dynamic torque is another matter. Engine crankshafts, reciprocating compressors, and impact tools all produce rapid torque fluctuations that require faster data collection to capture accurately.
Sampling rates in published torque measurement studies range from 100 Hz to over 2,000 Hz. A baseline “gold standard” configuration uses 2,222 Hz sampling with a 150 Hz low-pass filter to clean up noise. For many industrial applications, 1,000 Hz is sufficient to capture torque transients. Even 100 Hz can be acceptable for slower-changing signals, though you’ll miss high-frequency oscillations that may be important for vibration analysis or torsional fatigue assessment.
As a rule of thumb, your sampling rate should be at least 5 to 10 times the highest frequency component in the torque signal you care about. If you’re analyzing torsional vibrations at 200 Hz, sample at 2,000 Hz minimum. Apply a low-pass filter before digitizing to prevent aliasing, where high-frequency noise folds back into your data and creates phantom signals.
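To make the aliasing risk concrete, here is a small sketch (the function names are mine, for illustration): a tone above the Nyquist frequency folds back into the measurable band, so a 1,900 Hz torsional oscillation sampled at 2,000 Hz shows up as a phantom 100 Hz component, indistinguishable from a real one. This is why the low-pass filter must sit before the digitizer.

```python
def recommended_sample_rate(highest_freq_hz, factor=10):
    """Rule-of-thumb sampling rate: 5–10× the highest frequency of interest."""
    return factor * highest_freq_hz

def aliased_frequency(signal_hz, sample_rate_hz):
    """Apparent frequency of a tone after sampling (folds about Nyquist)."""
    folded = signal_hz % sample_rate_hz
    return min(folded, sample_rate_hz - folded)

# 200 Hz torsional vibration → sample at 2,000 Hz
rate = recommended_sample_rate(200)  # 2000
# 1,900 Hz noise sampled at 2,000 Hz appears as a phantom 100 Hz signal
phantom = aliased_frequency(1900, 2000)  # 100
```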
Calibration
A torque measurement is only as trustworthy as its calibration. For in-line transducers, calibration typically involves applying known torques using a lever arm and precision weights or a reference transducer, then recording the sensor output at multiple load points in both directions. This process reveals nonlinearity, hysteresis (the difference between loading and unloading readings), and zero drift.
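A simple way to reduce such a calibration run to numbers is a least-squares line fit, with the worst residual serving as a crude combined nonlinearity/hysteresis figure. The data below is hypothetical, purely to show the arithmetic; a formal calibration computes uncertainty per the applicable standard rather than a single residual.

```python
def fit_calibration(applied_torques, sensor_outputs):
    """Least-squares line output = slope·torque + offset, plus worst residual."""
    n = len(applied_torques)
    mx = sum(applied_torques) / n
    my = sum(sensor_outputs) / n
    sxx = sum((x - mx) ** 2 for x in applied_torques)
    sxy = sum((x - mx) * (y - my) for x, y in zip(applied_torques, sensor_outputs))
    slope = sxy / sxx
    offset = my - slope * mx  # nonzero offset indicates zero drift
    residuals = [y - (slope * x + offset)
                 for x, y in zip(applied_torques, sensor_outputs)]
    return slope, offset, max(abs(r) for r in residuals)

# Hypothetical calibration run: applied torque (N·m) vs. sensor output (mV/V)
applied = [0, 100, 200, 300, 400, 500]
output = [0.001, 0.201, 0.400, 0.602, 0.799, 1.000]
slope, offset, worst_residual = fit_calibration(applied, output)
# slope ≈ 0.002 mV/V per N·m; worst residual bounds the deviation from linear
```

Running the same fit separately on the loading and unloading points, and comparing outputs at each torque level, gives the hysteresis figure directly.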
International standards such as DIN 51309 define formal procedures for calibrating torque measuring devices, specifying how to calculate uncertainty and what error sources to account for. If your application involves quality control, compliance testing, or contractual obligations, calibrating to a recognized standard matters. For general monitoring or troubleshooting, a simpler field calibration with a known load may be sufficient, but you should still check zero offset and linearity at a few points across the measurement range.
Recalibrate periodically, especially after any mechanical change to the drivetrain or if the transducer has experienced an overload. Strain gauge sensors can drift over time as adhesive bonds age, and environmental factors like temperature cycling accelerate this process.

