How to Measure Morse Taper With Calipers or Gauges

Measuring a Morse taper comes down to taking two diameter readings and one length measurement, then comparing those numbers to a standard reference table. Whether you’re trying to identify an unknown taper on a lathe spindle or verify that a new drill sleeve is within spec, the process uses basic shop tools most machinists already own.

What You’re Actually Measuring

A Morse taper is a shallow cone, with an included angle of roughly 3 degrees (about 1.5 degrees per side from the centerline), that locks tooling into spindles through friction. There are nine standard sizes (MT0 through MT7, plus a less common MT4.5), and they all share a taper ratio close to 1:20. The differences between sizes are in the large end diameter, small end diameter, and overall taper length. Getting these three numbers lets you identify the size and check whether the taper is accurate.

The taper-per-foot values for all standard sizes cluster tightly between about 0.599 and 0.632 inches per foot. That narrow range is why eyeballing the angle won’t tell you much. The diameters are what distinguish one size from another.

Quick Identification With Calipers

If you just need to figure out which Morse taper you have, a pair of digital or dial calipers will get you there. Measure the large end diameter (the widest point of the cone) and compare it to these standard values:

  • MT0: 9.045 mm (0.356 in)
  • MT1: 12.065 mm (0.475 in)
  • MT2: 17.780 mm (0.700 in)
  • MT3: 23.825 mm (0.938 in)
  • MT4: 31.267 mm (1.231 in)
  • MT5: 44.399 mm (1.748 in)
  • MT6: 63.348 mm (2.494 in)
  • MT7: 83.058 mm (3.270 in)

The gaps between sizes are large enough that even a rough caliper reading will tell you which taper you’re dealing with. An MT2 and MT3, for example, differ by about 6 mm at the large end. You won’t confuse them.

For a more reliable check, also measure the small end diameter and the taper length. The small end diameters range from 6.4 mm for an MT0 up to 69.85 mm for an MT7, and the taper lengths run from about 51 mm (MT0) to 254 mm (MT7). If all three measurements land close to the same row in the table, you’ve confirmed the size.
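The lookup above is easy to automate. Here is a minimal sketch in Python using the large-end diameters from the table; the function name and the 1 mm tolerance are illustrative choices, not part of any standard.

```python
# Identify the nearest Morse taper size from a large-end diameter reading.
# Table values (mm) are the standard large-end diameters listed above.

MT_LARGE_END_MM = {
    "MT0": 9.045,
    "MT1": 12.065,
    "MT2": 17.780,
    "MT3": 23.825,
    "MT4": 31.267,
    "MT5": 44.399,
    "MT6": 63.348,
    "MT7": 83.058,
}

def identify_morse_taper(large_dia_mm, max_error_mm=1.0):
    """Return the closest MT size, or None if nothing is within tolerance."""
    size, nominal = min(MT_LARGE_END_MM.items(),
                        key=lambda kv: abs(kv[1] - large_dia_mm))
    return size if abs(nominal - large_dia_mm) <= max_error_mm else None

print(identify_morse_taper(17.9))  # rough caliper reading on an MT2 shank -> "MT2"
```

Because the sizes are spaced several millimeters apart, even a generous tolerance like 1 mm gives an unambiguous match; a reading that lands outside tolerance for every size is a sign you are not looking at a Morse taper at all.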

Checking Taper Accuracy With Math

Knowing the size isn’t always enough. If you’re machining a taper or inspecting one for wear, you need to verify the angle is correct. The formula is straightforward. Measure the large diameter, the small diameter, and the distance between those two measurement points along the axis. Then calculate the taper per inch:

Taper per inch = (large diameter − small diameter) ÷ length

Multiply that result by 12 to get taper per foot, which is how Morse taper specs are traditionally listed. For reference, these are the standard taper-per-foot values:

  • MT0: 0.6246
  • MT1: 0.5986
  • MT2: 0.5994
  • MT3: 0.6024
  • MT4: 0.6233
  • MT5: 0.6315
  • MT6: 0.6257
  • MT7: 0.6240

If you need the actual half-angle of the taper (useful for setting a compound slide or a tailstock offset), take your taper per inch, divide by 2, and find the arctangent. For most Morse tapers, the half-angle works out to roughly 1.4 to 1.5 degrees.
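The two formulas above can be bundled into a few lines of Python. The MT3 example dimensions below (0.938 in large end, 0.778 in small end, 3.1875 in taper length) are assumed nominal values for illustration; only the large-end diameter appears in the table earlier in this article.

```python
import math

def taper_stats(large_dia, small_dia, length):
    """Taper per inch, taper per foot, and half-angle (degrees) from two
    diameter readings and the axial distance between them. Any consistent
    unit works; inches are used here to match the taper-per-foot table."""
    tpi = (large_dia - small_dia) / length          # taper per inch
    tpf = tpi * 12                                  # taper per foot
    half_angle = math.degrees(math.atan(tpi / 2))   # angle from centerline
    return tpi, tpf, half_angle

# Assumed nominal MT3 dimensions (inches)
tpi, tpf, half = taper_stats(0.938, 0.778, 3.1875)
```

With these inputs the taper per foot works out to about 0.6024, matching the MT3 row in the list above, and the half-angle comes out near 1.44 degrees, inside the 1.4 to 1.5 degree band typical of Morse tapers.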

Using a Sine Bar for Precision Work

When you need to verify a taper to tighter tolerances than calipers allow, a sine bar paired with gauge blocks and a dial indicator is the standard shop method. The process works for both internal and external Morse tapers.

Set the taper on a surface plate with a 5-inch sine bar positioned along its length. Stack gauge blocks under one end of the sine bar to approximate the expected taper angle. Then run a dial indicator across the top surface of the taper. If the reading holds steady as the indicator traverses the full length, the angle matches the gauge block setting. If it drifts, adjust the gauge block stack up or down until the indicator sweeps the entire surface with no movement.

Once you’ve zeroed the indicator, divide the final gauge block height by the sine bar length (typically 5 inches). That gives you the sine of the included angle. Use a calculator or trig table to convert that sine value back to degrees. To convert the angle into taper per foot for comparison against the spec, use the formula: taper per foot = 24 × tangent(half angle). This method can resolve angular errors well below a tenth of a degree, which matters when you’re chasing a tight fit between a spindle and its tooling.
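The height-to-angle conversion is a one-liner worth sanity-checking in code. This sketch assumes a 5-inch sine bar; the 0.2496 in stack height is an illustrative value that happens to correspond to an MT2 setup.

```python
import math

def sine_bar_taper(block_height, bar_length=5.0):
    """Convert a gauge-block stack height under a sine bar into the
    included angle (degrees) and taper per foot. bar_length is the
    sine bar's roll center distance."""
    included = math.degrees(math.asin(block_height / bar_length))
    tpf = 24 * math.tan(math.radians(included / 2))
    return included, tpf

# A stack of about 0.2496 in under a 5 in bar corresponds to an MT2 setup
inc, tpf = sine_bar_taper(0.2496)
```

Running this gives an included angle of about 2.86 degrees and a taper per foot of about 0.5994, which agrees with the MT2 value listed earlier.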

Using a Micrometer and Known Rings or Plugs

Another practical approach uses a micrometer with a precision-ground ring gauge (for external tapers) or plug gauge (for internal tapers). Slide the ring gauge onto the taper and scribe or note where it sits. Measure the diameter at that point with the micrometer. Then move to a second point a known distance away and measure again. The difference between those two diameters, divided by the distance between them, gives you the taper per inch directly.

For external tapers, you can also place two precision pins or rollers of equal diameter against the taper surface and measure across them with a micrometer. By taking this measurement at two different positions along the taper’s length, you get the two data points needed to calculate the taper rate. This technique is especially useful for tapers that are too small or too deeply recessed to reach with calipers.
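The over-pins arithmetic is worth seeing worked out. With equal-diameter pins, the pin geometry contributes the same offset at both positions and cancels when you take the difference, so the measurement difference over a known axial distance approximates the taper per inch at the shallow angles of a Morse taper. The readings below are made-up illustration values.

```python
# Two measurements over equal-diameter pins at two axial positions (inches).
# With equal pins, the pin offset cancels in the difference, so for shallow
# angles (m2 - m1) / distance closely approximates the taper per inch.

m1 = 1.3050       # over pins near the small end
m2 = 1.4556       # over pins 3.000 in closer to the large end
distance = 3.000  # axial distance between the two measuring positions

taper_per_inch = (m2 - m1) / distance
taper_per_foot = taper_per_inch * 12
```

These illustration numbers yield a taper per foot of about 0.6024, i.e. a taper that checks out against the MT3 specification.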

Morse Taper Gauge Sets

The fastest way to check a Morse taper in production or routine maintenance is with a dedicated taper gauge, either a plug gauge for internal tapers or a ring gauge for external ones. These are hardened, ground tools made to the exact Morse taper spec. You insert the gauge and check where a scribe line on the gauge aligns relative to the end of the socket or shank. If the line sits within the marked tolerance band, the taper is within spec. If it seats too deep or too shallow, the taper has been worn or machined incorrectly.

Taper gauges are the gold standard in machine shops because they test both the angle and the diameter simultaneously. A caliper measurement might confirm the right angle but miss a diameter that’s a few thousandths oversize. The gauge catches both errors at once. They’re available for all standard Morse taper sizes from tooling suppliers and are worth the investment if you regularly work with taper-mounted tooling.

Internal vs. External Tapers

Measuring an external Morse taper (like a drill shank or a lathe center) is simpler because you can access the surface directly with calipers or a micrometer. Internal tapers, like the bore of a lathe headstock or tailstock, are harder to reach. For internal measurements, a telescoping gauge or bore gauge gives you diameter readings at two points, and you calculate from there the same way.

A transfer method also works well for internal tapers. Coat the inside of the socket with a thin layer of layout dye, insert a known-good taper gauge or a test plug, rotate it slightly, and withdraw it. The contact pattern in the dye shows whether the taper is matching evenly. Full, even contact along the length means the angle is correct. A pattern that only touches at the large end or only at the small end indicates the socket’s taper rate is off. This bluing test doesn’t give you a number, but it’s the quickest way to determine whether an internal taper needs recutting.

Common Measurement Mistakes

The most frequent error is measuring diameter at the wrong points. Morse taper shanks often have a flat tang at the small end, and the large end may have a slight chamfer or radius. Both of these will throw off your readings if you accidentally include them. Take your diameter measurements on the clean, unbroken portion of the taper surface, away from any transitions.

Another common issue is not accounting for wear. A well-used spindle taper may measure correctly at the opening but be slightly enlarged deeper inside from years of inserting and removing tooling. A single diameter check won’t catch this. You need at least two measurements at different depths, or a bluing test, to confirm the taper is still uniform along its full length.