What Is Tolerance in Manufacturing? Limits, Fits & GD&T

Tolerance in manufacturing is the maximum amount a part’s dimension can vary from its intended size and still function correctly. Every manufacturing process introduces small variations, so tolerances define the acceptable range of those variations. A part designed to be 50 mm long with a tolerance of ±0.2 mm, for example, can measure anywhere from 49.8 mm to 50.2 mm and still pass inspection.
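That pass/fail check is simple arithmetic; a minimal sketch in Python, using the 50 mm ± 0.2 mm example above:

```python
def within_tolerance(measured, nominal, tol):
    """Return True if a measured dimension falls inside nominal ± tol."""
    return nominal - tol <= measured <= nominal + tol

# The 50 mm ± 0.2 mm example from the text:
print(within_tolerance(49.8, 50.0, 0.2))   # True: right at the lower limit
print(within_tolerance(50.25, 50.0, 0.2))  # False: 0.05 mm over the upper limit
```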

Tolerances exist because no machine can produce a perfectly exact dimension every single time. They give manufacturers a controlled window of acceptable error, which keeps parts interchangeable, assemblies functional, and costs manageable.

Why Tolerances Matter

The most practical reason for tolerances is interchangeability. When you replace a single bolt, bearing, or bracket in a machine, you expect the new part to fit without custom adjustment. Tolerances make that possible by ensuring every copy of a part falls within the same dimensional range. Without them, every assembly would require hand-fitting, which is slow and expensive.

Tolerances also prevent problems during the design phase. Engineers use tolerance analysis to figure out how small variations in individual parts add up (called “stack-up”) when those parts are assembled together. This process helps eliminate the need for shims, spacers, and tooling revisions that add cost without adding structural value.
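In the simplest, worst-case form, stack-up is just the sum of the individual tolerances; a statistical (root-sum-square) stack is a common, less conservative alternative. A minimal sketch with hypothetical part tolerances (not from the text):

```python
def worst_case_stack(tolerances):
    """Worst-case stack-up: the individual ± tolerances add linearly."""
    return sum(tolerances)

def statistical_stack(tolerances):
    """RSS (root-sum-square) stack-up, assuming independent variations."""
    return sum(t ** 2 for t in tolerances) ** 0.5

# Three hypothetical stacked parts, each toleranced at ±0.1 mm:
tols = [0.1, 0.1, 0.1]
print(worst_case_stack(tols))   # ±0.3 mm possible total variation
print(statistical_stack(tols))  # ≈ ±0.17 mm expected statistical variation
```

The gap between the two results is why statistical methods often let designers assign looser, cheaper tolerances to individual parts.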

Types of Tolerances

Tolerances come in a few common formats, each suited to different situations.

Bilateral tolerance allows variation in both directions from the target dimension. A 25 mm dimension with ±0.1 mm tolerance means the part can be 24.9 mm to 25.1 mm. This is the most common format on production drawings.

Unilateral tolerance allows variation in only one direction. A hole might be specified as 25 mm with +0.2 / 0.0 mm, meaning it can be larger than 25 mm but not smaller. This is useful when a dimension needs to err on one side for functional reasons, like ensuring a shaft always slides into a hole.

Limit tolerance skips the nominal value entirely and just states the upper and lower boundaries directly: 25.0 mm to 25.2 mm. The total tolerance is the difference between those two numbers.
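All three formats reduce to the same thing, a lower and an upper limit; a small sketch that normalizes the two formats above (a limit tolerance already states the boundaries directly):

```python
def bilateral(nominal, tol):
    """Bilateral: 25 mm ± 0.1 mm → limits (24.9, 25.1)."""
    return (nominal - tol, nominal + tol)

def unilateral(nominal, plus, minus=0.0):
    """Unilateral: 25 mm +0.2 / 0.0 → limits (25.0, 25.2)."""
    return (nominal - minus, nominal + plus)

print(bilateral(25.0, 0.1))   # (24.9, 25.1)
print(unilateral(25.0, 0.2))  # (25.0, 25.2)
```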

Geometric Tolerances (GD&T)

Sometimes controlling a part’s size isn’t enough. A flat surface could be the right length and width but still be warped. A hole could be the right diameter but drilled in the wrong spot. Geometric Dimensioning and Tolerancing, commonly called GD&T, handles these situations by controlling the shape, orientation, and location of features rather than just their size.

GD&T uses standardized symbols on engineering drawings, each controlling a specific geometric property. Flatness ensures a surface doesn’t bow or warp beyond a set limit. Parallelism controls how much a surface can tilt relative to a reference surface. Position controls where a feature like a hole sits relative to other features or reference points. These tolerances fill the gaps that simple size tolerances can’t cover.

The governing standard in the United States is ASME Y14.5-2018, which was reaffirmed in 2024 and remains the current active edition.

How Fits Work Between Mating Parts

When two parts need to go together, like a shaft sliding into a hole, engineers specify a “fit” that defines the relationship between them. The tolerance on each part determines what kind of fit results.

  • Clearance fit: The shaft is always smaller than the hole across all tolerance conditions, so there’s always a gap. Think of a door hinge pin that needs to rotate freely.
  • Interference fit: The shaft is always slightly larger than the hole, so it must be pressed or heated into place. This creates a tight, permanent joint without fasteners.
  • Transition fit: Depending on where each part falls within its tolerance range, the result could be a slight clearance or a slight interference. These are used when precise alignment matters but the joint doesn’t need to be permanent.
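The fit type falls directly out of comparing the limit sizes of the two parts; a sketch with hypothetical hole and shaft limits:

```python
def classify_fit(hole_min, hole_max, shaft_min, shaft_max):
    """Classify the fit between a hole and shaft from their limit sizes (mm)."""
    if shaft_max < hole_min:
        return "clearance"     # shaft is always smaller than the hole
    if shaft_min > hole_max:
        return "interference"  # shaft is always larger than the hole
    return "transition"        # ranges overlap: either outcome is possible

print(classify_fit(25.00, 25.02, 24.95, 24.98))  # clearance
print(classify_fit(25.00, 25.02, 25.03, 25.05))  # interference
print(classify_fit(25.00, 25.02, 25.01, 25.03))  # transition
```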

Most industries use a “hole-basis” system, where the hole’s minimum size is fixed at the basic size and the shaft’s limits are adjusted to achieve the desired fit. This approach is simpler because holes are typically produced with fixed-size tooling such as drills and reamers, while a shaft’s diameter is easy to vary during machining.

Standard Tolerance Classes

Not every dimension on a drawing needs a custom tolerance callout. ISO 2768 defines general tolerance classes that apply to any dimension not individually specified. The three most common classes, from tightest to loosest, are fine, medium, and coarse.

For a part dimension between 30 mm and 120 mm, the fine class allows ±0.15 mm of variation, medium allows ±0.3 mm, and coarse allows ±0.8 mm. For smaller features between 0.5 mm and 3 mm, fine permits only ±0.05 mm while coarse allows ±0.2 mm. At the large end, a dimension between 1,000 mm and 2,000 mm gets ±0.5 mm under fine and ±3.0 mm under coarse.

Angular dimensions follow a similar pattern. A short feature (under 10 mm) in the coarse class can vary by ±1°30′, while a longer feature (over 400 mm) in the fine class is held to just ±0°5′. Engineers pick the tolerance class that matches the function of the part, using tighter classes only where they’re needed.
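The linear values quoted above fit naturally into a lookup table. A partial sketch (only the size ranges mentioned in the text; the medium values for the smallest and largest ranges are taken from ISO 2768-1, and the full standard covers more ranges than shown here):

```python
# General linear tolerances (± mm) by (lower, upper) size range in mm,
# limited to the ranges discussed in the text.
ISO_2768_LINEAR = {
    (0.5, 3):     {"fine": 0.05, "medium": 0.1, "coarse": 0.2},
    (30, 120):    {"fine": 0.15, "medium": 0.3, "coarse": 0.8},
    (1000, 2000): {"fine": 0.5,  "medium": 0.8, "coarse": 3.0},
}

def general_tolerance(dimension_mm, tolerance_class):
    """Look up the general ± tolerance for an unspecified dimension."""
    for (low, high), classes in ISO_2768_LINEAR.items():
        if low <= dimension_mm <= high:
            return classes[tolerance_class]
    raise ValueError("size range not covered by this partial table")

print(general_tolerance(50, "medium"))  # 0.3 → a 50 mm dimension gets ±0.3 mm
```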

What Different Processes Can Achieve

Your choice of manufacturing process sets the floor for how tight your tolerances can realistically be. CNC machining is the benchmark for precision, capable of holding tolerances as tight as ±0.025 mm. That’s about one-third the thickness of a human hair.

Standard 3D printing (FDM, SLA, SLS) typically achieves ±0.1 mm to ±0.5 mm, which is fine for prototypes and non-critical parts but too loose for precision assemblies. Industrial 3D printers can push down to ±0.025 mm to ±0.05 mm, though at significantly higher cost. Injection molding generally falls between CNC and consumer 3D printing, with achievable tolerances depending heavily on part geometry and material shrinkage.

Tighter Tolerances Cost More (Exponentially)

There’s a well-known cost curve in manufacturing: tightening tolerances increases cost slowly at first, then steeply. Research on tolerance allocation in mechanical assemblies illustrates this clearly. Tightening from a loose ±0.5 mm specification to about ±0.1 mm barely changes the cost per part. But tightening further to ±0.02 mm (an IT6 grade) roughly triples it, and pushing to ±0.01 mm can triple it again.

The cost increase comes from multiple directions. Tighter tolerances require slower machining speeds, more expensive tooling, more frequent tool replacement, higher scrap rates, and more time-consuming inspection. The most cost-effective approach is to use tight tolerances only on the dimensions that truly affect function and leave everything else at a general tolerance class.

How Temperature Affects Tolerances

Materials expand when heated and contract when cooled. For parts with tolerances in the hundredths of a millimeter, this thermal expansion can push a dimension out of spec. A steel part measured in a warm shop might read differently than the same part measured in a climate-controlled inspection room.

If thermal expansion contributes 5% or more of the total measurement uncertainty, it’s considered significant enough to account for. The practical steps to manage it:

  • Allow parts to stabilize at room temperature before measuring.
  • Keep inspection areas away from heat sources like windows, HVAC vents, and heat-generating equipment.
  • Monitor the part’s actual temperature rather than just the air temperature, since metal changes temperature more slowly than the surrounding air.
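The size of the effect is easy to estimate from the linear expansion relationship ΔL = α · L · ΔT; a sketch using a typical coefficient for carbon steel (≈11.7 µm per metre per °C, an assumed textbook value, not from the text):

```python
def thermal_expansion_mm(length_mm, delta_t_c, alpha_per_c=11.7e-6):
    """Change in length (mm) for a temperature change of delta_t_c (°C).

    alpha_per_c defaults to a typical coefficient for carbon steel.
    """
    return length_mm * alpha_per_c * delta_t_c

# A 100 mm steel part measured 5 °C above the 20 °C reference temperature
# grows by about 5.9 µm — already a quarter of a ±0.025 mm tolerance band:
print(thermal_expansion_mm(100, 5) * 1000, "µm")
```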

How Tolerances Are Verified

Simple tolerances on common features can be checked with handheld tools like calipers and micrometers. For tighter tolerances and complex geometries, manufacturers use Coordinate Measuring Machines (CMMs), which probe a part’s surfaces in three dimensions and compare the results to the design specifications.

A key principle in tolerance inspection is the uncertainty-to-tolerance ratio. Your measuring equipment needs to be significantly more precise than the tolerance you’re checking. The most common ratios range from 1:5 to 1:10, meaning if you’re verifying a ±0.1 mm tolerance, your CMM should be accurate to at least ±0.02 mm. Using a measurement tool that’s barely more precise than the tolerance itself makes it impossible to know whether a borderline part truly passes or fails.
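That check is easy to express directly; a small sketch, assuming the 1:5 minimum ratio from the text:

```python
def measurement_is_adequate(tolerance_mm, instrument_uncertainty_mm, min_ratio=5):
    """True if the instrument is at least min_ratio times more precise
    than the tolerance being verified (1:5 here; 1:10 is stricter)."""
    return tolerance_mm / instrument_uncertainty_mm >= min_ratio

# Verifying ±0.1 mm with a CMM accurate to ±0.02 mm → 5:1, acceptable:
print(measurement_is_adequate(0.1, 0.02))   # True
# The same CMM checking a ±0.05 mm tolerance → 2.5:1, not adequate:
print(measurement_is_adequate(0.05, 0.02))  # False
```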