The right way to measure thickness depends on what you’re measuring and how precise you need to be. A stack of paper, a steel plate, a coat of paint, and a human cornea all require completely different tools and techniques. This guide covers the most common methods, from simple household approaches to precision instruments, so you can pick the one that fits your situation.
The Stack-and-Divide Method for Thin Sheets
If you need to measure something too thin for a ruler, like a single sheet of paper, plastic film, or foil, the simplest approach is to stack many identical sheets together. Measure the total height of the stack with a ruler, then divide by the number of sheets. Fifty sheets of copy paper might measure about 5 mm total, giving you a per-sheet thickness of 0.1 mm. The more sheets you stack, the more accurate your result, because any small error in your ruler reading gets spread across a larger number of divisions.
This works for any material you can stack evenly: fabric layers, gaskets, shims, or plastic wrap. Just make sure the sheets are flat and compressed uniformly, with no air gaps inflating the measurement.
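The arithmetic behind the method is trivial, but it is worth seeing how the ruler error shrinks with the stack size. A minimal sketch, with illustrative numbers:

```python
# Stack-and-divide: per-sheet thickness from one stack measurement.
# The 5 mm / 50-sheet numbers below are illustrative, not from a datasheet.

def sheet_thickness_mm(stack_height_mm, sheet_count):
    """Average thickness of a single sheet in a uniform, gap-free stack."""
    return stack_height_mm / sheet_count

# Fifty sheets of copy paper measuring 5 mm total -> 0.1 mm per sheet
per_sheet = sheet_thickness_mm(5.0, 50)

# A +/-0.5 mm ruler-reading error is also divided by the sheet count,
# so with 50 sheets it contributes only 0.01 mm of per-sheet uncertainty.
error_per_sheet = 0.5 / 50
```

Doubling the stack to 100 sheets would halve the per-sheet uncertainty again, which is why more sheets beat a finer ruler.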
Calipers for General-Purpose Precision
A vernier caliper is the most versatile thickness tool you can own. It consists of a main scale and a sliding vernier scale, with jaws that close around your material. You can measure outside thickness, inside diameter, and depth, all with one instrument. Calipers come in three precision grades: 0.1 mm, 0.05 mm, and 0.02 mm, with the finest models resolving differences smaller than the thickness of a human hair.
Digital calipers display the reading on a screen and eliminate the need to interpret tiny scale markings. They’re the go-to choice for woodworking, 3D printing, metalworking, and general shop use. For most DIY and light manufacturing tasks, a caliper with 0.02 mm accuracy is more than enough.
To get a good reading, close the jaws gently around the workpiece without forcing them. Overtightening skews the measurement, especially on soft materials like wood, rubber, or plastic. Take readings at multiple points, since real-world materials aren’t always uniform.
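Reading a vernier caliper is itself a small piece of arithmetic: the main scale gives the whole millimeters, and the vernier line that aligns with a main-scale mark supplies the fraction. A sketch, assuming a metric 0.02 mm model (50-division vernier) and illustrative readings:

```python
# Combining a vernier caliper's two scales into one reading.
# Assumes a metric caliper with 0.02 mm resolution (50 vernier divisions
# spanning 1 mm); the 12 mm / line-17 values are illustrative.

def caliper_reading_mm(main_scale_mm, vernier_line, resolution_mm=0.02):
    """main_scale_mm: last full main-scale mark passed by the vernier zero.
    vernier_line: index of the vernier line that aligns with a main-scale mark."""
    return main_scale_mm + vernier_line * resolution_mm

# Main scale shows 12 mm and vernier line 17 aligns -> 12.34 mm
reading = caliper_reading_mm(12.0, 17)
```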
Micrometers for Higher Accuracy
When you need precision beyond what a caliper offers, a micrometer is the next step. It uses a finely machined screw mechanism: one full rotation advances the measuring spindle by 0.5 mm, and the thimble is divided into 50 graduations, so each graduation represents 0.01 mm. With an additional estimated reading (or a vernier scale on the sleeve), you can resolve down to 0.001 mm (one micron).
Micrometers are purpose-built. An outside micrometer measures the thickness of sheet metal, wire, or machined parts. Other types exist for inside diameters and depth, but the outside version is what most people mean when they talk about measuring thickness. If you’re checking whether a machined part meets a tight tolerance, or verifying the gauge of a wire, a micrometer is the standard tool.
The tradeoff is range. Most micrometers only cover a 25 mm span (0 to 25 mm, 25 to 50 mm, and so on), so you need to own the right size for your workpiece. Calipers typically cover 150 mm or more in a single tool.
Understanding Sheet Metal Gauge Numbers
In metalworking, thickness is often described by gauge number rather than inches or millimeters. This system is counterintuitive: higher gauge numbers mean thinner material. A few common reference points:
- Gauge 10: 0.135 inches (about 3.4 mm), a thick structural sheet
- Gauge 14: 0.075 inches (about 1.9 mm), common for automotive body panels
- Gauge 18: 0.048 inches (about 1.2 mm), typical for ductwork and appliances
- Gauge 22: 0.030 inches (about 0.76 mm), used in light enclosures and trim
- Gauge 26: 0.018 inches (about 0.45 mm), thin flashing and decorative sheet
These numbers apply to standard steel gauge. Aluminum and other metals use different gauge systems with different decimal equivalents, so always confirm which standard you’re working with before ordering material.
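A small lookup table makes the gauge-to-metric conversion explicit. The decimal equivalents below are nominal Manufacturers' Standard Gauge values for steel sheet, quoted as an assumption rather than from this guide; confirm them against your supplier's chart before ordering:

```python
# Nominal sheet-steel gauge thicknesses (Manufacturers' Standard Gauge).
# Assumed nominal values -- verify against your supplier's chart, and
# remember that aluminum uses a different gauge system entirely.

STEEL_GAUGE_IN = {
    10: 0.1345, 12: 0.1046, 14: 0.0747, 16: 0.0598,
    18: 0.0478, 20: 0.0359, 22: 0.0299, 26: 0.0179,
}

MM_PER_INCH = 25.4

def gauge_to_mm(gauge):
    """Convert a steel gauge number to millimeters."""
    return STEEL_GAUGE_IN[gauge] * MM_PER_INCH

thickness_10ga = gauge_to_mm(10)   # about 3.4 mm
thickness_26ga = gauge_to_mm(26)   # about 0.45 mm
```

Note that the table encodes the system's counterintuitive rule directly: a higher key maps to a smaller value.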
Coating Thickness on Metal Surfaces
Measuring the thickness of paint, powder coat, plating, or anodizing on a metal surface requires a specialized electronic gauge. These work without damaging the coating, and two different technologies cover most situations.
On steel and iron (ferrous metals), gauges use magnetic induction. A probe generates a changing magnetic field, and the strength of that field changes depending on how far the probe sits from the steel substrate. The coating acts as a spacer, so thicker coatings produce weaker magnetic interaction. The gauge translates that difference into a thickness reading.
On aluminum, copper, brass, and other nonferrous metals, gauges use eddy currents instead. A coil in the probe generates a high-frequency alternating magnetic field (above 1 MHz) that induces small electrical currents on the metal surface. The coating thickness affects the strength of those currents, which the gauge detects and converts to a measurement. Many modern gauges automatically detect the substrate type and switch between the two methods.
Ultrasonic Gauges for Measuring Through One Side
When you can only access one side of a material, like a pipe wall, a tank, or a ship hull, an ultrasonic thickness gauge is the solution. It sends a pulse of high-frequency sound into the material from a small handheld probe. The sound travels through the material, bounces off the far wall, and returns. The gauge measures the round-trip travel time and calculates thickness using a simple formula: thickness equals the speed of sound in that material multiplied by half the transit time.
Most ultrasonic testing uses frequencies between 500 kHz and 20 MHz. Thin materials are measured at higher frequencies for better resolution, while thick or sound-absorbing materials like rubber, fiberglass, and composites need lower frequencies that penetrate deeper. You program the gauge with the known speed of sound for whatever material you’re testing, and it handles the math.
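The pulse-echo calculation is simple enough to sketch. The 5,900 m/s longitudinal sound velocity for steel is an assumed textbook value; in practice you calibrate the gauge on a sample of known thickness:

```python
# Ultrasonic pulse-echo thickness: thickness = velocity * (round trip / 2).
# 5,900 m/s is an assumed textbook longitudinal velocity for steel --
# calibrate on a known-thickness sample before trusting the numbers.

def thickness_mm(velocity_m_per_s, round_trip_us):
    """Convert a round-trip transit time (microseconds) to wall thickness (mm)."""
    one_way_s = (round_trip_us * 1e-6) / 2.0   # sound crosses the wall twice
    return velocity_m_per_s * one_way_s * 1000.0  # meters -> millimeters

# A 3.39 microsecond echo in steel corresponds to roughly 10 mm of wall
wall = thickness_mm(5900.0, 3.39)
```

Entering the wrong material velocity scales every reading proportionally, which is why gauges store per-material velocity presets.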
These gauges are standard equipment for inspecting corrosion in pipelines, monitoring wear on pressure vessels, and checking wall thickness in castings. They’re nondestructive, portable, and accurate to fractions of a millimeter.
Measuring Body Fat With Skinfold Calipers
Skinfold calipers measure the thickness of a pinched fold of skin and underlying fat at specific body sites. The readings plug into equations that estimate overall body fat percentage. In the widely used Jackson-Pollock system, the three-site protocol for women measures the triceps (back of the upper arm), the suprailiac (just above the hip bone), and the thigh; the men's version swaps in the chest and abdomen for the first two. A more detailed seven-site protocol adds the chest, abdomen, subscapular (just below the shoulder blade), and midaxillary (armpit) sites.
Accuracy depends heavily on technique. All measurements should be taken on the right side of the body. Each site should be marked before measuring so it can be located consistently. Readings are taken three times at each site and averaged. Minimal pressure is applied so the skin fold isn’t artificially compressed. For the most reliable results, the same person should take all measurements, since different technicians can introduce variation even when following the same protocol.
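The average-then-estimate workflow can be sketched in code. This uses the widely published Jackson-Pollock three-site equation for women (triceps, suprailiac, thigh) together with the Siri conversion from body density to fat percentage; the coefficients are quoted from standard references rather than from this guide, so verify them before relying on the output:

```python
# Three-site skinfold estimate: average repeated readings per site, then
# apply the Jackson-Pollock (women, triceps/suprailiac/thigh) density
# equation and the Siri density-to-fat conversion. Coefficients are
# assumed from standard published sources -- verify before use.

def mean_of_three(a_mm, b_mm, c_mm):
    """Average the three repeated caliper readings taken at one site."""
    return (a_mm + b_mm + c_mm) / 3.0

def body_fat_percent(triceps_mm, suprailiac_mm, thigh_mm, age_years):
    s = triceps_mm + suprailiac_mm + thigh_mm        # sum of the three sites
    density = (1.0994921 - 0.0009929 * s
               + 0.0000023 * s * s - 0.0001392 * age_years)
    return 495.0 / density - 450.0                   # Siri equation
```

Feeding in the per-site averages rather than single readings is what absorbs the technician-to-technician variation the protocol warns about.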
Corneal Thickness in Eye Exams
Pachymetry measures the thickness of the cornea, the clear front surface of the eye. Normal human corneal thickness ranges from 420 to 625 microns, with an average of about 515 microns at the center. This measurement matters for two reasons.
First, corneal thickness directly affects the accuracy of eye pressure readings. If your cornea is thicker than about 540 microns, standard pressure tests tend to read artificially high, potentially triggering a false glaucoma scare. If your cornea is thinner than that, pressure readings come in deceptively low, which could mask real glaucoma. Second, pachymetry helps detect early signs of conditions like keratoconus (progressive thinning) and Fuchs dystrophy (progressive thickening). A cornea that reaches 700 microns, roughly a 40% increase, signals significant swelling.
Nanometer-Scale Films Using Light
At the smallest scales, in semiconductor manufacturing and optical coatings, thickness is measured using a technique called ellipsometry. It works by shining polarized light onto a surface and analyzing how the reflection changes the light’s polarization state. A thin transparent film on a reflective surface shifts the phase and amplitude of the reflected light in ways that depend on the film’s thickness and optical properties.
The process starts by measuring the bare substrate to establish a baseline. After a film is deposited, the reflection is measured again. The instrument compares two values: the phase shift between the two polarization components and their relative amplitude change. If the film’s optical properties are known, its thickness can be calculated from these measurements. Ellipsometry is accurate down to a fraction of a nanometer, making it essential for quality control in chip fabrication where layers may be only a few atoms thick.
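The single-film case can be modeled directly from Fresnel reflection coefficients. In the sketch below, the refractive indices (1.46 for an SiO2-like film, 3.88 − 0.02j for a silicon-like substrate at 632.8 nm) and the 70° angle of incidence are illustrative assumptions; real instruments fit thickness by matching the measured phase shift (Δ) and amplitude ratio (Ψ) against such a model:

```python
# Minimal single-film ellipsometry model (ambient / film / substrate).
# Indices and angle are illustrative assumptions, not instrument values.
import cmath
import math

def _fresnel(ni, nj, ci, cj):
    """s- and p-polarized reflection coefficients at one interface."""
    rs = (ni * ci - nj * cj) / (ni * ci + nj * cj)
    rp = (nj * ci - ni * cj) / (nj * ci + ni * cj)
    return rs, rp

def rho(n0, n1, n2, d_nm, wavelength_nm, aoi_deg):
    """Ellipsometric ratio rho = r_p / r_s for one film of thickness d_nm."""
    th0 = math.radians(aoi_deg)
    s0, c0 = math.sin(th0), math.cos(th0)
    c1 = cmath.sqrt(1 - (n0 * s0 / n1) ** 2)   # Snell's law in the film
    c2 = cmath.sqrt(1 - (n0 * s0 / n2) ** 2)   # ... and in the substrate
    rs01, rp01 = _fresnel(n0, n1, c0, c1)
    rs12, rp12 = _fresnel(n1, n2, c1, c2)
    beta = 2 * cmath.pi * n1 * c1 * d_nm / wavelength_nm   # phase thickness
    phase = cmath.exp(-2j * beta)
    rs = (rs01 + rs12 * phase) / (1 + rs01 * rs12 * phase)
    rp = (rp01 + rp12 * phase) / (1 + rp01 * rp12 * phase)
    return rp / rs

def psi_delta_deg(r):
    """Convert rho into the instrument's (psi, delta) angles, in degrees."""
    return math.degrees(math.atan(abs(r))), math.degrees(cmath.phase(r))

# 100 nm of film on the substrate at 70 degrees, HeNe laser wavelength
psi, delta = psi_delta_deg(rho(1.0, 1.46, 3.88 - 0.02j, 100.0, 632.8, 70.0))
```

A fitting routine would sweep `d_nm` until the modeled (Ψ, Δ) matches the measured pair, which is the "compare against the bare-substrate baseline" step described above.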