A depth gauge measures how deep a hole, slot, or recess is by referencing a flat surface at the top. Every type works on the same basic principle: a flat base sits on the reference surface while a rod, probe, or needle extends into the feature you’re measuring. The reading tells you the distance between those two points. How you actually take that reading depends on which type of depth gauge you’re using.
Parts Common to All Depth Gauges
Regardless of type, every depth gauge has two essential parts: a flat reference base and a measuring element. The base (sometimes called the bridge) rests on the top surface of your workpiece, creating a stable zero-reference plane. The measuring rod, probe, or beam extends downward into the hole or groove you’re measuring. Most mechanical gauges also have a locking screw that lets you freeze the measurement in place so you can pull the tool out and read it without the value shifting.
How to Read a Vernier Depth Gauge
A vernier depth gauge has two scales: a main scale on the beam (graduated in millimeters or inches) and a smaller vernier scale on the slider. Reading it is a three-step process.
First, look at the main scale and find the last whole division that the zero line of the vernier scale has passed. That gives you your whole-number measurement. For example, if the vernier zero sits just past the 12 mm mark, your base reading is 12 mm.
Next, look along the vernier scale and find the single line that aligns most perfectly with a line on the main scale. Multiply that division's number by the gauge's resolution to get your decimal addition. If the third vernier division lines up on a gauge with 0.01 mm resolution, you add 0.03 mm; on a gauge with 0.02 mm divisions, the line labeled 3 (which sits 15 divisions from the vernier zero, since each labeled unit represents 0.10 mm) adds 0.30 mm. Your total reading would be 12.03 mm or 12.30 mm respectively.
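To make the arithmetic concrete, here is a minimal Python sketch of the combination step. The function name and arguments are illustrative only; they restate the rule above, not any manufacturer's procedure.

```python
def vernier_reading(main_scale_mm: float, aligned_division: int,
                    resolution_mm: float) -> float:
    """Combine the two scales of a vernier depth gauge.

    main_scale_mm: last whole main-scale value the vernier zero has passed.
    aligned_division: which vernier division (counted from the vernier zero)
        lines up with a main-scale line.
    resolution_mm: value of one vernier division (e.g. 0.01 or 0.02 mm).
    """
    return main_scale_mm + aligned_division * resolution_mm

# The example from the text: base reading 12 mm, third division aligned
print(round(vernier_reading(12.0, 3, 0.01), 2))   # 12.03
# On a 0.02 mm gauge, the line labeled 3 is the 15th division
print(round(vernier_reading(12.0, 15, 0.02), 2))  # 12.3
```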
The key to accuracy here is viewing the scales straight on. If you look at the scale from any angle other than directly perpendicular, you introduce parallax error, which means the apparent position of the vernier line shifts relative to the main scale. Always position your eye directly above and square to the reading point.
How to Read a Dial Depth Gauge
Dial depth gauges replace the vernier scale with a dial indicator, making them faster to read. The probe moves a needle around a circular face, typically graduated in 0.01 mm or 0.001-inch increments.
When the probe pushes inward (deeper into a feature), the needle moves clockwise and gives a positive reading. When the probe extends outward, the needle travels counterclockwise and gives a negative reading. Most dial depth gauges also have a small revolution counter on the face that tracks full rotations of the main needle, so you don’t lose count when measuring deeper features.
To take a reading, combine the revolution counter value (each full revolution typically represents 1 mm or 0.100 inches) with the position of the main needle. If the counter shows 5 and the needle points to 37 on a metric gauge with 0.01 mm graduations, your reading is 5.37 mm.
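The same combination in a short sketch, in the same hypothetical style as above (the values per revolution and per division are parameters because they vary between gauges):

```python
def dial_reading(revolutions: int, needle_division: int,
                 per_rev: float = 1.0, graduation: float = 0.01) -> float:
    """Combine the revolution counter and main needle of a dial depth gauge.

    per_rev: travel per full revolution (typically 1 mm or 0.100 in).
    graduation: value of one dial division (typically 0.01 mm or 0.001 in).
    """
    return revolutions * per_rev + needle_division * graduation

# The example from the text: counter at 5, needle at 37 on a metric gauge
print(round(dial_reading(5, 37), 2))  # 5.37
```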
How to Read a Digital Depth Gauge
Digital depth gauges display the measurement directly on an LCD screen, eliminating any interpretation of scales. You place the base on your reference surface, extend the probe into the feature, and read the number. Most digital models include buttons to switch between millimeters and inches, zero the display at any point, and hold a reading on screen. Some advanced models can output data directly to a computer or quality-control system.
The main thing to get right with a digital gauge is zeroing. Before measuring, place the base on a known flat surface (ideally a precision surface plate) and press the zero button. This establishes your reference point. If you skip this step or zero on an uneven surface, every measurement after that will carry the error forward.
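How a zeroing mistake propagates is easy to see in a toy model. This is not how the gauge's electronics actually work, just an illustration of the offset arithmetic:

```python
def displayed_depth(probe_travel_mm: float, zero_offset_mm: float) -> float:
    """What the gauge shows: probe travel minus whatever was stored at zeroing."""
    return probe_travel_mm - zero_offset_mm

# Zeroed correctly on a surface plate: stored offset is 0.00 mm
print(round(displayed_depth(4.82, 0.00), 2))  # 4.82
# Zeroed on a 0.05 mm burr: every later reading is off by the same 0.05 mm
print(round(displayed_depth(4.82, 0.05), 2))  # 4.77
```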
Zeroing Any Depth Gauge
Zeroing is the single most important step for accurate depth readings, regardless of gauge type. The standard procedure is to place the gauge base on a clean, flat surface, such as a surface plate or the face of a gauge block, and confirm the reading shows zero. On a vernier gauge, the zero lines of both scales should align. On a dial gauge, the needle should point to zero. On a digital gauge, press the zero button.
For depth micrometers (a precision variant), the process is more involved. After cleaning the measuring surfaces, you set the base on a surface plate and apply consistent measuring force using the ratchet. The zero line on the thimble should align with the index line on the sleeve. If the error is small (less than 0.02 mm), you can correct it by rotating the sleeve with a spanner key. Larger errors require loosening the thimble lock nut and manually realigning the zero marks before retightening. You should check zero every time you change the measuring range or swap rods.
Reading a Scuba Diving Depth Gauge
Analog scuba depth gauges work on water pressure rather than physical contact, but the reading principle is straightforward. These gauges typically have scales up to 60 meters (200 feet) and use two needles. The primary needle (usually black with a red tip) shows your current depth in real time. As you descend, it pushes a thinner secondary needle along with it. When you ascend, the primary needle drops back toward zero, but the secondary needle stays at the deepest point you reached during the dive.
This max-depth needle is your record of the dive’s deepest point, which matters for decompression planning. To reset it before a new dive, turn the slotted knob in the center of the gauge face counterclockwise until the thin needle meets the primary needle.
Accuracy varies with depth. In the shallower range (0 to 30 meters), expect readings accurate to within about 0.5 meters on a metric gauge, or 2 feet on an imperial one. Below 30 meters, accuracy loosens to about 1.75 meters (3 feet on imperial scales). This is normal and worth keeping in mind if you’re approaching no-decompression limits.
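To see what those tolerances mean in practice, here is a small helper that simply restates the metric figures above (it is not a formula from any dive-equipment standard):

```python
def gauge_tolerance_m(indicated_m: float) -> float:
    """Approximate accuracy band, restating the metric figures quoted above."""
    return 0.5 if indicated_m <= 30 else 1.75

indicated = 33.0
tol = gauge_tolerance_m(indicated)
print(f"Gauge reads {indicated} m; true depth is roughly "
      f"{indicated - tol:.2f} to {indicated + tol:.2f} m")
# Gauge reads 33.0 m; true depth is roughly 31.25 to 34.75 m
```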
Reading a Tire Tread Depth Gauge
Tire tread depth gauges are the simplest type to read. In the United States, tread depth is measured in 32nds of an inch, and most gauges display both 32nds and millimeters. You push the probe into a tread groove, press the base flat against the tread surface, and read the number on the scale or display.
New tires typically start at 10/32″ to 11/32″ of tread depth. The U.S. Department of Transportation recommends replacing tires at 2/32″, and many states legally require it. At 2/32″, the tire’s built-in wear bars (small raised bridges molded between the tread ribs) sit flush with the tread surface, giving you a visual confirmation of the gauge reading. For wet-weather safety, 4/32″ is the practical threshold where stopping distance starts to increase significantly.
When checking, measure in multiple spots across each tire. Tread wears unevenly depending on alignment, inflation, and driving habits, so a single reading can be misleading. The shallowest reading is the one that counts.
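If you record your measurements, the shallowest-reading rule and the unit conversion are simple to script. The thresholds below mirror the figures in this article; the function names are hypothetical:

```python
def tread_mm(depth_32nds: float) -> float:
    """Convert a tread depth in 32nds of an inch to millimeters."""
    return depth_32nds / 32 * 25.4

def tread_status(readings_32nds: list[float]) -> str:
    """Judge a tire by its shallowest groove, per the advice above."""
    worst = min(readings_32nds)  # the shallowest reading is the one that counts
    if worst <= 2:
        return "replace now"          # at or below the 2/32" threshold
    if worst <= 4:
        return "marginal in the wet"  # wet stopping distance starts to suffer
    return "ok"

# Three spots across one tire: outer, center, inner grooves
readings = [5, 4, 6]
print(tread_status(readings))             # marginal in the wet
print(round(tread_mm(min(readings)), 2))  # 3.17
```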
Common Mistakes That Affect Accuracy
The most frequent error across all gauge types is a dirty or damaged reference surface. Any debris, burr, or nick on the base or the workpiece surface adds directly to the measurement. Wipe both surfaces clean before every reading.
Parallax error is the second most common problem with analog gauges. Reading the scale from an angle makes the indicator appear to point at a different graduation than it actually does. Always read the scale with your line of sight perpendicular to the face.
Inconsistent pressure matters too, especially with dial and micrometer-type gauges. Pressing too hard can flex the probe or compress the workpiece slightly, giving a reading that’s deeper than reality. Use the ratchet stop if your gauge has one, or develop a light, consistent touch. If you get two different readings on the same feature, take a third and use the value that repeats.