Surface temperature is measured using either contact methods, where a sensor physically touches the surface, or non-contact methods, where an infrared sensor reads the heat energy a surface emits. The right approach depends on what you’re measuring, how hot it is, and how precise you need to be. Each method has tradeoffs in cost, speed, and accuracy, and small details like the shininess of a surface or your distance from it can throw off a reading by several degrees.
Non-Contact Measurement With Infrared
Infrared thermometers (often called “temperature guns”) are the most common tool for quick surface readings. When you aim one at a surface, a lens and sensor collect the thermal radiation that surface naturally emits. The device converts that energy into a temperature reading. This happens almost instantly, making infrared thermometers ideal for situations where you can’t or don’t want to touch the surface, whether it’s a hot engine block, a stovetop, or a wall you’re checking for insulation gaps.
Every infrared thermometer has a distance-to-spot ratio that determines how large an area it actually measures. A typical consumer model has an 8:1 ratio, meaning at 8 feet away it reads a circle about 1 foot across. The ratio is unit-independent: a 12:1 model measures a 1-centimeter spot from 12 centimeters away, or a 1-inch spot from 12 inches. The displayed number is an average of all the temperatures within that circle. If you’re trying to find a small hot spot on a circuit board or pipe fitting, you need to either get closer or use a thermometer with a higher ratio. Standing too far back blends the target’s temperature with the cooler area around it.
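The spot-size arithmetic is simple enough to sketch. This illustrative helper just divides distance by the ratio; any length unit works, since only the ratio matters:

```python
def spot_diameter(distance, ds_ratio):
    """Diameter of the circle an IR thermometer averages over at a
    given distance, for a given distance-to-spot ratio.
    Unit-agnostic: feet in, feet out; centimeters in, centimeters out."""
    return distance / ds_ratio

# An 8:1 thermometer at 8 feet averages a circle 1 foot across...
print(spot_diameter(8, 8))   # 1.0
# ...but backing up to 16 feet doubles the spot to 2 feet, blending
# the target with its cooler surroundings.
print(spot_diameter(16, 8))  # 2.0
```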
Why Emissivity Matters
The biggest source of error with infrared thermometers is a property called emissivity, which describes how effectively a surface radiates heat. Emissivity is rated on a scale from 0 to 1. Most infrared thermometers ship calibrated to 0.95, which works well for organic materials, painted surfaces, and skin (human skin has an emissivity of about 0.98). Rough red brick comes in at 0.93. These surfaces all read accurately right out of the box.
Shiny, metallic surfaces are a different story. Highly polished aluminum has an emissivity as low as 0.04 to 0.06, roughly 15 to 25 times lower than the 0.95 default the thermometer assumes. Point an infrared thermometer at aluminum foil and you’ll get a wildly inaccurate number, because most of the energy reaching the sensor is reflected from surrounding objects (walls, ceiling, your own body heat) rather than emitted by the foil itself. Students learning to use these tools commonly get confused when measuring shiny cans or metal surfaces for exactly this reason.
If your thermometer has an adjustable emissivity setting, you can dial in the correct value for the material you’re measuring. Reference tables list emissivity values for hundreds of materials. If your thermometer doesn’t have that setting, or you’re unsure of the exact value, there’s a reliable workaround.
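The size of the error, and the correction an adjustable setting applies, follows from how the reading is formed. A common simplified model assumes the sensor responds to total Stefan-Boltzmann radiance (proportional to T⁴) and that all reflected radiation comes from surroundings at a known ambient temperature; real instruments use band-limited versions of this, so treat the sketch below as an illustration rather than a calibration procedure:

```python
def corrected_temp_c(displayed_c, true_emissivity,
                     set_emissivity=0.95, ambient_c=20.0):
    """Estimate the true surface temperature from an IR reading taken
    with the wrong emissivity setting.  Simplified radiance balance:
      set_e * T_disp^4 + (1 - set_e) * T_amb^4
        = true_e * T_surf^4 + (1 - true_e) * T_amb^4
    """
    t_disp = displayed_c + 273.15  # convert to kelvin
    t_amb = ambient_c + 273.15
    t_surf4 = (set_emissivity * t_disp ** 4
               + (true_emissivity - set_emissivity) * t_amb ** 4
               ) / true_emissivity
    return t_surf4 ** 0.25 - 273.15

# With the emissivity set correctly, no correction is needed:
print(round(corrected_temp_c(50.0, true_emissivity=0.95), 1))  # 50.0
# A foil surface (emissivity ~0.05) that displays a mild 25 C in a
# 20 C room could actually be far hotter, roughly 89 C in this model:
print(round(corrected_temp_c(25.0, true_emissivity=0.05), 1))
```

This is also why the low-emissivity error grows with the gap between target and room temperature: at ambient temperature the reflected and emitted components happen to agree.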
The Tape Trick for Shiny Surfaces
The simplest fix for measuring reflective surfaces is to apply a small piece of high-quality black electrical tape to the target, let it reach the same temperature as the surface, then measure the tape instead. Most electrical tape has an emissivity of about 0.95, which matches the thermometer’s default calibration. Scotch Brand 88 black vinyl tape, for example, has an emissivity of 0.96 across both common infrared wavelength ranges and is a go-to recommendation.
For larger areas, flat (matte) paint works well. Most non-metallic paints have emissivities between 0.90 and 0.95. The flatness of the paint matters more than its color. Glossy and metallic paints should be avoided because they lower emissivity. Apply two coats to ensure the paint is opaque to infrared energy. For situations where you need a temporary coating on a larger surface, spray-on powders like dye penetrant developer also fall in the 0.90 to 0.95 range when applied thickly enough.
One important safety note: if you’re measuring electrically energized equipment, only apply tape or coatings when the equipment is powered off, and use only approved materials to avoid interfering with normal operation.
Thermal Imaging Cameras
A single-point infrared thermometer gives you one number at a time. A thermal imaging camera gives you a complete heat map. Even a mid-range model like the FLIR E8 Pro produces 76,800 simultaneous temperature readings (one per pixel in its 320 × 240 image). High-end industrial cameras reach over 786,000 readings per image.
This makes thermal cameras far better at finding problems you didn’t know to look for. A hot spot on an electrical panel, an area of missing insulation, or an overheating bearing all show up as bright patches in the image. With a single-point thermometer, you’d have to aim at every component individually and could easily miss a small trouble area. Thermal cameras also work at longer distances with better accuracy, and close-up lenses can resolve details as small as 5 micrometers per pixel for inspecting tiny electronic components.
The tradeoff is cost. Single-point infrared thermometers run from $15 to $100 for consumer models. Thermal cameras start around $300 for basic models and climb into the tens of thousands for professional-grade units.
Contact Methods: Thermocouples and Probes
When you need high accuracy or are working with surfaces that defeat infrared methods (very shiny metals, extremely small targets, or environments with heavy steam or smoke), contact measurement is the better choice. A thermocouple probe pressed against a surface measures temperature through direct thermal conduction.
Several probe designs exist for different situations. Flat-leaf probes are the standard choice for measuring flat surfaces quickly. They use a thin, wide sensing element that maximizes contact area. Cement-on and self-adhesive thermocouples attach directly to a surface for continuous monitoring over time, which is useful in manufacturing or testing scenarios. Heavy-duty surface probes with handles are designed for high-temperature industrial work like foundry applications. Right-angle probes reach into tight spaces where a straight probe won’t fit.
The key to accurate contact measurement is ensuring the sensor reaches the same temperature as the surface. Thermistors, which are extremely sensitive temperature sensors, can have response times as short as a few milliseconds because they’re so small (some under 0.2 mm). Larger thermocouple probes take longer. In practice, holding the probe firmly against the surface for several seconds gives a stable reading. Poor contact pressure, or lifting the probe too soon, will give you a number somewhere between the surface temperature and the air temperature.
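How long “several seconds” needs to be follows from a first-order lag model, in which the sensor closes the remaining temperature gap exponentially with some time constant. The time constant below is a hypothetical value for illustration; real probes list theirs in the datasheet:

```python
import math

def settle_time_s(initial_c, surface_c, tau_s, tolerance_c=0.5):
    """Seconds until a first-order sensor with time constant tau_s
    reads within tolerance_c of the surface temperature.
    Model: T(t) = T_surf + (T_0 - T_surf) * exp(-t / tau)."""
    gap = abs(surface_c - initial_c)
    if gap <= tolerance_c:
        return 0.0  # already within tolerance
    return tau_s * math.log(gap / tolerance_c)

# A room-temperature probe (20 C) on a 100 C surface, with an assumed
# 1.5 s time constant, needs about 7.6 s to read within 0.5 C:
print(round(settle_time_s(20.0, 100.0, tau_s=1.5), 1))  # 7.6
```

Note the logarithm: halving the tolerance adds only one more time constant of waiting, which is why a few extra seconds of firm contact is usually all it takes.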
Environmental Factors That Affect Accuracy
No matter which tool you use, the environment around you introduces error. Wind is a significant factor. Moving air cools both the surface you’re measuring and the sensor itself, and research on miniaturized thermal cameras has shown that wind and solar radiation can shift readings by several degrees Celsius. For outdoor measurements or in drafty industrial settings, expect reduced accuracy and take multiple readings.
Ambient temperature also matters. There’s a direct, linear relationship between the surrounding air temperature and the reading your infrared sensor produces. A sensor that was calibrated in a 20°C lab will drift when used in a 40°C factory or a freezing outdoor environment. Higher-end instruments compensate for this internally, but budget models may not. Letting the thermometer acclimate to the environment for a few minutes before measuring helps.
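If you know roughly how your instrument drifts, that linear relationship can be corrected by hand. The coefficient below is purely illustrative; a real value would come from comparing the instrument against a reference at several ambient temperatures:

```python
def ambient_corrected_c(reading_c, ambient_c,
                        cal_ambient_c=20.0, drift_per_deg=0.05):
    """Subtract a linear ambient-drift term from a raw reading.
    drift_per_deg: degrees of reading error per degree of ambient
    deviation from the calibration temperature (hypothetical value)."""
    return reading_c - drift_per_deg * (ambient_c - cal_ambient_c)

# In a 40 C factory, a 50.0 C reading corrects down by 1.0 degree:
print(ambient_corrected_c(50.0, ambient_c=40.0))  # 49.0
# At the calibration temperature, no correction is applied:
print(ambient_corrected_c(50.0, ambient_c=20.0))  # 50.0
```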
Distance plays a role beyond just the spot size. As you move farther from a target, the air between you and the surface absorbs and emits its own infrared energy, especially in humid or dusty conditions. For the most accurate readings, measure from as close as practical while still keeping the target fully within the measurement spot.
Choosing the Right Method
- Quick checks on everyday surfaces (walls, pipes, food, HVAC ducts): a basic infrared thermometer with an 8:1 or 12:1 spot ratio is sufficient and costs under $50.
- Shiny or metallic surfaces: use a contact thermocouple probe, or apply electrical tape and measure with an infrared thermometer set to 0.95 emissivity.
- Scanning large areas for hot or cold spots (electrical panels, building envelopes, mechanical equipment): a thermal imaging camera saves significant time and catches problems a single-point reading would miss.
- Continuous monitoring over hours or days (manufacturing processes, lab experiments): adhesive or cement-on thermocouples provide a fixed, ongoing reading without requiring someone to hold a probe.
- Very small targets (electronic components, micro-scale research): thermal cameras with close-up lenses resolve down to 5 micrometers per pixel, far beyond what a handheld infrared thermometer can do.
Surface temperature is not the same as internal temperature. This distinction matters in medical contexts, where skin surface readings can differ from core body temperature by half a degree or more depending on the method, and in cooking, where the outside of food can be much hotter or cooler than the center. Infrared and contact probes measure what’s happening at the surface only. If you need to know what’s going on inside, you need a penetration probe or a different approach entirely.

