Thermal sensors detect temperature by converting heat into an electrical signal, typically a change in voltage or resistance. Every type works on the same basic principle: a physical property of some material shifts predictably as it gets hotter or colder, and electronics translate that shift into a temperature reading. The differences between sensor types come down to which physical property they exploit and whether they need to touch the object being measured.
Contact Sensors: Thermistors and Thermocouples
The simplest thermal sensors are ones that physically touch whatever they’re measuring. These fall into two main categories.
Thermistors are small semiconductor devices whose electrical resistance changes with temperature. The most common type, called an NTC (negative temperature coefficient) thermistor, drops in resistance as it heats up. A circuit sends a tiny current through the thermistor and measures the resulting voltage. Since the resistance-temperature relationship is well-characterized, a microprocessor can convert that voltage into a precise temperature reading. Thermistors are what you’ll find in digital meat thermometers, medical thermometers, car engine sensors, and HVAC systems. They’re highly accurate, but only over a limited span, typically from around −50°C to about 300°C.
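The voltage-to-temperature conversion the microprocessor performs can be sketched with the Beta-parameter model, a common simplification of the full Steinhart-Hart equation. The divider layout, resistor value, and Beta constant below are hypothetical datasheet numbers, not from any particular part:

```python
import math

# Hypothetical circuit: 3.3 V supply, 10 kΩ fixed resistor on top,
# NTC thermistor to ground; we measure the voltage across the NTC.
V_SUPPLY = 3.3
R_FIXED = 10_000.0   # ohms
R0 = 10_000.0        # thermistor resistance at 25 °C (assumed datasheet value)
T0 = 298.15          # 25 °C in kelvin
BETA = 3950.0        # Beta constant (assumed datasheet value)

def thermistor_temp_c(v_measured: float) -> float:
    """Convert the voltage measured across the NTC into a temperature in °C."""
    # Solve the voltage divider for the thermistor's current resistance.
    r_ntc = R_FIXED * v_measured / (V_SUPPLY - v_measured)
    # Beta equation: 1/T = 1/T0 + (1/BETA) * ln(R/R0)
    inv_t = 1.0 / T0 + math.log(r_ntc / R0) / BETA
    return 1.0 / inv_t - 273.15
```

At half the supply voltage the thermistor's resistance equals the fixed resistor's, so the sketch reports the calibration temperature of 25°C; real firmware typically uses the three-coefficient Steinhart-Hart fit for better accuracy across the full range.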
Thermocouples work on an entirely different principle discovered by Thomas Seebeck in 1821. When two wires made of different metals are joined at one end and that junction is heated, electrons flow from the hot side toward the cold side, generating a small voltage. The hotter the junction gets relative to the other end, the higher the voltage. This is called the Seebeck effect. Thermocouples are rugged and can handle extreme temperatures, which is why they’re the sensor of choice in industrial furnaces, jet engines, and kilns. Some types work reliably above 1,500°C, far beyond what a thermistor can handle.
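In practice a readout circuit must also know the temperature of the "cold" reference end, because the Seebeck voltage encodes only the difference between the two ends. A minimal sketch, assuming a type-K thermocouple approximated by a constant sensitivity of roughly 41 µV/°C (the real response is a tabulated polynomial, not a straight line):

```python
SEEBECK_UV_PER_C = 41.0  # approximate type-K sensitivity near room temperature

def thermocouple_temp_c(v_uv: float, cold_junction_c: float) -> float:
    """Convert a thermocouple voltage (in microvolts) to the hot-junction
    temperature, adding back the cold-junction reference temperature."""
    return v_uv / SEEBECK_UV_PER_C + cold_junction_c
```

The cold-junction temperature usually comes from a small thermistor on the instrument's own board; real instruments replace the single slope with the NIST ITS-90 reference polynomials.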
Non-Contact Sensors: Reading Infrared Radiation
Every object above absolute zero emits infrared radiation. The warmer the object, the more radiation it gives off. Non-contact thermal sensors detect this radiation from a distance and calculate the object’s surface temperature without ever touching it.
The handheld infrared thermometer you might point at a forehead or a hot pipe uses this approach. A lens focuses incoming infrared energy onto a small detector element, often a thermopile, which is a stack of tiny thermocouples wired together. The infrared radiation heats one side of these thermocouples while the other side stays near ambient temperature, generating a voltage proportional to the incoming heat. Electronics convert that voltage into a temperature displayed on screen.
One important detail with these sensors is the distance-to-spot ratio. An IR thermometer measures the average temperature across a circular area, not a single point. A sensor with a 12:1 ratio measures a one-inch spot from 12 inches away. Move farther back, and the spot grows larger, blending the target’s temperature with its surroundings and reducing accuracy. Higher-end models with a 50:1 ratio can measure a one-inch target accurately from over four feet away, which matters when you’re scanning electrical panels or equipment you can’t safely approach.
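The geometry is simple enough to express directly. A sketch of the spot-size arithmetic described above, assuming straight-line optics and ignoring the small near-field spot that real datasheets also specify:

```python
def spot_diameter(distance: float, ds_ratio: float) -> float:
    """Diameter of the circle an IR thermometer averages over,
    at a given distance, for a given distance-to-spot ratio."""
    return distance / ds_ratio

def max_distance(target_size: float, ds_ratio: float) -> float:
    """Farthest distance at which the spot still fits inside the target."""
    return target_size * ds_ratio
```

For example, a 12:1 sensor aimed at a one-inch pipe fitting must stay within 12 inches, while a 50:1 sensor can read the same fitting from 50 inches away.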
Thermal Imaging: Thousands of Sensors in a Grid
A thermal camera is essentially thousands of non-contact sensors arranged in a grid, each one reading a tiny patch of the scene. The dominant technology in modern thermal cameras is the microbolometer. Each pixel-sized microbolometer is a thin film of material suspended on a tiny bridge structure. When infrared radiation hits the film, it heats up slightly, and its electrical resistance changes. Readout electronics measure the resistance of every pixel many times per second, building a complete thermal image of the scene.
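Conceptually, the readout is a per-pixel resistance-to-temperature conversion repeated across the whole array. A toy sketch with an invented temperature coefficient of resistance (TCR) and reference resistance; real microbolometer readout also involves bias correction and per-pixel calibration maps:

```python
TCR = -0.02        # hypothetical: resistance drops 2% per °C of pixel heating
R_REF = 100_000.0  # hypothetical pixel resistance at the reference temperature, ohms

def frame_to_deltas(resistances: list[list[float]]) -> list[list[float]]:
    """Convert a 2-D grid of pixel resistances into per-pixel temperature
    deltas (°C) relative to the reference, forming one thermal frame."""
    return [[(r - R_REF) / (R_REF * TCR) for r in row]
            for row in resistances]
```

Running this over every frame the readout electronics capture, many times per second, is what turns a grid of resistors into a live thermal image.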
Thermal cameras come in two broad categories. Uncooled cameras use microbolometer arrays that operate at room temperature. They’re compact, affordable, and require no special cooling, which is why they dominate firefighting, building inspection, and security applications. Modern uncooled sensors can detect temperature differences as small as 50 millikelvin (0.05°C), with resolutions reaching 1280 × 1024 pixels. That’s sensitive enough to spot a fingerprint’s residual heat on a wall or detect moisture hiding behind drywall.
Cooled cameras use a different detector material chilled to cryogenic temperatures (often below −190°C) by a small internal cooler. The cooling dramatically reduces electronic noise, giving these cameras even finer sensitivity. They’re used in military targeting systems, satellite imaging, and scientific research where picking out subtle temperature differences at long range is critical. The tradeoff is cost, size, and the fact that the cooler eventually wears out.
How Sensors Correct for the Environment
Raw sensor readings aren’t the final answer. Every thermal sensor has to account for environmental factors that can skew results.
The biggest factor for infrared sensors is emissivity, a measure of how efficiently an object radiates heat compared to a theoretically perfect emitter. Human skin has an emissivity of about 0.98, meaning it radiates 98% of the maximum possible infrared energy for its temperature. That’s why forehead thermometers work so well on people. But shiny metal surfaces can have emissivities below 0.1, reflecting surrounding radiation instead of emitting their own. Point an IR thermometer at a polished stainless steel pipe without adjusting the emissivity setting, and you’ll get a wildly inaccurate reading.
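The correction an IR thermometer applies can be sketched with the Stefan-Boltzmann law: the detector sees the target's own emission plus ambient radiation reflected off its surface. This total-radiance model is a simplification (real instruments integrate over a limited wavelength band), and the example values are illustrative:

```python
SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W·m⁻²·K⁻⁴

def surface_temp_k(radiance: float, emissivity: float, ambient_k: float) -> float:
    """Recover the true surface temperature (kelvin) from total measured
    radiance, modeled as:
        measured = ε·σ·T⁴ (emitted) + (1 − ε)·σ·T_amb⁴ (reflected ambient)
    """
    reflected = (1.0 - emissivity) * SIGMA * ambient_k ** 4
    emitted = radiance - reflected
    return (emitted / (emissivity * SIGMA)) ** 0.25
```

With a low-emissivity target like polished steel, most of the measured signal is the reflected term, which is exactly why an unadjusted reading is so far off.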
Ambient temperature also matters. If the sensor itself heats up on a summer day, its baseline drifts. Sensors handle this through two main strategies: hardware-based compensation using internal reference temperature sensors and Wheatstone bridge circuits that cancel out the drift electrically, or software-based compensation where a microprocessor applies correction algorithms using data from an onboard ambient temperature sensor. Most consumer and industrial sensors use a combination of both.
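The software half of that strategy amounts to a calibration-referenced correction. A toy sketch, with a made-up drift coefficient and calibration point standing in for values a real manufacturer would characterize per device:

```python
DRIFT_PER_C = 0.02    # hypothetical: reading drifts 0.02 °C per °C above calibration
T_CALIBRATION = 25.0  # hypothetical ambient temperature at factory calibration

def compensate(raw_c: float, ambient_c: float) -> float:
    """Subtract the modeled baseline drift caused by the sensor's own
    temperature, using the onboard ambient sensor's reading."""
    return raw_c - DRIFT_PER_C * (ambient_c - T_CALIBRATION)
```

At the calibration temperature the correction vanishes; on a hot day the raw reading is pulled back down by the modeled drift.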
Passive Infrared Motion Detectors
Not all thermal sensors measure temperature. Passive infrared (PIR) sensors, the kind in home security systems and automatic lights, detect changes in infrared radiation rather than measuring its absolute level. A PIR sensor has a small pyroelectric element behind a segmented Fresnel lens. Pyroelectric materials generate a charge only when the radiation falling on them changes, so a steady scene produces no signal. When a warm body like a person walks through the sensor’s field of view, the infrared pattern across the lens segments shifts rapidly. The sensor registers this shift as a voltage spike and triggers an alarm or switches on a light.
PIR sensors don’t know the actual temperature of what they’re detecting. They only recognize that something warmer (or cooler) than the background has moved across their field of view. This is why they can be fooled by very slow movement or by objects that happen to match the ambient temperature closely.
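The triggering logic can be sketched as a simple threshold on how fast successive readings change, which also shows why slow movement slips through. The threshold and sample values here are illustrative:

```python
def pir_triggered(samples: list[float], threshold: float = 0.2) -> bool:
    """Fire when any two consecutive readings differ by more than the
    threshold, meaning the infrared pattern shifted quickly."""
    return any(abs(b - a) > threshold
               for a, b in zip(samples, samples[1:]))
```

A person crossing the field of view produces a large step between samples and trips the sensor; the same total change spread gradually across many samples never exceeds the per-step threshold, so very slow movement goes undetected.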
Choosing the Right Sensor Type
- Thermistors are best for precise, low-cost temperature measurement in a defined range, like monitoring electronics, room temperature, or body temperature.
- Thermocouples handle extreme heat and harsh environments where other sensors would fail.
- Infrared thermometers work when you need a quick surface reading without touching the target, from cooking to industrial maintenance.
- Thermal cameras map temperature across an entire scene, useful for finding heat leaks in buildings, locating people in smoke, or inspecting electrical systems for hot spots.
- PIR sensors detect movement cheaply and reliably for security and automation, though they don’t measure temperature at all.
The core physics behind all of these is remarkably simple: heat changes a material’s properties in a predictable way, and electronics measure that change. What separates a two-dollar thermistor from a fifty-thousand-dollar cooled thermal camera is the sensitivity, speed, and sophistication of how that basic measurement gets made.

