A lidar sensor measures distance by firing rapid pulses of laser light and timing how long each pulse takes to bounce back. Every return is converted into a precise distance measurement, and thousands of these measurements per second build a detailed three-dimensional map of the surrounding environment. The technology shows up in everything from self-driving cars to iPhones to archaeological surveys beneath dense jungle canopy.
How Lidar Measures Distance
Lidar stands for Light Detection and Ranging. The core principle is called time of flight: the sensor emits a short burst of laser light lasting just a few nanoseconds, that light travels outward until it hits a surface, and a fraction of the photons reflect back to a detector on the sensor. The system records the exact departure and arrival times, then applies a simple formula: distance equals the photon’s round-trip travel time divided by two, multiplied by the speed of light.
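The time-of-flight arithmetic is simple enough to sketch in a few lines of Python (the round-trip time below is an invented example value, not from a real sensor):

```python
# Speed of light in a vacuum, in meters per second.
SPEED_OF_LIGHT = 299_792_458.0

def distance_from_round_trip(seconds: float) -> float:
    """Convert a pulse's round-trip travel time into a one-way distance.

    The pulse covers the sensor-to-target distance twice (out and back),
    so the total path length is halved.
    """
    return SPEED_OF_LIGHT * seconds / 2

# A return arriving 667 nanoseconds after the pulse left corresponds
# to a target roughly 100 meters away.
print(distance_from_round_trip(667e-9))
```

The nanosecond figures in the article follow directly from this formula: light covers about 30 centimeters per nanosecond, so centimeter-level accuracy demands sub-nanosecond timing.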
A scanning mechanism, usually a rotating or oscillating mirror, steers each pulse in a slightly different direction. This lets a single sensor sweep across a wide field of view, collecting distance measurements at hundreds of thousands of points per second. The result is a “point cloud,” a dense constellation of 3D coordinates that together form a precise digital model of whatever the sensor is pointed at.
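Each range measurement plus the mirror's two pointing angles defines one point in the cloud. A minimal sketch of that conversion (angle conventions vary between sensors; this one is illustrative):

```python
import math

def to_cartesian(distance: float, azimuth_deg: float, elevation_deg: float):
    """Convert a range and two pointing angles into an (x, y, z) point.

    Azimuth is measured in the horizontal plane, elevation above it;
    the sensor sits at the origin.
    """
    az = math.radians(azimuth_deg)
    el = math.radians(elevation_deg)
    x = distance * math.cos(el) * math.cos(az)
    y = distance * math.cos(el) * math.sin(az)
    z = distance * math.sin(el)
    return (x, y, z)

# A 10 m return measured straight ahead and level lands at (10, 0, 0).
print(to_cartesian(10.0, 0.0, 0.0))
```

Repeating this for every pulse as the mirror sweeps is what turns a stream of one-dimensional range readings into a three-dimensional point cloud.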
Key Hardware Inside the Sensor
Every lidar unit contains a few essential components working together:
- Laser source. Generates the light pulses. Most terrestrial systems use near-infrared wavelengths invisible to the human eye. Bathymetric (water-depth-mapping) lidar uses blue-green wavelengths that can penetrate clear water to depths of about 40 meters.
- Photodetector. Catches the returning photons and converts them into an electrical signal.
- Scanning mechanism. Directs each pulse outward across the scene, either with a spinning mirror or, in newer solid-state designs, with no moving parts at all.
- Timing electronics. Records departure and return times with nanosecond precision. A single outgoing pulse can produce multiple returns as it clips a tree branch, passes through, and then hits the ground, so the electronics need to log each return separately.
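One way to picture how the timing electronics separate those multiple returns: scan the digitized return waveform for local peaks above a noise threshold, treating each peak as one return. A toy sketch (the waveform samples and threshold here are invented):

```python
def find_returns(waveform, threshold):
    """Return the sample indices of local peaks above `threshold`.

    Each peak is treated as one return: canopy, understory, ground.
    Real timing electronics do this in hardware at nanosecond rates.
    """
    peaks = []
    for i in range(1, len(waveform) - 1):
        if waveform[i] > threshold and waveform[i - 1] < waveform[i] >= waveform[i + 1]:
            peaks.append(i)
    return peaks

# Three bumps: a branch, mid-level vegetation, then the ground.
signal = [0, 1, 8, 2, 0, 5, 1, 0, 0, 9, 3, 0]
print(find_returns(signal, threshold=4))  # -> [2, 5, 9]
```

The sample indices map back to arrival times, and each arrival time becomes a separate distance via the time-of-flight formula.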
Airborne lidar systems add a GPS receiver and an inertial measurement unit so the sensor knows its own exact position and orientation while in flight. That information gets fused with the distance data to place every point in real-world coordinates.
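Conceptually, that fusion rotates each sensor-relative point into the aircraft's orientation and then translates it by the GPS-derived position. A simplified heading-only sketch (a real system applies full pitch, roll, and yaw from the IMU):

```python
import math

def georeference(point_xyz, sensor_pos, heading_deg):
    """Place a sensor-relative point into world coordinates.

    Rotates the point by the platform's heading (yaw only, for brevity),
    then translates by the GPS-derived sensor position.
    """
    x, y, z = point_xyz
    h = math.radians(heading_deg)
    world_x = x * math.cos(h) - y * math.sin(h)
    world_y = x * math.sin(h) + y * math.cos(h)
    px, py, pz = sensor_pos
    return (world_x + px, world_y + py, z + pz)

# A point 5 m ahead of a sensor at (100, 200, 50) that is heading 90 degrees.
print(georeference((5.0, 0.0, 0.0), (100.0, 200.0, 50.0), 90.0))
```

Errors in position or orientation multiply with range, which is why airborne systems need survey-grade GPS and IMU hardware rather than consumer components.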
Lidar in Self-Driving Cars
Autonomous vehicles rely on lidar to build a real-time 3D map of everything around the car: other vehicles, pedestrians, cyclists, curbs, lane markers, and overhanging signs. Cameras can capture color and detail but lack native depth perception. A large object far away can occupy the same number of pixels as a small object nearby, making distance estimates unreliable. Radar measures depth well but at far lower resolution, so it struggles to distinguish between a person standing near a pole and the pole itself. Lidar combines high resolution with direct distance measurement, giving the vehicle’s software a sharp, dimensionally accurate picture of its surroundings.
That point cloud feeds into tasks like obstacle detection and avoidance, navigation, and simultaneous localization and mapping (SLAM), where the car figures out both where it is and what’s around it at the same time. All of this processing happens within fractions of a second, fast enough for the car to brake or steer in response to a pedestrian stepping off a curb.
Mapping, Forestry, and Archaeology
Because lidar pulses can slip through gaps in leaves and branches, the technology has transformed how researchers study forested and overgrown landscapes. Aerial lidar fired from a drone or airplane records returns from the tree canopy, mid-level vegetation, and the bare ground beneath. Software strips away the vegetation layers to produce a “bare earth” terrain model that can reveal building foundations, ancient roadways, earthworks, and agricultural field systems that are completely invisible from the air or on foot.
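The simplest version of that vegetation-stripping idea keeps only the lowest return in each horizontal grid cell, on the assumption that at least one pulse per cell reached the ground. A toy sketch (production bare-earth filters are considerably more sophisticated):

```python
def bare_earth(points, cell_size):
    """Keep the lowest-elevation point in each horizontal grid cell.

    `points` is a list of (x, y, z) tuples. Vegetation returns sit above
    the ground return in the same cell, so the minimum z per cell
    approximates the terrain surface.
    """
    ground = {}
    for x, y, z in points:
        cell = (int(x // cell_size), int(y // cell_size))
        if cell not in ground or z < ground[cell][2]:
            ground[cell] = (x, y, z)
    return list(ground.values())

# Two returns in the same 1 m cell: canopy at 18 m, ground at 2 m.
cloud = [(0.2, 0.3, 18.0), (0.7, 0.6, 2.0), (3.1, 0.4, 2.5)]
print(bare_earth(cloud, cell_size=1.0))
```

Interpolating the surviving ground points into a continuous surface yields the bare-earth terrain model that reveals buried roadways and earthworks.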
Before lidar, surveying forested archaeological sites meant slow, difficult pedestrian searches where GPS equipment barely functioned under the canopy and traditional aerial photography showed nothing but treetops. Over the past decade, drone-mounted lidar has made it possible to scan large forested areas quickly and at steadily improving resolution, uncovering sites in places like Hawai’i, the American Southwest, and Central American jungles that had gone undetected for centuries.
In forestry, the same multi-return capability lets managers measure canopy height, estimate timber volume, and monitor growth over time without cutting a single tree.
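Canopy height falls out of the same gridding idea: subtract the lowest return (ground) from the highest return (canopy top) in each cell. A minimal sketch under the same assumptions as above:

```python
def canopy_height(points, cell_size):
    """Estimate canopy height per grid cell as (highest z) - (lowest z).

    `points` is a list of (x, y, z) tuples mixing canopy and ground returns.
    """
    top, bottom = {}, {}
    for x, y, z in points:
        cell = (int(x // cell_size), int(y // cell_size))
        top[cell] = max(top.get(cell, z), z)
        bottom[cell] = min(bottom.get(cell, z), z)
    return {cell: top[cell] - bottom[cell] for cell in top}

# One cell with a 20 m canopy return over 2 m terrain: height of 18 m.
print(canopy_height([(0.5, 0.5, 20.0), (0.4, 0.6, 2.0)], cell_size=1.0))
```

Repeating the survey over successive years and differencing the heights gives growth monitoring without any fieldwork under the canopy.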
Lidar on Your Smartphone
Apple introduced a small lidar scanner on the iPad Pro in 2020 and later added it to the iPhone Pro lineup. The sensor is far less powerful than an automotive or aerial unit, but it serves a few practical purposes. It improves camera autofocus in low light by giving the phone a direct distance measurement instead of relying on contrast detection. It also powers augmented reality features, letting apps place virtual furniture in a real room with accurate scale and positioning.
For more hands-on uses, scanning apps let you generate a quick floor plan of a room, measure spaces for renovation projects, or create rough 3D models of small objects and interiors. The technology is well suited for indoor navigation and quick building documentation, especially for users who need a serviceable result without professional surveying equipment.
Two Common Laser Wavelengths
Most lidar sensors operate at one of two infrared wavelengths: 905 nanometers or 1,550 nanometers. The choice involves tradeoffs in range, weather performance, and eye safety.
The 905 nm wavelength is cheaper to produce and holds up better in rain and fog. In heavy rain, a 1,550 nm system may lose up to half its maximum detection range compared to a 905 nm system under the same conditions. Fog is even more punishing: visibility of just 200 meters can cause signal loss roughly 500 times greater at 1,550 nm than at 905 nm. Wet surfaces also reflect about 30% less light back to a 1,550 nm sensor, versus around 10% less for 905 nm.
The advantage of 1,550 nm is eye safety. Wavelengths above 1,400 nm are absorbed by the front layers of the eye before reaching the retina. That means the sensor can fire at higher power levels without risking eye damage, which translates to longer range under clear conditions. Many automotive manufacturers favor 1,550 nm for this reason, accepting the weather penalty in exchange for the ability to safely push detection range further.
How Weather Affects Performance
Lidar’s main vulnerability is atmospheric interference. Rain, fog, and snow scatter or absorb the laser pulses, reducing both the maximum detection range and the accuracy of distance measurements. Fog degrades performance more than rain because a cubic meter of foggy air contains roughly 1,000 times more suspended droplets than a cubic meter of rainy air, creating far more surfaces for the light to scatter off of.
In practical terms, heavy rain (around 98 mm per hour) at a target 20 meters away introduces a distance error of about 5 centimeters. That sounds small, but it grows at longer ranges and higher rain rates. Fog with visibility under 50 meters can push signal loss above 9 decibels and significantly increase false detection rates, where the sensor mistakes rain or fog droplets for solid objects. These limitations are a key reason why self-driving car systems pair lidar with radar and cameras rather than relying on any single sensor alone.
Solid-State Lidar and Falling Costs
Early automotive lidar units used bulky spinning mechanisms that cost tens of thousands of dollars and wore out over time. The industry is now shifting toward solid-state designs with no moving parts, making the sensors smaller, more durable, and dramatically cheaper. The global solid-state lidar market hit $2.2 billion in 2025 and is projected to grow at roughly 20% per year through 2035.
The cost target driving this growth is getting a sensor below $200, with a longer-term goal of approaching $100. At that price point, lidar stops being a feature reserved for luxury or experimental vehicles and becomes standard equipment. Solid-state designs also hold up better in the vibration and temperature extremes of everyday driving, which is critical for sensors expected to last the life of a car.

