An AV, or autonomous vehicle, is a car, truck, or shuttle that uses sensors, software, and computing power to drive itself with limited or no human input. These vehicles range from cars with basic driver-assist features like lane-keeping all the way to fully self-driving robotaxis already operating in cities across the U.S. and China. The technology sits on a spectrum, and where a vehicle falls on that spectrum determines how much (or how little) a human driver still needs to pay attention.
The Six Levels of Driving Automation
The automotive industry uses a standardized scale, SAE J3016, that classifies vehicles from Level 0 to Level 5 based on how much the car handles on its own versus how much falls on the human driver.
Level 0 is an ordinary car. The human does everything. The vehicle might have automatic emergency braking or standard cruise control, but these features either don’t respond to the driving environment or only intervene momentarily in emergencies.
Level 1 adds a single automated function that responds to road conditions, like adaptive cruise control that adjusts speed based on the car ahead, or lane-centering that keeps you in your lane. You still control the rest.
Level 2 combines those functions, automating steering and speed at the same time. A car that handles both in highway traffic is Level 2, and this is where most “self-driving” features on consumer vehicles sit today. Despite feeling advanced, Level 2 requires your full attention at all times because the car can hand control back to you without warning.
Level 3 is a significant jump. The vehicle handles all driving tasks in specific conditions, and you can genuinely look away from the road. But you must be ready to take over when the system requests it, typically within a few seconds.
Level 4 vehicles drive themselves entirely within a defined area or set of conditions, like a robotaxi operating within certain city neighborhoods. No human intervention is needed in those scenarios. If the vehicle encounters something outside its operational boundaries, it pulls over and stops safely on its own rather than asking a human to take the wheel.
Level 5 is fully autonomous driving everywhere, in all conditions, with no steering wheel or pedals required. No production vehicle has reached this level.
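To make the split of responsibility concrete, here is a rough sketch of the taxonomy as a lookup table. The wording is illustrative shorthand, not SAE’s official definitions.

```python
# Minimal sketch: the SAE J3016 levels as a lookup table.
# Field wording is illustrative shorthand, not SAE's official text.
from dataclasses import dataclass

@dataclass(frozen=True)
class AutomationLevel:
    name: str
    vehicle_handles: str
    human_must: str

SAE_LEVELS = {
    0: AutomationLevel("No Automation", "momentary warnings/interventions only", "drive at all times"),
    1: AutomationLevel("Driver Assistance", "steering OR speed", "drive and supervise"),
    2: AutomationLevel("Partial Automation", "steering AND speed", "supervise constantly, take over without warning"),
    3: AutomationLevel("Conditional Automation", "all driving in defined conditions", "take over within seconds when requested"),
    4: AutomationLevel("High Automation", "all driving within a defined area", "nothing inside that area"),
    5: AutomationLevel("Full Automation", "all driving everywhere", "nothing"),
}

# Example: what is still on the human at Level 2?
print(SAE_LEVELS[2].human_must)  # supervise constantly, take over without warning
```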
How an AV “Sees” the Road
Autonomous vehicles rely on layers of sensors working together to build a real-time picture of everything around them. The three core technologies are lidar, cameras, and radar, each contributing something the others can’t.
Lidar (Light Detection and Ranging) fires rapid laser pulses that bounce off objects and return to the sensor. The result is a detailed 3D point cloud, essentially a spatial map of the vehicle’s surroundings accurate to within a few centimeters. This gives the car precise depth perception, which is critical for measuring exact distances to other vehicles, pedestrians, and obstacles.
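The ranging behind each point is simple time-of-flight arithmetic: the pulse travels out and back at the speed of light, so the distance is half the round trip. A minimal sketch:

```python
# Minimal sketch of lidar time-of-flight ranging:
# a pulse travels to the object and back, so distance is half the round trip.
SPEED_OF_LIGHT = 299_792_458.0  # meters per second

def lidar_range(round_trip_seconds: float) -> float:
    """Distance to the reflecting object in meters."""
    return SPEED_OF_LIGHT * round_trip_seconds / 2.0

# A pulse returning after 200 nanoseconds implies an object about 30 m away.
print(f"{lidar_range(200e-9):.2f} m")  # ≈ 29.98 m
```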
Cameras capture rich visual data including color and texture. This is what allows the vehicle to read road signs, recognize traffic lights, interpret lane markings, and distinguish between types of objects. Cameras work much like human vision, making them essential for understanding context that a lidar point cloud alone can’t provide.
Radar uses radio waves to detect objects and measure their speed. It works reliably through rain, fog, and dust, giving it an advantage in conditions that degrade camera and lidar performance. Ultrasonic sensors handle close-range detection, like spotting a curb or object inches from the bumper during parking.
Most AV companies use a combination of all these sensors. The overlapping data creates redundancy: if one sensor type fails or struggles in a given situation, the others fill the gap.
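As a toy illustration of that redundancy, the sketch below averages range estimates from whichever sensors are currently reporting, weighted by made-up confidence values. Real fusion stacks use Kalman filters or learned models; this only captures the intuition.

```python
# Toy illustration of sensor redundancy: fuse range estimates from
# whichever sensors are reporting, weighted by assumed confidence.
# Real stacks use Kalman filters or learned fusion; this is the intuition only.
from typing import Optional

def fuse_ranges(lidar_m: Optional[float],
                camera_m: Optional[float],
                radar_m: Optional[float]) -> Optional[float]:
    # Hypothetical confidence weights; a real system adapts these to conditions.
    readings = [(lidar_m, 0.5), (camera_m, 0.2), (radar_m, 0.3)]
    valid = [(r, w) for r, w in readings if r is not None]
    if not valid:
        return None  # no sensor sees the object
    total_w = sum(w for _, w in valid)
    return sum(r * w for r, w in valid) / total_w

# Heavy rain scatters the lidar return; camera and radar still cover the gap.
print(fuse_ranges(None, 31.0, 29.5))  # ≈ 30.1 m
```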
The Software That Makes Decisions
Raw sensor data is useless without software to interpret it. The AV software stack typically follows a four-stage pipeline: perception, prediction, planning, and control.
Perception is where the vehicle figures out what’s around it. The software fuses data from all sensors to identify free drivable areas, the location of surrounding obstacles, and how fast those obstacles are moving. Prediction takes that information and forecasts what other road users will do next. Will that pedestrian step off the curb? Is the car in the next lane about to merge? Planning then maps a safe path forward, deciding when to accelerate, brake, change lanes, or yield. Control translates that plan into actual steering, throttle, and brake commands.
All four stages run continuously, many times per second, creating a loop where the vehicle constantly perceives, predicts, plans, and acts.
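In skeleton form, one tick of that loop might look like the sketch below. Every function and field name is a placeholder for illustration, not a real AV API, and the logic (a constant-velocity forecast and a 10-meter slow-down threshold) is deliberately simplistic.

```python
# Skeleton of one tick of the four-stage loop: perceive, predict, plan, control.
# All names are illustrative placeholders, not a real AV software stack.

def perceive(raw_frames):
    """Fuse sensor frames into a scene: obstacles with positions and velocities."""
    return {"obstacles": raw_frames.get("detections", [])}

def predict(scene):
    """Forecast each obstacle one step ahead (constant-velocity assumption)."""
    return [{**o, "pos": o["pos"] + o["vel"]} for o in scene["obstacles"]]

def plan(scene, forecasts):
    """Pick a target speed: slow down if any forecast lands within 10 m."""
    return {"target_speed": 5.0 if any(f["pos"] < 10.0 for f in forecasts) else 15.0}

def to_controls(trajectory, current_speed):
    """Translate the plan into throttle/brake commands."""
    error = trajectory["target_speed"] - current_speed
    return {"throttle": max(error, 0.0), "brake": max(-error, 0.0)}

raw = {"detections": [{"pos": 12.0, "vel": -3.0}]}  # a car 12 m ahead, closing at 3 m/s
scene = perceive(raw)
forecasts = predict(scene)            # forecast puts it at 9 m: too close
trajectory = plan(scene, forecasts)
print(to_controls(trajectory, current_speed=15.0))  # {'throttle': 0.0, 'brake': 10.0}
```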
Communicating Beyond the Car’s Own Sensors
A technology called V2X, short for vehicle-to-everything, lets autonomous vehicles share real-time information with other cars, traffic lights, road infrastructure, and even pedestrian devices. This extends an AV’s awareness beyond what its own sensors can physically detect. A connected intersection, for instance, might alert an approaching AV about a cyclist hidden behind a building. V2X is especially valuable for detecting road users that are out of the vehicle’s direct line of sight, a scenario where onboard sensors alone would be blind.
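The sketch below illustrates the idea with a made-up hazard alert. Real deployments use standardized message formats, such as SAE J2735’s Basic Safety Message, which are far richer than this.

```python
# Illustration only: a made-up V2X-style hazard alert. Real deployments use
# standardized messages (e.g., SAE J2735) with far richer wire formats.
from dataclasses import dataclass

@dataclass
class HazardAlert:
    sender: str           # e.g., a roadside unit at an intersection
    hazard_type: str      # "cyclist", "pedestrian", ...
    distance_m: float     # distance from the receiving vehicle
    in_line_of_sight: bool

def worth_slowing_for(alert: HazardAlert) -> bool:
    # The alert matters most when onboard sensors cannot see the hazard yet.
    return alert.distance_m < 50.0 and not alert.in_line_of_sight

alert = HazardAlert("intersection-rsu-7", "cyclist", 35.0, in_line_of_sight=False)
print(worth_slowing_for(alert))  # True: hazard is close and occluded
```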
What’s Limiting Full Autonomy
The biggest technical barrier to Level 5 driving is adverse weather. Rain, snow, fog, and dust degrade sensor performance across the board. Lidar pulses scatter in heavy rain. Cameras lose clarity. Even radar, which is relatively weather-resistant, can struggle with noise in severe conditions. This is a primary reason most commercial robotaxi services currently operate in cities with mild, dry climates.
Edge cases present another challenge. These are rare, unusual situations the software hasn’t been trained to handle, like a mattress falling off a truck, a traffic officer using nonstandard hand signals, or construction zones that change daily. Each edge case requires the system to make a judgment call it may never have encountered before, and the number of possible edge cases on open roads is essentially infinite.
Where Robotaxis Are Already Operating
Despite these challenges, commercial driverless ride-hailing services are live in multiple countries. In the United States, Waymo operates robotaxis in Phoenix, San Francisco, Los Angeles, Austin, and Atlanta. Tesla has launched a service in Austin as well.
China has the broadest deployment. Baidu’s Apollo platform runs robotaxis across ten cities including Beijing, Shanghai, Guangzhou, and Shenzhen. Several other Chinese companies, including Pony.ai, AutoX, and WeRide, operate in overlapping cities. WeRide has also expanded to Abu Dhabi in the UAE, and South Korea’s SW Mobility runs a service in Seoul.
These services typically operate within geofenced zones, specific neighborhoods or districts where the roads have been mapped in fine detail and the conditions are well understood. Expansion happens city by city, zone by zone, as companies validate performance in each new area.
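At its core, a geofence is a point-in-polygon test: is the vehicle’s position inside the service area? Production systems pair this with detailed HD maps, but the basic check looks something like the sketch below (the coordinates are made up).

```python
# Core of a geofence check: is the vehicle inside the service polygon?
# Production systems use detailed HD maps; this ray-casting test is the basic idea.
def inside_geofence(lon: float, lat: float, polygon: list[tuple[float, float]]) -> bool:
    inside = False
    j = len(polygon) - 1
    for i in range(len(polygon)):
        xi, yi = polygon[i]
        xj, yj = polygon[j]
        # Toggle each time a ray from the point crosses a polygon edge.
        if (yi > lat) != (yj > lat):
            if lon < (xj - xi) * (lat - yi) / (yj - yi) + xi:
                inside = not inside
        j = i
    return inside

# Hypothetical rectangular service zone; coordinates are illustrative.
zone = [(-112.10, 33.40), (-112.00, 33.40), (-112.00, 33.50), (-112.10, 33.50)]
print(inside_geofence(-112.05, 33.45, zone))  # True: request can be served
print(inside_geofence(-111.90, 33.45, zone))  # False: outside the zone
```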
Why It Matters
The core safety case for autonomous vehicles rests on a striking number: NHTSA’s National Motor Vehicle Crash Causation Survey attributed the critical reason behind 94% of serious crashes in the U.S. to the driver. Distraction, impairment, fatigue, and poor decisions account for the vast majority of traffic deaths. AVs don’t get drunk, tired, or distracted, which is why regulators view the technology as a potential path to dramatically fewer road fatalities.
The economic stakes are enormous. The global autonomous vehicle market is projected to exceed $2.3 trillion by 2030, up from roughly $565 billion expected in 2026. That growth is driven not just by passenger robotaxis but also by autonomous trucking, delivery vehicles, and industrial applications.
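Taken at face value, those two figures imply a steep compound annual growth rate, roughly 42% per year:

```python
# Back-of-the-envelope: compound annual growth rate (CAGR) implied by
# the 2026 and 2030 market projections cited above.
start, end = 565e9, 2.3e12   # USD
years = 2030 - 2026
cagr = (end / start) ** (1 / years) - 1
print(f"{cagr:.1%}")  # ≈ 42.0% per year
```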
How the U.S. Regulates Autonomous Vehicles
The U.S. federal framework for AVs has evolved gradually. NHTSA issued its first Federal Automated Vehicles Policy in 2016 and followed it with “A Vision for Safety,” which set expectations for companies developing the technology. In 2020, the agency launched AV TEST, a voluntary initiative where companies and states can submit testing data that the public can view through an interactive tool.
In 2025, the U.S. Department of Transportation unveiled an updated framework that includes amended reporting requirements for automated driving systems and a domestic exemption program allowing limited deployment of vehicles that don’t meet traditional safety standards designed for human-driven cars. Companies building AVs are still required to comply with federal motor vehicle safety standards and certify that their vehicles are free of unreasonable safety risks. Most day-to-day regulation of AV testing and deployment, including where robotaxis can operate and under what conditions, happens at the state level.