Pilot error is a broad term for any mistake made by a flight crew member that contributes to an aviation accident or incident. It is cited as a factor in roughly 60 to 80 percent of aviation accidents, making it the single largest cause category in air safety. But the term is somewhat misleading. Modern aviation safety experts have largely moved away from blaming individual pilots, recognizing that most “pilot errors” result from a chain of contributing factors: fatigue, poor communication, confusing cockpit automation, organizational pressure, and cognitive biases that every human brain is vulnerable to.
How Aviation Defines Human Error
The FAA defines human factors as a “multidisciplinary effort to generate and compile information about human capabilities and limitations and apply that information to equipment, systems, facilities, procedures, jobs, environments, training, staffing, and personnel management.” In practice, this means the industry treats pilot error not as a personal failing but as a systems problem. When a crew makes a fatal mistake, investigators look at why the mistake was possible in the first place.
The numbers vary depending on the type of flying. A detailed FAA analysis of commercial aviation accidents from 1990 to 2002 found that about 45 percent of air carrier accidents (the large airlines you’d typically fly on) involved aircrew or supervisory error, while 75 percent of commuter aviation accidents did. General aviation, which covers private pilots and small aircraft, tends to have even higher rates of human-factor involvement. The gap reflects differences in training standards, cockpit technology, and organizational oversight.
The Swiss Cheese Model
One of the most influential ideas in accident investigation is the Swiss Cheese Model, developed by psychologist James Reason. It describes safety as a series of defensive layers, like slices of Swiss cheese. Each layer has holes representing weaknesses: a tired pilot, a confusing instrument display, a supervisor who approved a risky schedule. Most of the time, the holes don’t line up. A fatigued pilot catches their mistake because a copilot speaks up, or a warning system fires at the right moment. An accident happens when holes across multiple layers align simultaneously, creating a clear path from hazard to disaster.
Reason distinguished between “active failures,” which are the visible mistakes a pilot makes in the cockpit, and “latent failures,” which are hidden problems higher up in the system. A single latent failure, like an airline’s culture of discouraging junior officers from questioning captains, can create multiple opportunities for active failures down the line. This is why modern investigators almost never point to one person’s mistake as the sole cause. They trace the chain backward through supervision, organizational culture, and system design.
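To make the model’s logic concrete, here is a toy Monte Carlo sketch in Python. The layer names and “hole” probabilities are invented for illustration, not figures from any accident study; the point is only to show how independent defensive layers combine.

```python
import math
import random

# Toy Monte Carlo sketch of Reason's Swiss Cheese Model. The layer
# names and "hole" probabilities are illustrative assumptions, not
# figures from any accident study.
LAYERS = {
    "organizational culture": 0.10,  # latent: risky scheduling norms
    "supervision":            0.05,  # latent: approved a fatigued crew
    "crew cross-checking":    0.02,  # active defense: copilot speaks up
    "cockpit warnings":       0.01,  # active defense: alert fires in time
}

def accident_occurs(rng: random.Random) -> bool:
    """An accident requires every layer to fail at once -- the holes
    in all slices lining up to form a clear path through."""
    return all(rng.random() < p for p in LAYERS.values())

def simulate(n_flights: int, seed: int = 42) -> float:
    rng = random.Random(seed)
    return sum(accident_occurs(rng) for _ in range(n_flights)) / n_flights

if __name__ == "__main__":
    # With independent layers, the accident rate is just the product
    # of the per-layer failure probabilities (here 1e-06).
    print(f"Analytic accident rate:  {math.prod(LAYERS.values()):.1e}")
    print(f"Simulated accident rate: {simulate(5_000_000):.1e}")
```

Because the overall rate is a product, shrinking any single layer’s hole (say, training that makes the copilot more likely to speak up) multiplies through the whole chain, which is the model’s core argument for defense in depth.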
Cognitive Biases That Affect Pilots
Pilots are trained professionals, but they’re still subject to the same mental shortcuts and blind spots as everyone else. Several well-documented cognitive traps contribute to cockpit errors.
Plan continuation bias is one of the most dangerous. Often called “get-there-itis,” it’s the compulsion to press on with the original flight plan even when conditions have clearly changed. The FAA defines it as a fixation on the original goal or destination combined with a total disregard for any alternative course of action. A pilot flying into worsening weather who keeps thinking “it’ll clear up ahead” instead of diverting is experiencing this bias. It’s especially common when there’s personal or professional pressure to arrive on time.
Complacency sets in when a pilot has flown the same route or aircraft hundreds of times without incident. The routine itself becomes a risk factor, because the pilot stops actively monitoring for problems. This is closely related to a lack of situational awareness, where a crew loses track of where they are, what the aircraft is doing, or what conditions lie ahead.
The Dirty Dozen: Common Human Factors
The FAA and aviation industry use a framework called the “Dirty Dozen” to categorize the most common human factors behind errors. These apply to pilots, mechanics, and anyone else in the operation:
- Fatigue: reduced alertness from insufficient sleep or long duty periods
- Stress: both personal life stress and cockpit workload pressure
- Complacency: overconfidence bred by routine
- Distractions: anything that pulls attention from critical tasks
- Lack of communication: unclear or missing information between crew members or with air traffic control
- Lack of teamwork: failure to coordinate effectively in the cockpit
- Lack of assertiveness: a junior crew member not speaking up when they see a problem
- Lack of awareness: losing the big picture of the flight situation
- Lack of knowledge: gaps in training or unfamiliarity with aircraft systems
- Pressure: schedule demands, passenger expectations, or company culture pushing crews to take risks
- Norms: “we’ve always done it this way” shortcuts that bypass proper procedures
- Lack of resources: missing tools, outdated manuals, or insufficient support
Most accidents involve several of these factors interacting at once. A fatigued pilot under schedule pressure, flying with a copilot too intimidated to speak up, represents three or four holes in the Swiss cheese lining up simultaneously.
Automation Confusion
Modern cockpits are heavily automated, and that automation has dramatically reduced accident rates overall. But it has introduced a new category of error: automation surprise. This occurs when an automated system behaves differently than the pilot expects. Complex flight management systems operate in multiple modes (cruise, descent, landing approach, and so on), and their behavior changes significantly across those modes. “Mode confusion” happens when the system is in a different mode than the pilot assumes, so the pilot’s mental picture of what the aircraft is doing no longer matches reality.
The indications of a mode change can be subtle. In one well-studied scenario involving autopilot altitude capture, the only clue that the system had entered a vulnerable state was a small display window changing from “ALT” to blank. If the pilot then adjusted the pitch mode without realizing the altitude capture had disarmed, the aircraft could climb without limit. As researchers have noted, operators have limited memory and attention spans and should not be expected to track the internal state of complex systems perfectly. The solution lies in better interface design, not just better pilot attention.
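The dynamic is easy to see in a deliberately simplified state-machine sketch. The mode names and the disarm-on-pitch-change rule below are modeled loosely on the scenario described above, not on any real autopilot’s logic:

```python
from dataclasses import dataclass

# Deliberately simplified sketch of the altitude-capture scenario
# described above. Mode names and the transition rule are
# illustrative assumptions, not real avionics logic.

@dataclass
class Autopilot:
    pitch_mode: str = "VERT SPD"   # how pitch is currently controlled
    altitude_arm: str = "ALT"      # "ALT" = capture armed, "" = disarmed

    def set_pitch_mode(self, mode: str) -> None:
        # Changing the pitch mode silently disarms altitude capture.
        # The only cockpit indication is the arm window going blank --
        # the subtle cue that invites mode confusion.
        self.pitch_mode = mode
        self.altitude_arm = ""

    def will_level_off(self) -> bool:
        return self.altitude_arm == "ALT"

ap = Autopilot()
print(ap.will_level_off())     # True: pilot expects a level-off

ap.set_pitch_mode("IAS HOLD")  # pilot tweaks the climb...
print(ap.will_level_off())     # False: capture quietly disarmed; the
                               # aircraft climbs through the selected
                               # altitude without limit
```

The mismatch between the pilot’s mental model (“altitude capture is armed”) and the machine’s actual state is exactly what interface design has to make loud rather than subtle.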
Air France Flight 447, which crashed into the Atlantic in 2009, is one of the most studied examples of automation-related error. When ice crystals blocked the aircraft’s pitot tubes at high altitude, the airspeed readings became unreliable and the autopilot disconnected, abruptly handing control back to the pilots. Confusing stall warnings that cut in and out, along with breakdowns in crew communication, contributed to the crew’s inability to recover. The tragedy illustrated how quickly a situation can deteriorate when automation fails and the crew isn’t prepared for manual control under stress.
How the Industry Prevents Pilot Error
The primary training tool for reducing pilot error is Crew Resource Management, or CRM, which the FAA has required for airline operations since the 1990s. CRM focuses on five core areas: situational awareness, communication skills, teamwork, task allocation, and decision-making. It trains crews to brief each other thoroughly before flights, speak up when they see a problem (even if it means challenging a senior captain), debrief after flights to identify what went well and what didn’t, and resolve conflicts constructively rather than letting disagreements fester or go unspoken.
One of CRM’s most important contributions is normalizing assertiveness. Before CRM became standard, aviation had a long history of accidents caused by copilots who noticed something was wrong but stayed silent out of deference to the captain. CRM explicitly trains crew members in “inquiry, advocacy, and assertion,” giving them frameworks and cultural permission to push back when safety is at stake.
Fatigue Rules
Fatigue is so well established as an error factor that it has its own regulatory framework. Under current FAA rules, airline pilots cannot exceed 100 flight hours in any 28-day period or 1,000 hours in any 365-day period. Flight duty periods are capped at 60 hours in any seven consecutive days. Before any duty period, pilots must receive at least 10 consecutive hours of rest, including an opportunity for at least 8 hours of uninterrupted sleep. For longer flights requiring augmented crews, total flight time caps rise to 13 hours with three pilots and 17 hours with four.
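As a rough illustration, the two flight-time limits can be encoded as rolling-window checks. The function names and data shapes here are invented, and real Part 117 compliance involves far more (report times, acclimation, rest facility class), so treat this as a sketch only:

```python
from datetime import date, timedelta

# Sketch of the flight-time limits described above. Function and
# record shapes are invented for illustration; real compliance
# checks involve many more variables.

def flight_hours_in_window(log: dict[date, float], end: date, days: int) -> float:
    """Sum logged flight hours over the `days`-day window ending on `end`."""
    start = end - timedelta(days=days - 1)
    return sum(h for d, h in log.items() if start <= d <= end)

def check_flight_time_limits(log: dict[date, float], today: date) -> list[str]:
    violations = []
    if flight_hours_in_window(log, today, 28) > 100:
        violations.append("over 100 flight hours in 28 days")
    if flight_hours_in_window(log, today, 365) > 1000:
        violations.append("over 1,000 flight hours in 365 days")
    return violations

# Example: a pilot averaging ~3.7 hours/day trips the 28-day limit
# (3.7 * 28 = 103.6 hours).
log = {date(2025, 1, 1) + timedelta(days=i): 3.7 for i in range(28)}
print(check_flight_time_limits(log, date(2025, 1, 28)))
# ['over 100 flight hours in 28 days']
```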
Better Cockpit Recording
Investigators rely heavily on cockpit voice recorders to understand what a crew was thinking, saying, and responding to before an accident. These recordings reveal crew coordination, workload levels, fatigue indicators, and how pilots reacted to system alerts. Following a 2024 investigation at John F. Kennedy International Airport, the NTSB pushed for all aircraft carrying both a voice recorder and a flight data recorder to be retrofitted with units capable of storing 25 hours of audio, up from the previous two-hour standard. The FAA finalized a rule in early 2026 requiring this extended capacity on newly manufactured aircraft, though the retrofit requirement for existing planes remains an open recommendation.