What Is the Probability of Nuclear War: Expert Estimates

No one can assign a single, precise probability to nuclear war, but expert estimates generally place the annual risk somewhere between 1 and 10 percent. That range sounds small in any given year, but it compounds over time in a way that makes the long-term outlook sobering. Even at 1 percent per year, the cumulative probability reaches roughly 50 percent over a human lifetime. At the higher end of the range, a nuclear exchange becomes statistically expected within a decade.

The risk isn’t theoretical. Nine nations currently possess an estimated 12,121 nuclear warheads, with about 2,100 kept on high alert, ready to launch on short notice. The high-alert figure rose by about 100 from the previous year. Understanding what drives this risk, how close we’ve already come, and what could trigger a future exchange helps put those probability estimates in context.

Why a Single Number Is Misleading

As Stanford engineer Martin Hellman has argued, quoting a probability of nuclear war without specifying a time frame is meaningless. Saying the risk is “low” obscures the math. If the annual probability is 1 percent, that sounds reassuring for any single year. But run that same 1 percent chance across 100 years, and you’d expect a nuclear war within that span. At 10 percent per year, the expected timeline shrinks to roughly a decade. In Hellman’s framing, over a long enough horizon, nuclear war at any meaningful annual probability becomes virtually certain unless the underlying risk is actively reduced.
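Hellman’s compounding argument can be checked in a few lines. This is a sketch under the simplifying assumption that the annual risk is constant and independent from year to year; `cumulative_risk` is an illustrative helper, not a function from any cited source:

```python
def cumulative_risk(annual_p: float, years: int) -> float:
    """Probability of at least one event over `years` years,
    given a constant, independent annual probability `annual_p`."""
    return 1 - (1 - annual_p) ** years

# 1 percent per year over a ~70-year lifetime: roughly a coin flip.
print(round(cumulative_risk(0.01, 70), 2))

# 1 percent per year across a century: nearly two-in-three odds.
print(round(cumulative_risk(0.01, 100), 2))

# 10 percent per year: more likely than not within a single decade,
# with a mean waiting time of 1 / p = 10 years (geometric distribution).
print(round(cumulative_risk(0.10, 10), 2))
```

The same arithmetic explains why “low annual risk” and “long-run safety” are not the same claim: any fixed nonzero annual probability drives the cumulative probability toward 1 as the horizon grows.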

This is the core tension: the probability in any given year may feel manageable, but humanity doesn’t get to play the odds once. We play them every year, forever, and we only need to lose once.

Where Expert Estimates Land

Formal probability estimates vary widely depending on who’s making them and what they’re measuring. Some researchers focus narrowly on the chance of an accidental launch from a false alarm. Others model the broader risk of deliberate escalation from a conventional conflict. Still others try to capture every pathway, from terrorism to miscalculation to cyberattack on early warning systems.

The estimates that appear most often in academic and policy literature cluster around 0.5 to 2 percent per year for any nuclear detonation in conflict, with full-scale strategic exchange (hundreds or thousands of warheads) being a smaller fraction of that. The Bulletin of the Atomic Scientists doesn’t publish a probability figure, but its Doomsday Clock serves as a rough barometer. It currently sits at 90 seconds to midnight, the closest it has ever been, where it has remained since 2023. The factors the Bulletin cited in January 2024 include Russia’s war in Ukraine, Moscow’s suspension of participation in arms control agreements, the war in Gaza, record global temperatures, and the disruptive potential of AI.

How Close We’ve Already Come

The historical record reveals multiple incidents where nuclear war was averted by minutes, by a single person’s judgment call, or by sheer luck. These near-misses are not hypothetical scenarios. They happened.

In 1995, a U.S. scientific rocket launched off the coast of Norway to study the northern lights. Russian radars read it as a nuclear missile fired from an American submarine, capable of reaching Moscow in 15 minutes. President Boris Yeltsin’s advisors opened the nuclear briefcase and placed the launch button on his desk, telling him Russia was under attack. Yeltsin had ten minutes to decide whether to order a retaliatory strike. Two minutes before his decision deadline, a senior officer confirmed the missile posed no threat. It turned out the U.S. had notified Russia about the launch in advance, but the message never made it up the chain of command.

In June 1980, a defective computer chip worth less than a dollar caused U.S. early warning systems to report that 2,200 Soviet missiles were inbound. National Security Adviser Zbigniew Brzezinski was woken and told to prepare President Carter for a retaliatory strike. Nuclear bomber crews started their engines. Missile crews opened their launch safes. The error was caught only when radar sites failed to confirm the incoming attack.

That same year, a maintenance worker at a Titan II missile silo in Arkansas accidentally dropped a socket wrench. It fell 70 feet, punctured the missile’s fuel tank, and caused an explosion that flung a 9-megaton nuclear warhead into a ditch 200 yards away. It did not detonate. These incidents illustrate that risk doesn’t come only from deliberate decisions by national leaders. Mechanical failures, software glitches, and communication breakdowns create their own pathways to catastrophe.

Today’s Escalation Triggers

The current geopolitical landscape contains several specific flashpoints. Russia’s updated nuclear doctrine, described by Vladimir Putin in September 2024, expanded the conditions under which Moscow would consider nuclear use. Under the revised framework, Russia would treat aggression by any non-nuclear state that has the participation or support of a nuclear-armed state as a “joint attack” on Russia. This language could, in theory, reframe Western military aid to Ukraine as grounds for nuclear response.

The doctrine also lowered the threshold for what counts as an attack warranting nuclear retaliation. Russia would now consider using nuclear weapons upon receiving “reliable information about a massive launch of air and space attack weapons” crossing its border, a category that explicitly includes cruise missiles, drones, and hypersonic weapons, not just ballistic missiles. Moscow also extended its nuclear umbrella to Belarus, reserving the right to use nuclear weapons in response to any attack on the Russia-Belarus Union State that “creates a critical threat to sovereignty,” even if that attack uses only conventional weapons.

The United States, for its part, maintains a policy of “calculated ambiguity” about when it would use nuclear weapons. The 2022 Nuclear Posture Review and subsequent congressional commission endorsed a strategy of flexible response and tailored deterrence, deliberately leaving adversaries uncertain about which attacks might trigger a nuclear response.

How Conventional Wars Go Nuclear

Military planners don’t think of escalation as a simple on/off switch. Research from Lawrence Livermore National Laboratory describes it as a continuum, beginning with the first failure of deterrence when an adversary starts a conventional war and ending with the last failure, a full nuclear exchange. The critical insight from their analysis is that each side in a conflict fights at its own perceived level of escalation, not a shared one. What one country sees as a limited, proportional strike, the other may interpret as a dramatic escalation.

Russia’s military thinking reflects this complexity. Its approach to escalation management involves what analysts describe as “dosing and calibrating” damage to sober an adversary without provoking uncontrollable retaliation. Russia’s escalation model moves through three phases: conventional strikes on military targets, conventional destruction of broader military and civilian infrastructure, and finally the use of smaller tactical nuclear weapons against critical targets, potentially followed by strategic nuclear weapons. The underlying assumption is that Russia can control the pace and keep a war limited, including by demonstrating a willingness to risk nuclear war as a way of constraining Western responses.

Whether that assumption holds under real wartime conditions is one of the most dangerous open questions in global security. History suggests that carefully planned escalation strategies rarely survive contact with the chaos, miscommunication, and emotional pressure of actual combat.

AI and the Speed of Decision-Making

A newer category of risk involves the integration of artificial intelligence into nuclear command and early warning systems. The core danger is speed: AI could compress decision timelines that are already dangerously short. When a president has ten minutes to decide whether an incoming missile is real, adding a machine-learning system that confidently identifies the threat could either prevent a disaster or cause one, depending on whether the system is right.

Current AI systems have well-documented weaknesses that are especially dangerous in this context. They can be thrown off by slight deviations in data they weren’t trained on, a property researchers call “brittleness.” They can absorb the biases of the humans who trained them, causing them to misread signals from certain countries or scenarios. And they can produce outputs that look highly confident even when they’re wrong, giving decision-makers false certainty in moments when doubt might be the thing that prevents a launch.

The Arms Control Association has warned that rushing AI into nuclear systems without accounting for these limitations could revive old problems like false positives, the exact kind of error that nearly started a war in 1980. An unaligned AI system operating in a nuclear context represents what one analysis described as a “sorcerer’s apprentice” problem: a powerful, autonomous tool following instructions that turn out to be incomplete or imprecise, with no margin for error.

What the Numbers Actually Mean for You

The honest answer is that nobody knows the exact probability of nuclear war, and anyone who claims to is overstating their confidence. But the range serious analysts work with, on the order of 1 percent per year, is not comforting once you understand cumulative risk. A 1 percent annual chance means roughly a 10 percent chance over any given decade and close to a coin flip over a lifetime. The structural factors that drive this risk (thousands of warheads on hair-trigger alert, expanding nuclear doctrines, eroding arms control agreements, new technologies compressing decision timelines) are currently moving in the wrong direction.

The Doomsday Clock’s position at 90 seconds to midnight reflects a judgment that the combination of active wars involving nuclear-armed states, collapsing diplomatic frameworks, and emerging technological risks has created the most dangerous environment since the clock was created in 1947. The risk is not that any leader wants nuclear war. It’s that the systems, doctrines, and technologies surrounding these weapons create enough opportunities for accident, miscalculation, and escalation that even rational actors can stumble into catastrophe.