Why Have No Fusion Reactors Been Developed and Built?

Fusion reactors haven’t been built for commercial power because the engineering required to contain, sustain, and extract energy from a fusion reaction is extraordinarily difficult, and several critical problems remain unsolved simultaneously. Scientists have been producing fusion reactions in laboratories since the 1950s. The challenge is doing it in a way that produces more energy than it consumes, runs continuously, doesn’t destroy the reactor from the inside out, and costs less than the alternatives. No facility on Earth has yet checked all those boxes at once.

Plasma Is Incredibly Hard to Control

Fusion requires heating hydrogen fuel to over 100 million degrees Celsius, far hotter than the core of the sun. At these temperatures, matter becomes plasma, a superheated gas of charged particles that must be suspended in powerful magnetic fields because no physical container can withstand direct contact with it. The most common reactor design, the tokamak, uses donut-shaped magnetic fields to hold this plasma in place.

The core problem is that plasma is inherently unstable. It constantly tries to escape confinement, developing ripples, bulges, and sudden collapses called disruptions. One well-known culprit, the tearing instability, can cause a rapid and total loss of plasma confinement in milliseconds. Conventional control systems can suppress these instabilities after they form, but they aren’t fast enough to predict and prevent them in real time. Researchers at the DIII-D National Fusion Facility have recently developed AI-based controllers that forecast tearing instabilities far enough in advance to steer the plasma around them, but this technology is still being refined and hasn’t been proven in a full-scale power plant environment.

Neutrons Destroy the Reactor Walls

The fusion reaction between deuterium and tritium (the most achievable fusion fuel combination) produces neutrons carrying 14.1 million electron volts of energy. These neutrons slam into the reactor’s inner walls at extreme speeds, knocking atoms out of position in the structural material. Over time, this bombardment degrades the walls, making them brittle and swollen and causing them to trap hydrogen isotopes rather than let them pass through safely. The material literally falls apart from the inside.
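To put “extreme speeds” in numbers, here is a quick relativistic back-of-envelope calculation (constants rounded; the helper function is ours, introduced only for illustration):

```python
import math

# Rough relativistic estimate of how fast a 14.1 MeV fusion neutron travels.
# Neutron rest energy ~939.57 MeV; c = 2.998e8 m/s (both rounded).
NEUTRON_REST_ENERGY_MEV = 939.57
C = 2.998e8  # speed of light, m/s

def neutron_speed(kinetic_energy_mev: float) -> float:
    """Speed in m/s from relativistic kinetic energy."""
    gamma = 1 + kinetic_energy_mev / NEUTRON_REST_ENERGY_MEV
    beta = math.sqrt(1 - 1 / gamma**2)
    return beta * C

v = neutron_speed(14.1)
print(f"{v:.2e} m/s, about {v / C:.0%} of light speed")
```

The result is on the order of fifty thousand kilometers per second, roughly 17 percent of light speed, which is why even a thin flux of these neutrons steadily rearranges the atoms in a steel wall.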

No existing material can withstand years of this punishment at the intensities a commercial reactor would require. Making matters worse, researchers can’t fully test candidate materials under realistic conditions, because no current facility produces a neutron flux equivalent to what a working fusion plant would generate. Much of the testing relies on proxy experiments and simulations, using heavy ion beams as stand-ins for actual fusion neutrons, which leaves significant uncertainty about how materials will perform in a real reactor.

The Fuel Doesn’t Exist in Useful Quantities

Deuterium, one half of the fuel mix, is easily extracted from seawater. Tritium, the other half, is a different story. It’s radioactive, with a half-life of about 12 years, which means it decays too quickly to stockpile in large amounts. As of 2024, the entire global civilian supply of tritium is roughly 25 kilograms, most of it produced as a byproduct of Canadian fission reactors.

A single commercial fusion plant would burn through several kilograms of tritium per year. The leading solution is to have the reactor breed its own tritium by surrounding it with lithium blankets that absorb the escaping neutrons and produce new tritium in the process. The catch: this breeding technology hasn’t been demonstrated at scale, and you need an operational fusion reactor to do the breeding. It’s a chicken-and-egg problem that adds significant technical risk to any reactor design.
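The squeeze between decay and consumption can be sketched numerically. In this toy model the 3 kg/year draw is a placeholder for the article’s “several kilograms,” not a measured plant requirement; the stockpile both decays (half-life ~12.3 years) and gets burned:

```python
# Illustrative only: how long a ~25 kg tritium stockpile lasts if a plant
# draws from it while it decays. The burn rate is a hypothetical placeholder.
HALF_LIFE_YEARS = 12.3
DECAY_PER_YEAR = 0.5 ** (1 / HALF_LIFE_YEARS)  # fraction surviving each year

def years_until_exhausted(stock_kg: float, burn_kg_per_year: float) -> int:
    """Years until the stockpile is gone, decaying and burning each year."""
    years = 0
    while stock_kg > 0:
        stock_kg = stock_kg * DECAY_PER_YEAR - burn_kg_per_year
        years += 1
    return years

print(years_until_exhausted(25.0, 3.0))
```

Under those assumptions the entire civilian stockpile is gone in well under a decade, which is why closing the breeding loop is a prerequisite for any fleet of fusion plants rather than a refinement.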

Recent Breakthroughs Are Real but Narrow

The National Ignition Facility at Lawrence Livermore achieved a genuine milestone in December 2022, producing 3.15 megajoules of fusion energy from 2.05 megajoules of laser energy delivered to the target. By April 2025, that record had jumped to 8.6 megajoules of output from 2.08 megajoules of laser input, a target gain of 4.13. These results prove that fusion ignition works.

But these experiments use inertial confinement, where powerful lasers crush a tiny fuel pellet to trigger fusion. Each shot consumes a single pellet, and the lasers themselves require vastly more energy from the electrical grid than they deliver to the target. The 2.08 megajoules of laser light hitting the pellet required roughly 300 megajoules of electricity to generate. So while the fusion reaction itself produced net energy, the overall system consumed far more than it returned. And each shot is a one-off event, not a continuous power source.
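The arithmetic behind that distinction is simple. Using the figures quoted above, target gain and whole-system (“engineering”) gain tell very different stories:

```python
# Two ways to score the NIF result: target gain (fusion out / laser energy at
# the target) vs. an engineering gain that counts the grid electricity needed
# to fire the lasers. Figures are the ones quoted in the text.
fusion_out_mj = 8.6    # fusion yield, April 2025 shot
laser_in_mj = 2.08     # laser energy delivered to the target
wall_plug_mj = 300.0   # rough grid electricity to fire the lasers

target_gain = fusion_out_mj / laser_in_mj
engineering_gain = fusion_out_mj / wall_plug_mj
print(f"target gain ~{target_gain:.2f}, engineering gain ~{engineering_gain:.3f}")
```

A target gain above 4 coexists with an engineering gain of about 0.03; the system as a whole returns roughly 3 percent of the electricity it draws.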

ITER Is Decades Behind Schedule

ITER, the massive international tokamak being built in southern France by a consortium of seven members (China, the European Union, India, Japan, South Korea, Russia, and the United States), was originally proposed in the 1980s. Its schedule, rebaselined again in 2024, now targets the start of research operations in 2034, with full deuterium-tritium operation not expected before 2039. The project’s budget has ballooned repeatedly, and its timeline has slipped by over a decade from early estimates. ITER is designed to demonstrate that a tokamak can produce 10 times more fusion power than it consumes, but it is explicitly not a power plant. It won’t generate electricity. It’s a science experiment meant to prove the concept works at scale, with a commercial follow-up reactor still decades further out.

The ITER experience highlights a broader pattern: fusion projects are so large, so complex, and involve so many untested components that costs and timelines spiral in ways that are difficult to predict or control.

New Magnets Could Change the Economics

One of the most significant recent advances is in superconducting magnets. Traditional superconductors could theoretically achieve fusion, but only in reactors so large they’d never be economically viable. A newer material called REBCO (rare-earth barium copper oxide) operates at 20 kelvins rather than the near-absolute-zero temperatures older superconductors need, and it produces much stronger magnetic fields in a smaller package. After a successful demonstration in 2021, MIT researchers estimated that REBCO magnets reduced the projected cost per watt of a fusion reactor by a factor of nearly 40.

Stronger magnets mean smaller tokamaks, and smaller tokamaks mean lower construction costs and faster iteration. This is the technology behind Commonwealth Fusion Systems’ SPARC reactor, currently under construction and targeting 2027 for demonstrating net energy from fusion. If SPARC succeeds, it would be the first commercially relevant fusion device to produce more energy from the fusion reaction than it consumes to run.
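The reason stronger magnets shrink the machine is a scaling law: at fixed normalized plasma pressure, tokamak fusion power density grows roughly as the fourth power of the magnetic field. Using the published on-axis design fields for ITER (about 5.3 tesla) and SPARC (about 12.2 tesla) as a rough illustration:

```python
# Rough scaling sketch: at fixed normalized pressure, fusion power density in
# a tokamak grows roughly as B^4, so a stronger field allows a much smaller
# machine at a given power. Field values are published design figures.
B_ITER = 5.3    # ITER toroidal field on axis, tesla
B_SPARC = 12.2  # SPARC toroidal field on axis, tesla

density_ratio = (B_SPARC / B_ITER) ** 4
print(f"~{density_ratio:.0f}x higher fusion power density")
```

Roughly a factor of 28 in power density is what lets SPARC aim for ITER-class physics in a device a fraction of the size.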

Fusion May Struggle to Compete on Cost

Even if all the engineering problems are solved, fusion still has to be cheap enough to justify building. Analysis published in Energy Policy found that early fusion plant designs would likely produce electricity at more than $150 per megawatt-hour. For context, solar and wind energy are already significantly cheaper than that in most markets, and even new nuclear fission plants in Western countries, which have experienced notorious cost overruns, are projected at $80 to $100 per megawatt-hour with standardized construction.

For fusion to be competitive beyond 2040, it would need to hit that same $80 to $100 range. That’s a steep target for a first-of-its-kind technology with no supply chain, no standardized designs, and no operational track record. Costs typically come down with experience and mass production, but the first generation of fusion plants will be expensive by definition, and someone has to be willing to finance them.
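One way to see how steep that target is: a standard learning-curve (Wright’s law) sketch, in which cost falls by a fixed fraction with each doubling of cumulative deployment. The 10 percent learning rate here is an assumption chosen purely for illustration, not a fusion-specific estimate:

```python
import math

# Wright's-law sketch: how much cumulative build-out it takes to get from the
# early-plant estimate to the competitive range, assuming a 10% cost drop per
# doubling of capacity (an illustrative assumption, not a fusion estimate).
start_cost = 150.0    # $/MWh, early fusion plant estimate from the text
target_cost = 100.0   # $/MWh, competitive range from the text
learning_rate = 0.10  # assumed cost drop per doubling

doublings = math.log(start_cost / target_cost) / math.log(1 / (1 - learning_rate))
capacity_multiple = 2 ** doublings
print(f"~{doublings:.1f} doublings, ~{capacity_multiple:.0f}x cumulative build-out")
```

Even under this optimistic model, closing the gap takes nearly four doublings, a roughly fourteen-fold expansion of cumulative fusion capacity, every plant of which someone has to finance at above-market prices.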

Fusion Waste Is Better Than Fission, but Not Zero

Fusion is often described as “clean” energy, and compared to fission it is. The fusion reaction itself produces only helium, which is harmless. There’s no chain reaction that can melt down, and no long-lived spent-fuel waste like the plutonium isotopes and other actinides from fission reactors, some of which remain dangerous for tens of thousands of years.

However, fusion isn’t waste-free. Those high-energy neutrons that bombard the reactor walls activate the structural materials, turning ordinary steel components into radioactive waste. The specific isotopes produced (cobalt-60, nickel-63, manganese-54, among others) depend on the steel alloy used. This activated material is less dangerous and shorter-lived than fission waste, typically requiring storage on the order of decades to a century rather than millennia. But it still needs to be managed, and it adds to the cost and regulatory complexity of decommissioning a fusion plant.

Why It’s Taking So Long

Fusion requires simultaneous breakthroughs in plasma physics, materials science, superconductor technology, fuel supply, and power plant engineering. Progress in one area doesn’t help much if the others aren’t ready. You can build a perfect magnet, but if the wall materials can’t survive neutron bombardment, the reactor still fails. You can achieve ignition in a lab, but if you can’t breed tritium at scale, you can’t fuel a power plant.

Funding has also been a persistent constraint. For most of fusion’s history, government research budgets have been modest relative to the scale of the problem. A widely cited 1976 U.S. government projection plotted fusion development timelines against different funding levels; actual appropriations ended up tracking the curve labeled “fusion never.” Private investment has surged in recent years, with billions flowing into startups like Commonwealth Fusion Systems, TAE Technologies, and others, but these companies are still years from demonstrating commercial viability.

The honest answer is that fusion isn’t blocked by any single showstopper. It’s blocked by a dozen hard problems that all need to be solved together, in a package that’s economically competitive with energy sources that keep getting cheaper on their own.