Nuclear fusion begins when hydrogen nuclei collide hard enough to merge, which requires temperatures around 15 million kelvin (about 27 million °F). In nature, this happens in the cores of stars once gravity compresses enough mass into a small enough space. In laboratories on Earth, scientists have to push temperatures even higher, to 100 to 200 million degrees Celsius, because they can’t replicate the crushing pressure found inside a star.
Why Fusion Needs Extreme Conditions
Atomic nuclei carry positive charges, and positive charges repel each other. This repulsion creates what’s called the Coulomb barrier, which acts like a wall keeping two hydrogen nuclei apart. To smash through that wall using thermal motion alone, you’d theoretically need temperatures around 11 billion kelvin. That’s far hotter than the center of any star.
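To see where that 11-billion-kelvin figure comes from, here is a rough back-of-the-envelope sketch in Python. It treats the barrier as the Coulomb energy of two protons brought to about one femtometer apart (a standard order-of-magnitude assumption, not a precise value) and asks what temperature gives particles that much average thermal energy.

```python
# Rough classical estimate: what temperature would let average thermal
# motion alone carry two protons over the Coulomb barrier?

k_B = 8.617e-5             # Boltzmann constant, eV per kelvin
e2_over_4pi_eps0 = 1.44e6  # Coulomb constant times e^2, in eV*fm
r = 1.0                    # assumed separation where the nuclear force takes over, fm

# Coulomb barrier height for two protons (charge Z = 1 each)
barrier_eV = e2_over_4pi_eps0 / r   # ~1.44 MeV

# Set (3/2) * k_B * T equal to the barrier height and solve for T
T = 2 * barrier_eV / (3 * k_B)
print(f"Classical ignition temperature: {T:.2e} K")  # ~1.1e10 K, i.e. ~11 billion K
```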
The reason fusion actually works at much lower temperatures is quantum tunneling. Instead of flying over the barrier, particles have a small probability of passing straight through it, as if taking a shortcut under a wall. Even though this probability is tiny for any single collision, the sheer number of particles in a stellar core (or a fusion reactor) means enough successful collisions happen every second to sustain the reaction. Without this quantum effect, stars wouldn’t shine.
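A sketch of why tunneling rescues the situation: the probability of two protons tunneling through the barrier at energy E scales roughly as exp(−√(E_G/E)), where E_G is the Gamow energy, about 493 keV for a proton pair. Plugging in the typical thermal energy at the solar core’s 15 million K gives a tiny but nonzero chance per collision. This is an order-of-magnitude illustration, not a full reaction-rate calculation.

```python
import math

# Gamow tunneling probability for two protons, order-of-magnitude sketch.
# P(E) ~ exp(-sqrt(E_G / E)), with E_G the Gamow energy for p + p.

alpha = 1 / 137.036        # fine-structure constant
m_r_c2 = 938.272e3 / 2     # reduced mass of two protons, in keV
E_G = 2 * m_r_c2 * (math.pi * alpha) ** 2   # ~493 keV

k_B = 8.617e-8             # Boltzmann constant, keV per kelvin
T_core = 15e6              # solar core temperature, K
E_thermal = k_B * T_core   # ~1.3 keV typical thermal energy

P = math.exp(-math.sqrt(E_G / E_thermal))
print(f"Gamow energy: {E_G:.0f} keV")
print(f"Tunneling probability per collision: {P:.1e}")  # ~3e-9
```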
How Stars Ignite Fusion
A star begins as an enormous cloud of hydrogen gas collapsing under its own gravity. As the cloud shrinks, the core gets denser and hotter. Once the core temperature reaches roughly 15 million K and the density climbs to about 160 times that of water (roughly eight times denser than gold), hydrogen nuclei start fusing into helium through what’s called the proton-proton chain. This is the process powering the Sun right now.
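The net result of the proton-proton chain is four hydrogen nuclei becoming one helium nucleus, with the mass difference released as energy. A quick check with standard atomic masses, as a sketch, recovers the well-known figure of about 26.7 MeV per helium nucleus produced:

```python
# Energy released by the proton-proton chain, from the mass deficit.
# Net reaction: 4 H-1  ->  He-4 + 2 positrons + 2 neutrinos + energy.
# Using atomic masses (in unified atomic mass units, u) absorbs the
# electron/positron bookkeeping, so the simple difference gives the total yield.

m_H1 = 1.007825     # atomic mass of hydrogen-1, u
m_He4 = 4.002602    # atomic mass of helium-4, u
u_to_MeV = 931.494  # energy equivalent of 1 u, MeV

delta_m = 4 * m_H1 - m_He4   # ~0.0287 u, about 0.7% of the input mass
energy = delta_m * u_to_MeV
print(f"Energy per He-4 produced: {energy:.1f} MeV")  # ~26.7 MeV
```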
Not every collapsing gas cloud makes it. There’s a minimum mass threshold: an object needs at least about 0.08 solar masses (roughly 80 times the mass of Jupiter) for gravity to compress the core enough to reach fusion temperatures. Below that limit, electron degeneracy pressure, a quantum effect that resists squeezing electrons into the same states, halts the compression before temperatures climb high enough. These failed stars are called brown dwarfs.
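The two figures quoted above are easy to cross-check with standard mass values; a quick sketch:

```python
# Cross-check: 0.08 solar masses expressed in Jupiter masses.

M_sun = 1.989e30      # kg
M_jupiter = 1.898e27  # kg

threshold = 0.08 * M_sun / M_jupiter
print(f"Hydrogen-burning limit: {threshold:.0f} Jupiter masses")  # ~84
```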
Brown dwarfs aren’t completely fusion-free, though. Objects above about 13 Jupiter masses can fuse deuterium, a heavier form of hydrogen that ignites at lower temperatures because its reaction skips the slow, weak-force-mediated first step that bottlenecks ordinary proton-proton fusion. This deuterium burning is short-lived since deuterium is scarce, but it’s the reason 13 Jupiter masses is often used as the dividing line between giant planets and brown dwarfs.
Fusion Beyond Hydrogen
Hydrogen fusion sustains a star for most of its life, but heavier elements require higher temperatures to fuse because their nuclei carry more positive charge and repel each other more strongly. When a massive star exhausts the hydrogen in its core, the core contracts and heats further.
At about 100 million K, helium nuclei begin fusing into carbon through the triple-alpha process, in which three helium nuclei combine in quick succession. In a star of the Sun’s mass, this ignition happens in a dramatic event called the helium flash at the tip of the red-giant branch. Stars much more massive than the Sun continue this pattern, fusing progressively heavier elements (carbon, oxygen, neon, silicon) at ever higher temperatures, each stage shorter than the last, until they build an iron core and can extract no more energy from fusion.
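The same mass-deficit arithmetic applies to helium burning. Three helium-4 nuclei weigh slightly more than one carbon-12 nucleus (which defines the atomic mass unit exactly), and the difference is the energy released; as a sketch:

```python
# Energy released by the triple-alpha process: 3 He-4 -> C-12 + energy.

m_He4 = 4.002602    # atomic mass of helium-4, u
m_C12 = 12.000000   # atomic mass of carbon-12, u (exact by definition)
u_to_MeV = 931.494  # energy equivalent of 1 u, MeV

energy = (3 * m_He4 - m_C12) * u_to_MeV
print(f"Energy per C-12 produced: {energy:.2f} MeV")  # ~7.27 MeV
```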
Fusion Ignition on Earth
Recreating fusion in a laboratory is harder than it sounds. Stars have immense gravity holding their fuel together, giving particles countless chances to collide. On Earth, engineers have to compensate for the lack of gravitational pressure by pushing temperatures far higher, into the range of 100 to 200 million degrees, roughly ten times hotter than the Sun’s core. At the same time, the fuel plasma is surprisingly thin, about a million times less dense than air. The challenge is keeping it hot and confined long enough for enough fusion reactions to occur.
This balancing act is captured by the Lawson criterion, a condition worked out by physicist John Lawson in the 1950s. It ties together three variables: temperature, fuel density, and confinement time, and all three have to be high enough simultaneously. Raise the temperature but lose confinement too quickly, and the plasma fizzles. Pack the fuel denser but not hot enough, and collisions won’t overcome the barrier. Fusion scientists use the “triple product” of these three values as the key benchmark for progress.
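As a sketch of how the triple product works, the code below multiplies density, temperature, and confinement time and compares the result to the commonly quoted deuterium-tritium ignition threshold of roughly 3 × 10²¹ keV·s/m³. The plasma parameters are illustrative values in the neighborhood of what magnetic-confinement machines like ITER aim for, not official design figures.

```python
# Lawson triple product sketch: n * T * tau_E versus the D-T ignition threshold.
# The plasma values below are illustrative, not official ITER design numbers.

n = 1.0e20      # plasma density, particles per m^3
T = 13.0        # plasma temperature, keV (~150 million kelvin)
tau_E = 3.0     # energy confinement time, seconds

triple_product = n * T * tau_E
threshold = 3.0e21   # approximate D-T ignition threshold, keV*s/m^3

print(f"Triple product: {triple_product:.1e} keV*s/m^3")
print(f"Fraction of ignition threshold: {triple_product / threshold:.0%}")
```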
Two main approaches exist. Magnetic confinement, the strategy behind the ITER reactor under construction in France, uses powerful magnetic fields to hold a ring of plasma at around 150 million degrees Celsius. ITER is designed to produce a self-sustaining, burning plasma where the heat from fusion reactions keeps the fuel hot without continuous external heating.
Inertial confinement takes the opposite approach: compress a tiny pellet of fuel so fast and so hard that fusion happens in a burst lasting billionths of a second. In December 2022, the National Ignition Facility in California crossed a historic threshold. By delivering 2.05 megajoules of laser energy onto a fuel capsule, the experiment produced 3.15 megajoules of fusion energy, the first time a controlled fusion reaction on Earth generated more energy than was put into the fuel. That milestone confirmed the basic science behind inertial fusion energy, though turning it into a practical power source remains a separate engineering challenge.
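That result translates into a “target gain” just above one, which is what made it a milestone. Note that the comparison is laser energy delivered to the capsule versus fusion yield, not the far larger amount of electricity the facility’s laser system drew from the grid; a quick sketch of the gain figure:

```python
# Target gain from the December 2022 NIF shot.
# Gain compares fusion yield to laser energy delivered to the capsule,
# not to the (much larger) electricity consumed by the laser system.

laser_energy_MJ = 2.05
fusion_yield_MJ = 3.15

gain = fusion_yield_MJ / laser_energy_MJ
print(f"Target gain: {gain:.2f}")  # ~1.54
```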
The Short Answer, by Context
- In a forming star: Hydrogen fusion ignites once the core reaches about 15 million K, which requires at least 0.08 solar masses of material.
- In a substellar object: Deuterium fusion can begin at lower temperatures in objects above roughly 13 Jupiter masses.
- In an aging star: Helium fusion kicks in at about 100 million K after hydrogen is exhausted in the core.
- In a fusion reactor: Plasma must reach 100 to 200 million degrees while maintaining sufficient density and confinement time, as defined by the Lawson criterion.

