Quantum annealing is a method of computation that uses quantum physics to find the best solution among a huge number of possibilities. It works by exploiting a quantum effect called tunneling, which lets the system pass directly through energy barriers instead of climbing over them the way classical algorithms must. This makes it particularly suited to optimization problems, where the goal is to find the lowest-cost, shortest, or most efficient arrangement among millions or billions of options.
How Quantum Annealing Works
Imagine a vast, hilly landscape where every valley represents a possible solution and the deepest valley is the best one. A classical computer solves this by hopping from valley to valley, climbing up over ridges and descending again, checking whether the new valley is deeper than the last. This is essentially what traditional simulated annealing does: it uses the computational equivalent of heat to bounce around the landscape, gradually cooling down until it settles into a valley it hopes is the deepest.
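To make the classical baseline concrete, here is a minimal simulated-annealing sketch in plain Python. The landscape, cooling schedule, and step size are illustrative choices, not taken from any particular annealer:

```python
import math
import random

def simulated_annealing(energy, start, neighbor, steps=20000, t_start=5.0, t_end=0.01):
    """Classical simulated annealing: thermal 'bouncing' that cools over time."""
    random.seed(0)  # fixed seed so the illustration is repeatable
    x, e = start, energy(start)
    best_x, best_e = x, e
    for i in range(steps):
        # Geometric cooling schedule from t_start down to t_end.
        t = t_start * (t_end / t_start) ** (i / steps)
        y = neighbor(x)
        e_y = energy(y)
        # Always accept downhill moves; accept uphill moves with
        # probability exp(-dE / T), which shrinks as the system cools.
        if e_y <= e or random.random() < math.exp((e - e_y) / t):
            x, e = y, e_y
            if e < best_e:
                best_x, best_e = x, e
    return best_x, best_e

# Toy landscape: many local valleys, global minimum at x = 0.
energy = lambda x: x * x + 10 * (1 - math.cos(2 * math.pi * x))
neighbor = lambda x: x + random.uniform(-0.5, 0.5)
x, e = simulated_annealing(energy, start=4.0, neighbor=neighbor)
print(x, e)
```

The uphill-acceptance rule is the "climbing over ridges" from the analogy: early on, when the temperature is high, the walker hops out of shallow valleys easily; as it cools, it settles into whichever valley it currently occupies.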
Quantum annealing takes a different path. Instead of climbing over ridges, it tunnels through them. The system starts in a state of maximum quantum uncertainty, where all possible solutions are explored simultaneously. Then a control field is slowly dialed down, guiding the system toward lower and lower energy states. If this process happens slowly enough, the system naturally settles into the global minimum: the best solution.
The physics behind this relies on a principle called the adiabatic theorem. It states that if a quantum system starts in its lowest energy state and is changed gradually enough, relative to the energy gap separating that state from the next one, it stays in the lowest energy state the entire time. Quantum annealers begin with a simple system whose lowest energy state is easy to prepare, then slowly transform it into a complex system whose lowest energy state encodes the answer to your problem. The quantum tunneling effect is what gives the process its potential edge, allowing the system to slip through narrow barriers in the energy landscape that would trap a classical optimizer for a long time.
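In symbols, a transverse-field annealer interpolates between a driver Hamiltonian whose ground state is easy to prepare and a problem Hamiltonian whose ground state encodes the answer. This is the generic textbook form; the schedule functions A(s) and B(s) are hardware-specific:

```latex
H(s) = A(s)\, H_{\mathrm{driver}} + B(s)\, H_{\mathrm{problem}},
\qquad s = t/t_a \in [0, 1],
\qquad
H_{\mathrm{driver}} = -\sum_i \sigma_x^{(i)},
\qquad
H_{\mathrm{problem}} = \sum_i h_i\, \sigma_z^{(i)}
  + \sum_{i<j} J_{ij}\, \sigma_z^{(i)} \sigma_z^{(j)}.
```

A(s) decreases from a large value to zero while B(s) grows from zero, so the system begins in the uniform superposition over all bitstrings (the driver's ground state) and ends governed entirely by the problem Hamiltonian. The "different types of quantum interactions" mentioned below amount to adding extra terms to the driver.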
One notable advantage over classical approaches is flexibility in how quantum fluctuations are applied. Researchers have shown that adding different types of quantum interactions, not just the standard transverse field, can significantly accelerate convergence for certain problem types. This tunability has no direct analog in classical simulated annealing.
Quantum Annealing vs. Gate-Based Quantum Computing
Quantum annealing is not the same thing as the gate-based quantum computing you hear about from companies like IBM or Google. Gate-based systems manipulate qubits through sequences of logic gates, much like classical computers use AND, OR, and NOT gates. This approach is universal, meaning it can run any quantum algorithm, but it requires extraordinary precision. Most useful quantum algorithms need error correction that demands thousands of physical qubits for every single reliable “logical” qubit.
Quantum annealers sacrifice that universality for scale. They can only solve a specific class of problems (optimization and sampling), but they can do it with far more qubits because the hardware requirements are less demanding. D-Wave’s latest Advantage2 system has 4,400 qubits. Gate-based machines, by contrast, currently top out at a few thousand qubits and can use only a fraction of those for actual computation after accounting for error correction overhead. The tradeoff is clear: annealers handle one type of problem with many qubits, while gate-based machines handle any problem but need vastly more physical qubits to do it reliably.
What Problems It Solves
Quantum annealing is built for optimization: situations where you need the best configuration out of a combinatorial explosion of possibilities. To use an annealer, you first translate your problem into a mathematical format called QUBO (quadratic unconstrained binary optimization), which represents your problem as a set of binary variables and the relationships between them. D-Wave provides an open-source toolkit called Ocean SDK that helps with this translation, along with libraries for mapping, preprocessing, and submitting problems to the hardware.
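As a small worked example of the QUBO format, the constraint "exactly one of three binary variables is 1" can be encoded as the penalty (x0 + x1 + x2 - 1)², which expands into the linear and pairwise coefficients below. This dependency-free sketch enumerates the search space by brute force just to show what an annealer is minimizing; in practice the Ocean SDK builds such models and submits them to hardware:

```python
from itertools import product

# QUBO for the constraint "exactly one of x0, x1, x2 equals 1",
# encoded as the penalty (x0 + x1 + x2 - 1)^2.  Expanding, and using
# x_i^2 = x_i for binary variables, gives linear terms of -1 and
# pairwise terms of +2 (the constant +1 is dropped as an offset).
Q = {
    (0, 0): -1, (1, 1): -1, (2, 2): -1,   # linear (diagonal) terms
    (0, 1): 2, (0, 2): 2, (1, 2): 2,      # quadratic (coupling) terms
}

def qubo_energy(x, Q):
    """Energy of bitstring x under QUBO coefficients Q."""
    return sum(coeff * x[i] * x[j] for (i, j), coeff in Q.items())

# Brute-force over all 2^3 assignments; an annealer searches this
# space physically instead of enumerating it.
best = min(product((0, 1), repeat=3), key=lambda x: qubo_energy(x, Q))
print(best, qubo_energy(best, Q))  # a one-hot bitstring, energy -1
```

All three one-hot bitstrings reach the minimum energy of -1 (0 after adding back the offset), while every constraint-violating assignment costs more; this penalty trick is how "unconstrained" QUBO absorbs constraints.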
Real-world applications are already being tested. Aisin Corporation, a major Japanese automotive parts manufacturer, used quantum annealing to tackle truck routing in its supply chain. The company ships tens of thousands of boxes daily between numerous facilities, with driving times ranging from minutes to hours. Researchers framed the problem as a single-vehicle routing challenge with pickup, dropoff, and schedule constraints, then ran it on both quantum and classical annealing hardware. Logistics, scheduling, portfolio optimization in finance, and molecular simulation are all active areas of exploration.
Quantum annealers have also been used to simulate physical systems directly. Researchers have programmed superconducting annealing processors to model exotic magnetic materials, including frustrated magnets where competing interactions prevent the system from settling into a simple ordered state. These simulations can reveal phenomena like dynamical phase transitions and topological configurations that are difficult to study any other way.
The Speedup Question
Whether quantum annealing is actually faster than classical methods remains an open and contested question. Early simulation studies suggested that quantum annealing found low-energy states faster than classical simulated annealing for two-dimensional spin glass problems, a standard benchmark. However, a closer look published in Science revealed that this apparent advantage was partly an artifact of how the simulations were set up. When researchers corrected for time-discretization effects and measured average energy rather than cherry-picking the best result across simulation slices, the advantage disappeared.
Tests on physical D-Wave hardware have similarly failed to demonstrate a clear, definitive speedup over the best classical algorithms. This does not mean quantum annealing is useless. It means that for the problem types and sizes tested so far, classical heuristics remain competitive. The hope is that as qubit counts grow, connectivity improves, and coherence times lengthen, a scaling advantage will emerge for certain problem classes. But that has not been conclusively demonstrated yet.
Hardware Challenges
The biggest practical obstacles are thermal noise and decoherence. Quantum annealers operate at temperatures near absolute zero, but even tiny amounts of residual heat can knock the system out of its ground state. Thermal excitations populate higher-energy states, reducing the probability of landing on the correct answer. If the number of nearby wrong answers grows exponentially, thermal noise can make the computation unreliable.
There is also a timing problem. Even when the environment is only weakly coupled to the system and the correct answer has a reasonable equilibrium probability, the time needed for the system to actually reach that probability through thermal relaxation can be orders of magnitude longer than the ideal evolution time. In other words, the annealer might theoretically be able to find the answer, but waiting for it to do so could erase any practical speed advantage.
Counterintuitively, some thermal effects can actually help. When the system passes through a critical point too quickly, it can swap the probabilities of the ground state and the first excited state, leaving the correct answer with near-zero probability. A small amount of thermal interaction can excite the system before this critical point and relax it afterward, effectively rescuing the computation. This phenomenon, called thermally assisted quantum annealing, has been experimentally confirmed on a 16-qubit problem.
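The probability swap when sweeping too fast through a critical point is the Landau-Zener mechanism. In one standard convention, for a two-level avoided crossing with minimum gap Δ whose diabatic energy levels separate at rate v, the probability of jumping out of the ground state during the sweep is approximately:

```latex
P_{\text{excite}} \approx \exp\!\left( -\frac{\pi \Delta^2}{2 \hbar v} \right),
```

so a small gap or a fast sweep makes the unwanted transition likely, which is why rushing through the critical point can leave the correct answer with near-zero probability.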
Where the Hardware Is Headed
D-Wave, the only company currently selling commercial quantum annealers, has an ambitious roadmap. The current Advantage2 system with 4,400 qubits is set to receive a performance update in 2026 that will support new protocols including cyclic annealing, giving users finer control over the annealing process. The company is also developing a gate-based system using dual-rail qubit technology with built-in error detection, which it claims will require an order of magnitude fewer physical qubits per logical qubit than competing approaches. That system is expected to become available in 2026.
Longer term, D-Wave envisions scaling beyond 100,000 qubits through higher connectivity between qubits, advances in cryogenic control electronics, multi-chip configurations, and larger cooling enclosures. If realized, systems at that scale could tackle optimization problems far beyond the reach of current hardware, potentially crossing the threshold where a practical quantum advantage becomes undeniable.

