What Will Quantum Computers Be Able to Do: Real Capabilities

Quantum computers will eventually solve specific categories of problems that today’s most powerful supercomputers cannot touch, from breaking modern encryption to simulating the exact behavior of molecules for drug design and clean energy. Most of these capabilities are still years away, but the path is increasingly clear, and some early advantages are already showing up in laboratory settings.

The key distinction: quantum computers won’t replace your laptop or make Netflix load faster. They excel at problems involving massive combinatorial possibilities, where the answer hides in an enormous landscape of options that classical machines must check one by one. Quantum processors encode that entire landscape at once through superposition and entanglement, then use interference to make the right answers far more likely to show up when the system is measured.
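
For a rough sense of that scale, here is a minimal Python sketch, no quantum hardware or libraries involved, of why the classical bookkeeping alone becomes impossible: describing the state of n qubits classically takes 2^n complex amplitudes.

```python
# Rough scale of the problem: writing down the state of n qubits classically
# takes 2**n complex amplitudes (16 bytes each as double-precision numbers).
for n in (10, 30, 50, 300):
    amplitudes = 2 ** n
    print(f"{n:>3} qubits -> {amplitudes:.3e} amplitudes, "
          f"~{amplitudes * 16 / 1e9:.3e} GB to store")
```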

Breaking Today’s Encryption

The most discussed (and feared) quantum capability is cracking the encryption that protects banking, government communications, and virtually all internet traffic. A mathematical technique called Shor’s algorithm can factor enormous numbers efficiently, something the best known classical algorithms cannot do in any practical amount of time. Since modern encryption relies on the assumption that factoring those numbers is practically impossible, a sufficiently powerful quantum computer would render current security obsolete.

For the widely used RSA-2048 standard, early estimates suggested you’d need around 20 million noisy qubits and about 8 hours of computation. More recent analysis has brought that down to fewer than one million qubits and roughly 5 days, under realistic assumptions about error rates. Today’s largest quantum processors have around 1,000 to 1,200 qubits, so this threat isn’t imminent, but it’s close enough that governments and tech companies are already rolling out quantum-resistant encryption standards.
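
To make the factoring connection concrete, the sketch below walks through the classical number theory Shor’s algorithm relies on, in plain Python. Factoring N reduces to finding the period of a^x mod N; the quantum computer’s only job is to find that period exponentially faster than the brute-force loop used here, which only works for toy numbers.

```python
from math import gcd

def shor_classical_sketch(N: int, a: int):
    """Classical illustration of the number theory behind Shor's algorithm.

    A real quantum computer replaces only the brute-force period-finding loop
    below; once the period r is known, the factors fall out of ordinary arithmetic.
    """
    assert gcd(a, N) == 1, "pick a coprime to N"
    # Brute-force period finding: the exponentially hard step on classical hardware.
    r = 1
    while pow(a, r, N) != 1:
        r += 1
    if r % 2 == 1:
        return None            # odd period: retry with a different a
    x = pow(a, r // 2, N)
    if x == N - 1:
        return None            # trivial square root: retry with a different a
    return gcd(x - 1, N), gcd(x + 1, N)

# Toy examples small enough to brute-force.
print(shor_classical_sketch(15, 7))    # -> (3, 5)
print(shor_classical_sketch(3233, 3))  # 3233 = 53 * 61, a toy "RSA" modulus
```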

Designing New Drugs at the Molecular Level

Simulating how molecules behave is one of the hardest problems in science. When a pharmaceutical company wants to understand how a drug candidate interacts with a protein, classical computers can only approximate the quantum mechanical behavior of the electrons involved. The bigger the molecule, the worse the approximation gets. Quantum computers, being quantum mechanical systems themselves, can simulate these interactions natively.
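
In concrete terms, “simulating a molecule” usually means finding the lowest eigenvalue of its electronic Hamiltonian, the ground-state energy. The sketch below does this by brute-force diagonalization for a toy two-qubit Hamiltonian with made-up placeholder coefficients (real values come from a quantum chemistry calculation for a specific molecule and geometry); for real molecules the matrix dimension grows exponentially, which is exactly the wall quantum hardware is meant to climb.

```python
import numpy as np

# Pauli matrices used to build the Hamiltonian as a sum of Pauli terms.
I = np.eye(2, dtype=complex)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)

def kron(*ops):
    out = np.array([[1.0 + 0j]])
    for op in ops:
        out = np.kron(out, op)
    return out

# Toy two-qubit Hamiltonian: H = c0*II + c1*(ZI + IZ) + c2*ZZ + c3*XX.
# The coefficients are illustrative placeholders, not a real molecule.
c0, c1, c2, c3 = -1.05, 0.39, -0.01, 0.18
H = (c0 * kron(I, I)
     + c1 * (kron(Z, I) + kron(I, Z))
     + c2 * kron(Z, Z)
     + c3 * kron(X, X))

# Exact diagonalization works here only because the matrix is 4x4. The dimension
# doubles with every additional qubit (orbital), which is where classical methods give out.
ground_energy = np.linalg.eigvalsh(H).min()
print(f"ground-state energy of the toy Hamiltonian: {ground_energy:.4f}")
```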

Researchers have already built hybrid workflows that combine classical computing with quantum machine learning for drug design. One notable example used this approach to study the SARS-CoV-2 protease and its mutants, combining quantum techniques with traditional molecular docking to evaluate drug candidates. The real payoff comes when quantum hardware scales up enough to simulate entire protein binding sites with chemical accuracy, something no classical method can achieve for large, biologically relevant molecules.

Protein folding is another target. Researchers have used quantum annealing to find low-energy shapes of simplified protein models, essentially searching for how a protein naturally wants to fold. Classical protein folding tools like AlphaFold predict structure from patterns in data. Quantum approaches could eventually predict folding from the actual physics, catching cases where data-driven methods fall short.
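
The sketch below shows the kind of search involved, using a deliberately tiny stand-in: a made-up six-residue chain folded on a 2D grid under the classic HP lattice model, with every fold checked by brute force. A quantum annealer would attack the same energy landscape without enumerating it.

```python
from itertools import product

# Toy HP lattice model: energy is -1 for every pair of hydrophobic (H) residues
# that touch on the grid without being neighbors in the chain. The sequence is
# a made-up example, not a real protein.
SEQUENCE = "HPHPPH"
MOVES = {"U": (0, 1), "D": (0, -1), "L": (-1, 0), "R": (1, 0)}

def energy(coords):
    e = 0
    for i in range(len(coords)):
        for j in range(i + 2, len(coords)):           # skip chain neighbors
            if SEQUENCE[i] == "H" and SEQUENCE[j] == "H":
                dist = abs(coords[i][0] - coords[j][0]) + abs(coords[i][1] - coords[j][1])
                if dist == 1:                          # adjacent on the lattice
                    e -= 1
    return e

best_e, best_fold = float("inf"), None
for walk in product(MOVES, repeat=len(SEQUENCE) - 1):  # brute-force every fold
    coords = [(0, 0)]
    for step in walk:
        x, y = coords[-1]
        dx, dy = MOVES[step]
        nxt = (x + dx, y + dy)
        if nxt in coords:                              # chain can't cross itself
            break
        coords.append(nxt)
    else:                                              # only score complete folds
        e = energy(coords)
        if e < best_e:
            best_e, best_fold = e, coords

print("lowest energy:", best_e, "fold:", best_fold)
```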

Reinventing Industrial Chemistry

The Haber-Bosch process, which produces ammonia for fertilizer, consumes roughly 1 to 2 percent of global energy. It requires extreme heat and pressure. Nature, meanwhile, manages the same chemistry at room temperature using an enzyme called nitrogenase. The active site of this enzyme contains a cluster of iron and molybdenum atoms whose exact catalytic mechanism remains an open question, because simulating it accurately exceeds what classical computers can do.

A landmark study published in the Proceedings of the National Academy of Sciences demonstrated how a quantum computer could be used to calculate the reaction energies and energy barriers within this catalyst, providing reliable estimates “beyond the reach of traditional methods.” If researchers can fully understand how nitrogenase works, they could design synthetic catalysts that fix nitrogen without burning fossil fuels. The same simulation capability applies to designing catalysts for carbon capture, hydrogen production, and other reactions central to the energy transition.

Better Batteries Through Molecular Simulation

Lithium-ion batteries degrade partly because of chemical reactions at the boundary between the electrode and the electrolyte, forming a layer called the solid electrolyte interphase. Understanding and controlling this layer is critical for improving battery life and energy density. Classical quantum chemistry methods can model small electrolyte molecules, but the reactive chemistry at this interface involves complex, correlated electron behavior that pushes those methods to their limits.

Quantum computers could simulate these electrochemical processes with far greater accuracy, helping researchers identify electrolyte formulations that form more stable interfaces. The same principle extends to next-generation battery chemistries like lithium-sulfur and solid-state designs, where the molecular interactions are even more complex and less well understood.

Faster Financial Risk Analysis

Banks and hedge funds rely heavily on Monte Carlo simulations, which estimate financial risk by running millions of randomized scenarios. Pricing complex derivatives, calculating portfolio risk, and stress-testing investments all depend on this approach. Classical Monte Carlo methods are slow because accuracy improves only gradually as you add more simulations.

Quantum algorithms offer a fundamental mathematical speedup for these calculations. Classical Monte Carlo error shrinks only with the square root of the number of scenarios, so reaching a target accuracy ε takes on the order of 1/ε² simulations; quantum amplitude estimation reaches the same accuracy with a cost that scales roughly as 1/ε, a quadratic speedup. For nested financial problems like pricing a call option on a call option, or computing risk measures like Value-at-Risk, newer quantum algorithms show even more dramatic improvements. One recent result demonstrated that a quantum method could estimate nested option prices with a cost that scales roughly as 1/ε, compared to 1/ε⁴ for classical nested estimators. In practical terms, that’s the difference between a calculation taking minutes versus days.
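
The sketch below shows the classical side of that comparison: a plain Monte Carlo price for a European call option under made-up market parameters, with the standard error reported alongside. Watching the error shrink only as 1/√N is the bottleneck amplitude estimation is designed to remove.

```python
import numpy as np

# Classical Monte Carlo pricing of a European call under Black-Scholes dynamics.
# All market parameters are illustrative, not real data.
rng = np.random.default_rng(0)
S0, K, r, sigma, T = 100.0, 105.0, 0.01, 0.2, 1.0   # spot, strike, rate, vol, maturity

def mc_call_price(n_paths):
    z = rng.standard_normal(n_paths)
    ST = S0 * np.exp((r - 0.5 * sigma**2) * T + sigma * np.sqrt(T) * z)
    payoff = np.exp(-r * T) * np.maximum(ST - K, 0.0)
    return payoff.mean(), payoff.std(ddof=1) / np.sqrt(n_paths)

for n in (10_000, 100_000, 1_000_000):
    price, stderr = mc_call_price(n)
    print(f"N={n:>9,d}  price ~ {price:.4f}  std. error ~ {stderr:.4f}")
# 100x more scenarios buys only ~10x less error: the 1/sqrt(N) wall that a
# quantum amplitude-estimation routine is meant to break through.
```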

Sharper Machine Learning

Quantum computing intersects with machine learning in two ways: speeding up the learning process itself, and encoding data in richer ways that classical algorithms can’t replicate. The more promising near-term direction is the second one, using quantum systems to map data into high-dimensional spaces where patterns become easier to separate.

A recent experiment on a photonic quantum processor demonstrated this concretely. Researchers ran a classification task using quantum-enhanced kernel methods, a technique that measures similarity between data points in a transformed space. The photonic kernels outperformed leading classical approaches, including Gaussian kernels and neural tangent kernels (which capture the behavior of infinitely wide neural networks). The performance boost came specifically from quantum interference, a phenomenon with no classical equivalent. Applications being explored include medical image classification, satellite image analysis, natural language processing, and information retrieval.
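
As a toy illustration of the plumbing, not of the photonic experiment itself, the sketch below encodes a one-dimensional feature into a single-qubit state, uses the state overlap as the kernel, and hands the resulting similarity matrix to an ordinary support vector machine. A single qubit is trivially simulable on a laptop; the experimental claim is that larger photonic circuits produce kernels that are not.

```python
import numpy as np
from sklearn.svm import SVC

# Toy "quantum kernel": each 1-D feature x is encoded as the single-qubit state
# |phi(x)> = cos(x/2)|0> + sin(x/2)|1>, and the kernel is the squared overlap
# k(x, x') = |<phi(x)|phi(x')>|^2 = cos((x - x')/2)**2.
def quantum_kernel(a, b):
    return np.cos((a[:, None] - b[None, :]) / 2.0) ** 2

rng = np.random.default_rng(1)
x_train = rng.uniform(0, 2 * np.pi, 200)
y_train = (np.sin(x_train) > 0).astype(int)        # synthetic labels
x_test = rng.uniform(0, 2 * np.pi, 50)
y_test = (np.sin(x_test) > 0).astype(int)

# scikit-learn accepts a precomputed similarity matrix, so the "quantum" part is
# only how the kernel entries are produced (on hardware, by estimating overlaps).
clf = SVC(kernel="precomputed").fit(quantum_kernel(x_train, x_train), y_train)
accuracy = clf.score(quantum_kernel(x_test, x_train), y_test)
print(f"test accuracy with the toy quantum kernel: {accuracy:.2f}")
```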

Where the Technology Stands Now

Current quantum computers are powerful enough to demonstrate advantages on narrow, carefully chosen problems. China’s Jiuzhang photonic processor performed a sampling task 100 trillion times faster than what was then the world’s top supercomputer. But sampling tasks aren’t practically useful on their own. They serve as proof that quantum hardware can do things classical hardware fundamentally cannot.

The gap between these demonstrations and the real-world applications described above is primarily about error correction. Today’s qubits are noisy, meaning they lose their quantum state quickly and introduce errors into calculations. Useful quantum computing requires either dramatically better physical qubits or large numbers of noisy qubits working together to form reliable “logical” qubits. IBM’s 2025 roadmap focuses on demonstrating error correction codes and modular processor designs, with a 120-qubit processor aimed at enabling more complex calculations. The push toward fault-tolerant quantum computing, where errors are corrected fast enough to run long algorithms reliably, is the central engineering challenge of the next five to ten years.
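
A classical analogy gives a feel for the numbers, with the caveat that real quantum codes are far subtler because they must detect errors without reading out the encoded state. The sketch below protects one bit with n noisy copies and a majority vote, using an illustrative 5 percent physical error rate.

```python
import random

# Classical analogy for logical qubits: store one bit in n noisy copies and take
# a majority vote. Quantum error correction (e.g. surface codes) is more involved,
# but the payoff is similar: logical error rates fall fast as redundancy grows.
def logical_error_rate(physical_error, n_copies, trials=100_000):
    failures = 0
    for _ in range(trials):
        flips = sum(random.random() < physical_error for _ in range(n_copies))
        if flips > n_copies // 2:     # majority of copies corrupted -> vote is wrong
            failures += 1
    return failures / trials

for n in (1, 3, 5, 7):
    print(f"{n} copies per logical bit -> logical error ~ "
          f"{logical_error_rate(0.05, n):.2e}")
# With an illustrative 5% physical error, a handful of copies already pushes the
# logical error rate orders of magnitude lower; quantum codes spend many more
# physical qubits per logical qubit to get the same kind of suppression.
```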

The applications closest to reality are molecular simulation and optimization problems, where even imperfect quantum hardware combined with classical computing can begin to offer advantages. Fully breaking RSA encryption and training quantum-enhanced AI models at scale sit further out, likely requiring machines with hundreds of thousands to millions of high-quality qubits. The timeline is uncertain, but the trajectory across every major hardware platform points toward fault-tolerant systems emerging in the late 2020s to early 2030s.