What Is Shor’s Algorithm? Quantum Factoring Explained

Shor’s algorithm is a quantum computing procedure that can factor large numbers exponentially faster than any known method running on a traditional computer. Developed by mathematician Peter Shor in 1994, it matters because the security of most modern encryption depends on the assumption that factoring huge numbers is practically impossible. A sufficiently powerful quantum computer running Shor’s algorithm could break that assumption.

Why Factoring Large Numbers Matters

Most internet encryption, including the RSA system that protects banking, email, and secure websites, works by multiplying two very large prime numbers together. The result is a number that’s easy to compute in one direction (multiplying) but extraordinarily hard to reverse (figuring out which two primes produced it). A classical computer trying to factor a 2,048-bit number, the current standard for RSA keys, would need longer than the age of the universe using the best known algorithms.
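This asymmetry is visible even at toy scale. The sketch below is illustrative only: the primes are vastly smaller than real RSA primes, and trial division stands in for the far more sophisticated algorithms attackers actually use. Multiplying takes one step; undoing it requires testing divisors up to the square root of the product.

```python
# Easy direction: multiplying two primes is a single operation.
p, q = 1_000_003, 1_000_033   # toy primes; real RSA primes have 300+ digits
n = p * q

# Hard direction: recover p and q knowing only n. Trial division must
# test divisors up to sqrt(n), so its cost explodes as n grows.
def factor_by_trial_division(n):
    d = 2
    while d * d <= n:
        if n % d == 0:
            return d, n // d
        d += 1
    return None

print(factor_by_trial_division(n))  # -> (1000003, 1000033)
```

Even at this tiny size, the hard direction takes about a million loop iterations against one multiplication; at real key sizes, the gap becomes astronomical.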

This one-way difficulty is the entire foundation of public-key cryptography. Your browser uses it every time you see a padlock icon. Shor’s algorithm threatens this foundation because it turns an impossibly slow problem into a tractable one, at least in theory.

How Classical Computers Factor Numbers

To appreciate what Shor’s algorithm does differently, it helps to understand the classical approach. The most efficient known classical method, called the general number field sieve, can factor large numbers but slows down dramatically as they get bigger. Its running time grows “sub-exponentially”: faster than brute force, but still far faster than any polynomial, so adding digits makes the computation vastly harder in a way that quickly becomes unmanageable. For a 300-digit number, even the world’s fastest supercomputers would struggle for years. For a 600-digit number, the task becomes essentially impossible within any practical timeframe.

What Shor’s Algorithm Does Differently

Shor’s algorithm exploits quantum mechanics to factor numbers in “polynomial time,” which means the difficulty grows much more gently as numbers get larger. Where a classical computer’s workload balloons with each additional digit, a quantum computer running Shor’s algorithm faces only a modest, manageable increase.

The algorithm works in two main phases. The first phase runs on a classical computer and reframes the factoring problem as a problem of finding a repeating pattern in a sequence of numbers. Specifically, it picks a random number a and looks at the remainders when a, a², a³, and so on are divided by the number being factored. That sequence of remainders eventually settles into a repeating cycle; the length of the cycle is called the “period,” and once you know it, simple arithmetic (a few greatest-common-divisor calculations) reveals the original prime factors.
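The reduction from factoring to period finding can be sketched in a few lines. In this toy version, a brute-force loop stands in for the quantum period-finding step, so it only works for small numbers; everything else mirrors the classical phases of Shor’s algorithm.

```python
from math import gcd

def find_period(a, n):
    """Smallest r > 0 with a**r mod n == 1. Brute force here;
    this is exactly the step a quantum computer accelerates."""
    r, value = 1, a % n
    while value != 1:
        value = (value * a) % n
        r += 1
    return r

def factors_from_period(n, a):
    """Shor's classical pre- and post-processing for a chosen base a."""
    g = gcd(a, n)
    if g != 1:
        return g, n // g          # lucky: a already shares a factor with n
    r = find_period(a, n)
    half = pow(a, r // 2, n)
    if r % 2 != 0 or half == n - 1:
        return None               # unlucky base: retry with a different a
    return gcd(half - 1, n), gcd(half + 1, n)

print(factors_from_period(15, 7))  # period of 7 mod 15 is 4 -> (3, 5)
```

For n = 15 and a = 7, the remainders cycle 7, 4, 13, 1, so the period is 4, and the greatest-common-divisor step recovers the factors 3 and 5.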

The second phase is where the quantum computer comes in. Finding that period classically would take an impractical amount of time for large numbers. But quantum computers can search for periodic patterns with extraordinary efficiency using a technique called the quantum Fourier transform. This is the core quantum step, and it’s what gives the algorithm its speed advantage.

The Quantum Fourier Transform

The quantum Fourier transform is a quantum version of a widely used mathematical operation that identifies repeating patterns in data. Classical computers use Fourier transforms constantly, from compressing audio files to analyzing signals. The quantum version does something similar but operates on quantum bits (qubits) that exist in overlapping states simultaneously.

Because qubits can represent many values at once through a property called superposition, the quantum Fourier transform can analyze an enormous number of possibilities in parallel. When the qubits are measured at the end, the physics of quantum interference makes the correct period far more likely to appear than incorrect answers. It’s not that the quantum computer tries every possibility one by one. Instead, it sets up a situation where wrong answers cancel each other out and the right answer reinforces itself, similar to how overlapping waves can amplify or cancel depending on their alignment.
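This peak-picking behavior can be mimicked classically, though slowly, with an ordinary discrete Fourier transform. The toy sketch below analyzes the sequence of powers of 7 modulo 15: the spectrum peaks at evenly spaced frequencies, and the spacing between peaks reveals the period. The quantum Fourier transform exposes the same structure with exponentially fewer operations.

```python
import cmath

# Sample the periodic sequence f(x) = 7**x mod 15 at M points.
M = 16                                # a power of two, as in the algorithm
seq = [pow(7, x, 15) for x in range(M)]

def dft_magnitudes(samples):
    """Magnitude spectrum of a classical discrete Fourier transform."""
    m = len(samples)
    return [abs(sum(s * cmath.exp(-2j * cmath.pi * k * x / m)
                    for x, s in enumerate(samples)))
            for k in range(m)]

mags = dft_magnitudes(seq)
peaks = sorted(sorted(range(M), key=lambda k: -mags[k])[:4])
print(peaks)  # -> [0, 4, 8, 12]: spacing M/r = 4, so the period r is 4
```

The classical transform here costs work proportional to M², and M itself grows exponentially with the size of the number being factored; the quantum version sidesteps that by processing all M values in superposition.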

This step is efficient enough that the entire algorithm runs in time roughly proportional to the cube of the number of digits in the number being factored. For context, doubling the number of digits only increases the work by about eight times, rather than the astronomical increases a classical computer would face.
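That cubic scaling is easy to check with back-of-the-envelope arithmetic (ignoring constant factors, which are substantial in practice):

```python
def quantum_steps(digits):
    # Shor's algorithm: work grows roughly as the cube of the digit count.
    return digits ** 3

# Doubling the input from 300 to 600 digits multiplies the work by only 8.
print(quantum_steps(600) / quantum_steps(300))  # -> 8.0
```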

What It Would Take to Break RSA

Shor’s algorithm has been demonstrated on real quantum hardware, but only for tiny numbers. In 2001, an IBM team famously factored the number 15 (into 3 × 5) using a 7-qubit quantum computer. More recent experiments have factored slightly larger numbers, but nothing close to cryptographically relevant sizes.

Breaking a standard 2,048-bit RSA key would require a quantum computer with several thousand logical qubits. The catch is that real quantum hardware is noisy and error-prone, so each “logical” qubit needs many physical qubits for error correction. Current estimates suggest that millions of physical qubits would be needed. As of the mid-2020s, the largest quantum processors have around 1,000 to 1,200 physical qubits, and their error rates are still too high for running Shor’s algorithm at meaningful scale. The gap between current hardware and what’s needed remains enormous.

That said, the trajectory of quantum computing development is why governments and organizations are already preparing. The concern isn’t just about today’s secrets but about data intercepted now and decrypted later, once quantum computers become powerful enough. This “harvest now, decrypt later” strategy means that sensitive information with a long shelf life is already at risk in a practical sense.

Post-Quantum Cryptography

The response to Shor’s algorithm has been the development of encryption methods that resist quantum attacks. In 2024, the U.S. National Institute of Standards and Technology (NIST) finalized its first set of post-quantum cryptographic standards. These new systems rely on mathematical problems that neither classical nor quantum computers can solve efficiently, such as problems involving lattice structures in high-dimensional spaces.

Major tech companies have already begun integrating these new standards into browsers, messaging apps, and cloud services. The transition is expected to take years because encryption is embedded in virtually every layer of digital infrastructure, from web servers to smart cards. But the migration is underway, driven largely by the theoretical threat that Shor’s algorithm represents.

What Shor’s Algorithm Cannot Do

Shor’s algorithm is specifically designed for factoring integers and a closely related problem called the “discrete logarithm.” It does not break all encryption. Symmetric encryption methods, like the AES system used to protect files and communications once a connection is established, are not vulnerable to Shor’s algorithm. Quantum computers do offer a speedup against symmetric encryption through a different algorithm (Grover’s algorithm), but that speedup is far less dramatic and can be countered simply by using longer keys.
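The contrast is easy to quantify with a standard rule of thumb: Grover’s search examines about the square root of the key space, so a k-bit symmetric key retains roughly k/2 bits of security against a quantum attacker (a simplification that ignores the practical overheads of running Grover’s algorithm at scale).

```python
def quantum_security_bits(key_bits):
    # Grover's algorithm searches 2**k keys in about 2**(k/2) steps,
    # so the effective security level is roughly halved.
    return key_bits // 2

print(quantum_security_bits(128))  # -> 64: weakened, though not trivially broken
print(quantum_security_bits(256))  # -> 128: a longer key restores full strength
```

This is why the standard advice for symmetric encryption is simply to move from 128-bit to 256-bit keys, rather than replacing the algorithms entirely.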

Shor’s algorithm also doesn’t make quantum computers universally faster than classical ones. It solves a specific class of problems with a particular mathematical structure. Many everyday computing tasks, from running a spreadsheet to training an AI model, gain no benefit from it. Its significance is narrow but profound: it targets the exact mathematical assumption that most of the internet’s security architecture was built on.