The diffraction limit is the fundamental boundary on how fine a detail any optical system can resolve, set by the wave nature of light itself. No matter how perfectly you build a lens or mirror, light bends as it passes through an opening, creating a slightly blurred spot instead of a perfect point. This blurring puts a hard floor on the smallest features a microscope, telescope, camera, or even a chip-manufacturing machine can distinguish.
Why Light Can’t Focus to a Perfect Point
When light passes through any circular opening, whether it’s a camera lens, a telescope mirror, or the pupil of your eye, it doesn’t land as a crisp dot on the other side. Instead, it spreads into a small bright disk surrounded by faint rings of light. This pattern is called an Airy disk, and it forms because light waves interfere with each other as they squeeze through the aperture.
The size of that central bright disk depends on two things: the wavelength of the light and the diameter of the opening it passes through. Shorter wavelengths produce smaller disks. Larger openings also produce smaller disks. This relationship is captured in a simple formula introduced by Lord Rayleigh: the smallest angular separation you can resolve equals 1.22 times the wavelength divided by the aperture diameter (θ = 1.22λ/D). Two objects closer together than this angle will blur into a single blob, no matter how perfect the optics are.
What the Rayleigh Criterion Actually Means
The Rayleigh criterion gives a precise definition of “just barely distinguishable.” Two point sources of light, like two stars close together in the sky, are considered resolvable when the center of one object’s Airy disk falls on the first dark ring of the other object’s Airy disk. At that spacing, you can still tell there are two separate sources. Any closer, and they merge into one.
For visible light (roughly 400 to 700 nanometers in wavelength), this sets a concrete limit. Your eye, with a pupil about 5 millimeters wide, has a diffraction limit of roughly half an arc minute, though in practice aberrations and the spacing of photoreceptors in the retina hold visual acuity closer to 1 arc minute. A 10-meter telescope working at visible wavelengths can resolve details roughly 2,000 times finer, simply because its aperture is 2,000 times wider. The physics is the same in both cases. The only difference is the size of the aperture collecting the light.
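Plugging numbers into θ = 1.22λ/D makes the aperture scaling concrete. A minimal sketch in Python (the 550-nanometer wavelength is an illustrative mid-visible choice, not a value from any specific instrument):

```python
import math

ARCSEC_PER_RAD = 180 * 3600 / math.pi  # ~206,265 arcseconds per radian

def rayleigh_limit_arcsec(wavelength_m: float, aperture_m: float) -> float:
    """Minimum resolvable angle, theta = 1.22 * lambda / D, in arcseconds."""
    return 1.22 * wavelength_m / aperture_m * ARCSEC_PER_RAD

wavelength = 550e-9  # green light, mid-visible (assumed)
eye = rayleigh_limit_arcsec(wavelength, 5e-3)        # 5 mm pupil
telescope = rayleigh_limit_arcsec(wavelength, 10.0)  # 10 m mirror

print(f"eye:       {eye:.1f} arcsec")        # ~27.7 arcsec, about half an arc minute
print(f"telescope: {telescope:.4f} arcsec")  # ~0.0138 arcsec
print(f"ratio:     {eye / telescope:.0f}x")  # 2000x: exactly the ratio of apertures
```

The ratio falls out of the formula directly: wavelength cancels, leaving only the ratio of the two aperture diameters.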
The Diffraction Limit in Microscopy
For microscopes, the diffraction limit works slightly differently because you’re imaging objects at close range rather than at astronomical distances. In the late 19th century, Ernst Abbe worked out that the smallest feature a light microscope can resolve is approximately half the wavelength of the illuminating light, divided by a property called the numerical aperture of the objective lens (d = λ/2NA). Numerical aperture accounts for both the refractive index of the medium between the lens and the sample (air, water, or oil) and the angle over which the lens collects light.
In practice, this means a conventional light microscope tops out at roughly 200 to 250 nanometers of resolution when using visible light. That’s sufficient to see a bacterium or a cell nucleus, but too coarse to distinguish many internal structures within a cell. Protein complexes, individual virus particles, and the fine architecture of membranes and filaments all fall below this threshold. For over a century, this was treated as an impassable wall in biology.
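Abbe’s formula makes that ceiling easy to check. A short sketch (the 1.4 numerical aperture is a typical high-end oil-immersion value and the 0.95 a typical dry-objective value; both are illustrative assumptions, not figures from the text):

```python
def abbe_limit_nm(wavelength_nm: float, numerical_aperture: float) -> float:
    """Smallest resolvable feature, d = lambda / (2 * NA)."""
    return wavelength_nm / (2 * numerical_aperture)

# Green light with a high-end oil-immersion objective (NA = 1.4, assumed)
print(abbe_limit_nm(550, 1.4))   # ~196 nm, near the ~200 nm floor described above
# A dry objective in air (NA = 0.95, assumed) does noticeably worse
print(abbe_limit_nm(550, 0.95))  # ~289 nm
```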
How Telescopes Push the Boundary
In astronomy, the diffraction limit explains why bigger telescopes see sharper images. Doubling the diameter of a mirror cuts the minimum resolvable angle in half, revealing twice as much detail on a distant planet or galaxy. NASA’s James Webb Space Telescope, with a 6.5-meter primary mirror working in infrared wavelengths, reaches a diffraction-limited resolution far beyond what smaller instruments can achieve.
Ground-based telescopes face an additional problem: turbulence in Earth’s atmosphere blurs light before it even reaches the mirror, often degrading resolution well before the diffraction limit kicks in. Technologies like adaptive optics, which use deformable mirrors that reshape hundreds of times per second to counteract atmospheric distortion, allow large ground telescopes to approach their true diffraction-limited performance.
Chip Manufacturing and Shorter Wavelengths
The semiconductor industry has spent decades in a direct contest with the diffraction limit. To etch smaller transistors onto silicon wafers, manufacturers need to project finer patterns of light, and the diffraction limit says finer patterns require shorter wavelengths. The progression tells the story clearly: mercury-vapor UV lamps carried features from several micrometers in the 1970s down toward 1 micrometer by the 1980s. Deep ultraviolet excimer lasers at 193 nanometers pushed features down to 65 nanometers, then below 20 nanometers using clever tricks like multiple exposures.
The current frontier is extreme ultraviolet (EUV) lithography, which uses light at a wavelength of just 13.5 nanometers. This has allowed companies like TSMC to manufacture chips at 5-nanometer process nodes. The next generation of EUV scanners, with higher numerical apertures of 0.55, is expected to achieve spatial resolution below 8 nanometers. Every step in this progression has been a direct response to the same underlying physics: shorter wavelengths shrink the Airy disk and allow finer features.
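Lithographers usually express this limit as a half-pitch, CD = k₁ · λ / NA, where k₁ is a process-dependent factor. A rough sketch (the k₁ value of 0.33 and the 0.33 numerical aperture for current scanners are typical published figures used here as assumptions):

```python
def litho_resolution_nm(wavelength_nm: float, numerical_aperture: float,
                        k1: float = 0.33) -> float:
    """Minimum printable half-pitch, CD = k1 * lambda / NA.
    k1 is a process factor; ~0.33 is an assumed typical single-exposure value."""
    return k1 * wavelength_nm / numerical_aperture

# Current EUV scanners: 13.5 nm light, NA = 0.33 (assumed typical)
print(litho_resolution_nm(13.5, 0.33))  # ~13.5 nm half-pitch
# High-NA EUV: same wavelength, NA = 0.55
print(litho_resolution_nm(13.5, 0.55))  # ~8.1 nm half-pitch
```

Raising the numerical aperture from 0.33 to 0.55 shrinks the printable feature by the same factor, which is how the sub-8-nanometer figure arises without changing the wavelength.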
Super-Resolution: Getting Past the Wall
Starting in the 2000s, physicists and biologists developed techniques that sidestep the diffraction limit entirely, earning a Nobel Prize in Chemistry in 2014. These super-resolution methods don’t break the laws of physics. Instead, they exploit clever tricks to extract information that conventional imaging smears away.
One approach, called STED microscopy, uses two laser beams. The first beam excites fluorescent molecules in a sample. The second, shaped like a donut, switches off fluorescence everywhere except a tiny spot at the center, effectively shrinking the area that glows at any given moment to well below the diffraction limit. By scanning this sharpened spot across the sample, the microscope builds an image with resolution that conventional optics can’t match.
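The sharpening effect of the depletion beam follows a simple scaling law: the effective spot size shrinks as d = λ / (2·NA·√(1 + I/I_sat)), where I/I_sat is the depletion intensity relative to the dye’s saturation intensity. A sketch with illustrative values (the 640-nanometer wavelength, 1.4 NA, and 100× saturation ratio are assumptions, not figures from the text):

```python
import math

def sted_resolution_nm(wavelength_nm: float, na: float,
                       depletion_ratio: float) -> float:
    """Effective STED spot size, d = lambda / (2 * NA * sqrt(1 + I/I_sat)).
    depletion_ratio is depletion intensity over saturation intensity."""
    return wavelength_nm / (2 * na * math.sqrt(1 + depletion_ratio))

# No depletion beam: the plain Abbe limit
print(sted_resolution_nm(640, 1.4, 0))    # ~229 nm
# Strong depletion beam (I = 100 * I_sat, assumed)
print(sted_resolution_nm(640, 1.4, 100))  # ~23 nm
```

Because the spot shrinks with the square root of the depletion intensity, there is no hard floor: turning up the donut beam keeps sharpening the spot, limited in practice by photodamage to the sample.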
A different strategy, used by techniques known as PALM and STORM, takes advantage of the fact that individual molecules can be switched on and off randomly. In each frame, only a sparse handful of fluorescent molecules light up. Because they’re spread far apart, each one’s position can be pinpointed with precision far better than the width of its Airy disk. After thousands of frames, the precise locations are compiled into a single composite image with nanometer-scale detail.
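The pinpointing step rests on a statistical fact: the center of an isolated blur can be located to roughly the blur’s width divided by the square root of the number of photons collected. A sketch (the 100-nanometer PSF width and photon counts are illustrative assumptions):

```python
import math

def localization_precision_nm(psf_sigma_nm: float, photons: int) -> float:
    """Approximate single-molecule localization precision: sigma_PSF / sqrt(N).
    More collected photons pin down the blur's center more tightly."""
    return psf_sigma_nm / math.sqrt(photons)

# A diffraction-limited spot with sigma ~100 nm (assumed)
print(localization_precision_nm(100, 1))      # 100 nm: one photon tells you little
print(localization_precision_nm(100, 100))    # 10 nm
print(localization_precision_nm(100, 10_000)) # 1 nm: many photons, nanometer precision
```

This is why PALM and STORM demand thousands of frames: each molecule must emit enough photons, in isolation, for its position to be fixed far below the diffraction limit.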
Computational Approaches
Software can also partially recover information lost to diffraction. Deconvolution algorithms work by mathematically modeling how the optical system blurs a perfect point of light (its point spread function) and then reversing that blurring in the recorded image. Through successive rounds of iteration, these algorithms progressively sharpen features, although too many iterations amplify noise, so the process is typically stopped early or regularized. Done carefully, this allows researchers to measure structures slightly smaller than the classical diffraction limit would normally permit.
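One classic iterative scheme is Richardson–Lucy deconvolution. A minimal one-dimensional sketch (the signal, Gaussian PSF width, and iteration count are all illustrative choices): two point sources closer together than the PSF width blur into a single blob, and iteration pulls them back apart.

```python
import numpy as np

def richardson_lucy(observed: np.ndarray, psf: np.ndarray,
                    iterations: int = 50) -> np.ndarray:
    """1-D Richardson-Lucy deconvolution: iteratively re-estimate the
    unblurred signal from the observed signal and a known PSF."""
    psf = psf / psf.sum()
    psf_mirror = psf[::-1]
    estimate = np.full_like(observed, observed.mean())
    for _ in range(iterations):
        blurred_model = np.convolve(estimate, psf, mode="same")
        ratio = observed / np.maximum(blurred_model, 1e-12)
        estimate *= np.convolve(ratio, psf_mirror, mode="same")
    return estimate

# Two point sources 8 pixels apart, blurred by a Gaussian PSF (sigma = 4 px,
# so the two points merge into one blob) -- all values illustrative
truth = np.zeros(64)
truth[28] = 1.0
truth[36] = 1.0
k = np.arange(33, dtype=float)
psf = np.exp(-0.5 * ((k - 16) / 4.0) ** 2)  # odd-length, centered Gaussian
blurred = np.convolve(truth, psf / psf.sum(), mode="same")
sharpened = richardson_lucy(blurred, psf, iterations=200)
```

In the blurred signal the brightest pixel sits between the two sources; after enough iterations the valley between the two peaks reappears, separating them again.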
A 2025 study in Nature Communications demonstrated a system combining hardware and computational refinement that resolved fluorescent point sources at 116 nanometers, compared to 292 nanometers with conventional wide-field imaging of the same objects. Deconvolution doesn’t eliminate the diffraction limit, but it can recover a meaningful amount of the detail that diffraction obscures, especially when the optical system and sample are well characterized.
Why It Matters Beyond the Lab
The diffraction limit isn’t just an abstract physics concept. It determines what your smartphone camera can resolve at a given distance, why satellite imagery has a maximum sharpness for a given orbit and mirror size, and why biologists couldn’t watch proteins interact inside living cells until super-resolution tools arrived. It shapes the design of eyeglasses, fiber-optic cables, laser cutters, and barcode scanners. Any time light passes through an opening and you care about the sharpness of what comes out the other side, the diffraction limit is the ceiling you’re working under.

