Spectral bandwidth is the range of wavelengths that pass through an optical instrument at any given moment, measured at half the peak intensity of the light signal. It tells you how “pure” the light is that reaches the detector. A spectrophotometer set to 500 nm, for example, doesn’t actually deliver a single wavelength. It delivers a narrow band centered on 500 nm, and the width of that band is the spectral bandwidth.
How Spectral Bandwidth Is Measured
The standard way to quantify spectral bandwidth is called Full Width at Half Maximum, or FWHM. Picture the intensity of light coming through an instrument plotted as a curve. The peak of that curve is the target wavelength. Now draw a horizontal line at exactly half the peak’s height. The distance between the two points where that line crosses the curve is the FWHM, and that distance, expressed in nanometers, is your spectral bandwidth.
If an instrument has a spectral bandwidth of 2 nm and is set to 500 nm, it’s actually passing light from roughly 499 nm to 501 nm. A bandwidth of 10 nm on the same setting would pass light from about 495 nm to 505 nm. The smaller the number, the more selective the instrument is about which wavelengths reach the detector.
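To make the geometry concrete, here is a minimal sketch in Python (using NumPy) that estimates the FWHM of a sampled intensity curve by finding where it crosses half its peak value. The Gaussian profile centered at 500 nm with a 2 nm FWHM is a made-up example, not data from any particular instrument.

```python
import numpy as np

def fwhm(wavelengths, intensities):
    """Estimate the full width at half maximum of a single-peaked intensity curve."""
    half = intensities.max() / 2.0
    above = np.where(intensities >= half)[0]        # samples at or above half max
    left, right = above[0], above[-1]

    def crossing(i_lo, i_hi):
        # Linearly interpolate the wavelength where the curve crosses half max.
        x0, x1 = wavelengths[i_lo], wavelengths[i_hi]
        y0, y1 = intensities[i_lo], intensities[i_hi]
        return x0 + (half - y0) * (x1 - x0) / (y1 - y0)

    lo = crossing(left - 1, left) if left > 0 else wavelengths[0]
    hi = crossing(right, right + 1) if right < len(wavelengths) - 1 else wavelengths[-1]
    return hi - lo

# Hypothetical slit profile: a Gaussian centered at 500 nm with a 2 nm FWHM.
wl = np.linspace(495.0, 505.0, 2001)
sigma = 2.0 / (2 * np.sqrt(2 * np.log(2)))          # convert FWHM to Gaussian sigma
intensity = np.exp(-0.5 * ((wl - 500.0) / sigma) ** 2)

print(f"Estimated spectral bandwidth: {fwhm(wl, intensity):.2f} nm")   # ~2.00 nm
```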
What Controls It: Slits and Gratings
Inside a typical spectrophotometer, white light enters through an entrance slit, hits a curved mirror that collimates the beam into parallel rays, then strikes a rotating diffraction grating. The grating spreads the light into its component wavelengths, much as a prism fans white light out into a rainbow. A second mirror focuses these separated wavelengths onto an exit slit. Only the wavelengths that physically fit through the exit slit reach the detector, and the rotation angle of the grating determines which wavelengths those are.
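If you want to see how the grating angle maps to a wavelength, the sketch below uses the standard grating equation in its fixed-deviation form, a common approximation for this kind of monochromator geometry. The groove density and deviation angle are hypothetical example values.

```python
import math

# Simplified sketch: which wavelength a Czerny-Turner style monochromator steers onto
# the exit slit for a given grating rotation. All geometry values are hypothetical.

GROOVES_PER_MM = 1200                         # hypothetical grating groove density
GROOVE_SPACING_NM = 1e6 / GROOVES_PER_MM      # spacing between grooves, in nm
HALF_DEVIATION_RAD = math.radians(15.0)       # half the fixed angle between in/out beams
ORDER = 1                                     # first diffraction order

def wavelength_at_exit_slit(grating_angle_deg):
    """Wavelength (nm) reaching the exit slit for a given grating rotation.

    Fixed-deviation form of the grating equation:
        m * wavelength = 2 * d * cos(phi) * sin(theta)
    with groove spacing d, half deviation angle phi, and rotation angle theta.
    """
    theta = math.radians(grating_angle_deg)
    return 2 * GROOVE_SPACING_NM * math.cos(HALF_DEVIATION_RAD) * math.sin(theta) / ORDER

for angle in (10.0, 15.0, 18.0):
    print(f"grating at {angle:4.1f} deg -> {wavelength_at_exit_slit(angle):6.1f} nm")
```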
The width of the slits directly determines the spectral bandwidth. Wider slits let more wavelengths through, producing a broader bandwidth. Narrower slits restrict the range, producing a tighter bandwidth. The relationship is straightforward: bandwidth equals the slit width multiplied by the reciprocal linear dispersion of the grating. In practice, whichever slit is larger (entrance or exit) defines the bandwidth. Below a certain slit width, narrowing the entrance slit further won't improve resolution, because aberrations and diffraction in the optics set the floor.
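That relationship is easy to express directly. The sketch below assumes a hypothetical reciprocal linear dispersion of 4 nm per mm; real values depend on the grating and the instrument's focal length.

```python
# Spectral bandwidth from slit width and reciprocal linear dispersion.
# The dispersion figure is a hypothetical example, not a real instrument's spec.

RECIPROCAL_LINEAR_DISPERSION_NM_PER_MM = 4.0   # nm of spectrum per mm at the slit plane

def spectral_bandwidth_nm(entrance_slit_mm, exit_slit_mm):
    """Approximate bandwidth: in this simple model, the larger of the two slits dominates."""
    return max(entrance_slit_mm, exit_slit_mm) * RECIPROCAL_LINEAR_DISPERSION_NM_PER_MM

print(spectral_bandwidth_nm(0.5, 0.5))   # 0.5 mm slits    -> 2.0 nm bandwidth
print(spectral_bandwidth_nm(0.5, 2.5))   # wider exit slit -> 10.0 nm bandwidth
```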
The Resolution vs. Signal Tradeoff
Spectral bandwidth creates a fundamental tradeoff between two things you want: resolution and signal strength. Resolution is the instrument’s ability to distinguish two absorption peaks that sit close together in wavelength. A narrow bandwidth gives you better resolution because the instrument is sampling a thinner slice of the spectrum at each step, so closely spaced peaks show up as separate features rather than blurring into one.
The cost of that narrow bandwidth is a weaker signal reaching the detector. Less light gets through narrow slits, which means more noise relative to the signal. Widen the slits and more photons reach the detector, giving you a cleaner, stronger reading, but you lose the ability to separate fine spectral details. A scan of hexane at 15 nm bandwidth, for instance, produces a smooth but featureless curve, while the same sample scanned at 3 nm bandwidth reveals considerably more spectral detail.
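A toy simulation makes the tradeoff visible. The two-peak "true" spectrum, the triangular slit function, and the noise model below are all illustrative assumptions rather than a model of any specific instrument, but they reproduce the qualitative behavior: the narrow scan resolves both peaks at the cost of noise, while the wide scan is smooth but blurred.

```python
import numpy as np

rng = np.random.default_rng(0)
step_nm = 0.1
wl = np.arange(460.0, 540.0, step_nm)                  # scan axis, nm

def gaussian(x, center, fwhm):
    sigma = fwhm / (2 * np.sqrt(2 * np.log(2)))
    return np.exp(-0.5 * ((x - center) / sigma) ** 2)

# Hypothetical "true" spectrum: two features about 8 nm apart, each ~5 nm wide.
true_spectrum = gaussian(wl, 496.0, 5.0) + 0.8 * gaussian(wl, 504.0, 5.0)

def scan(bandwidth_nm):
    """Blur with a triangular slit function; noise grows as the slits (and throughput) shrink."""
    half_width = int(round(bandwidth_nm / step_nm))
    slit = np.bartlett(2 * half_width + 1)             # triangular slit profile
    slit /= slit.sum()
    blurred = np.convolve(true_spectrum, slit, mode="same")
    noise = rng.normal(0.0, 0.02 * (3.0 / bandwidth_nm), wl.size)
    return blurred + noise

narrow_scan = scan(3.0)    # resolves both peaks, but noisier
wide_scan = scan(15.0)     # smooth trace, but the two peaks merge into one broad hump

# Plotting narrow_scan and wide_scan against wl (e.g. with matplotlib) shows two
# distinct but noisier peaks at 3 nm and a single smooth hump at 15 nm.
```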
A widely used rule of thumb in UV-Vis spectroscopy: set the spectral bandwidth to one-tenth the natural bandwidth of the absorption band you’re measuring. This keeps peak distortion minimal while still allowing enough light through for a reliable measurement. If a compound has an absorption band 20 nm wide, you’d want an instrument bandwidth of about 2 nm.
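As a sketch, the rule is a one-liner; the 20 nm natural bandwidth is just the example from above.

```python
def recommended_bandwidth_nm(natural_bandwidth_nm):
    """One-tenth rule of thumb: instrument bandwidth of about 1/10 the band's natural width."""
    return natural_bandwidth_nm / 10.0

print(recommended_bandwidth_nm(20.0))   # a 20 nm wide absorption band -> ~2 nm setting
```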
Bandwidth vs. Bandpass vs. Resolution
These three terms overlap enough to cause confusion, but they describe different things. Bandpass is the specific range of wavelengths an instrument transmits, defined by a center wavelength and a span (for example, 400 to 410 nm). Bandwidth quantifies the width of that window (10 nm in that example). Spectral resolution is the instrument’s ability to distinguish between two closely spaced wavelengths, and it depends on bandwidth but isn’t identical to it.
Think of bandpass as the window, bandwidth as how wide the window is open, and resolution as how much detail you can see through it. A smaller bandwidth generally means finer resolution, but optical imperfections in mirrors, gratings, and detector pixels also play a role. The narrowest spectral bandwidth an instrument can achieve sets its best attainable resolution.
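A tiny helper makes the distinction between the first two concrete, using the 400 to 410 nm window from the example above.

```python
def describe_bandpass(low_nm, high_nm):
    """Express a bandpass window as its center wavelength and its bandwidth."""
    return {"center_nm": (low_nm + high_nm) / 2.0, "bandwidth_nm": high_nm - low_nm}

print(describe_bandpass(400.0, 410.0))   # {'center_nm': 405.0, 'bandwidth_nm': 10.0}
```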
Why It Matters in Practice
Spectral bandwidth affects the accuracy of every measurement an instrument makes. If it’s too wide, closely spaced peaks merge into one, measured absorbance values drop below their true values, and you can miss important features in a sample’s spectrum. If it’s too narrow, noise overwhelms the signal and readings become unreliable.
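The absorbance error is easy to demonstrate numerically. Because the detector averages transmitted light (not absorbance) across the band, a wide bandwidth pulls the measured peak below its true value. The sketch below uses a hypothetical band (peak absorbance 1.0, 20 nm natural width, centered at 500 nm) and approximates the slit function as a flat window, which is a simplification.

```python
import numpy as np

# Hypothetical absorption band: peak A = 1.0, 20 nm natural FWHM, centered at 500 nm.
wl = np.arange(450.0, 550.0, 0.1)
sigma = 20.0 / (2 * np.sqrt(2 * np.log(2)))
true_A = 1.0 * np.exp(-0.5 * ((wl - 500.0) / sigma) ** 2)

def measured_peak_absorbance(bandwidth_nm, center_nm=500.0):
    """The detector averages *transmittance* over the band, so the measured A <= true A."""
    in_band = np.abs(wl - center_nm) <= bandwidth_nm / 2
    mean_T = np.mean(10.0 ** (-true_A[in_band]))       # average transmitted fraction
    return -np.log10(mean_T)

for bw in (2, 10, 40):
    print(f"bandwidth {bw:>2} nm -> measured peak A = {measured_peak_absorbance(bw):.3f}")
```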
Pharmaceutical testing illustrates why this matters. Both the United States Pharmacopeia and the European Pharmacopoeia require spectrophotometers to pass a resolution test: measuring a dilute solution of toluene in hexane and calculating the ratio of the absorbance at 269 nm to the absorbance at 266 nm. Those two peaks sit only 3 nm apart. An instrument with too broad a spectral bandwidth can't separate them properly, producing an absorbance ratio that falls outside the acceptable range. Failing this test means the instrument isn't suitable for regulatory work.
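A sketch of how such a check might be scripted is below. The absorbance readings and the minimum ratio are illustrative placeholders; the binding acceptance limit comes from the current pharmacopoeia chapter, not from this example.

```python
def resolution_check(absorbance_269_nm, absorbance_266_nm, minimum_ratio=1.5):
    """Resolution check on a toluene-in-hexane scan: ratio of A(269 nm) to A(266 nm).

    minimum_ratio is an illustrative placeholder; the binding acceptance limit
    comes from the current USP / Ph. Eur. chapter, not from this sketch.
    """
    ratio = absorbance_269_nm / absorbance_266_nm
    return ratio, ratio >= minimum_ratio

print(resolution_check(0.345, 0.215))   # hypothetical readings -> (ratio, passes?)
```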
Beyond laboratory spectroscopy, spectral bandwidth applies anywhere an optical system isolates specific wavelengths. In fiber optic communications, it defines how tightly packed wavelength channels can be without interfering with each other. In astronomy, it determines how finely a telescope’s spectrograph can distinguish the chemical signatures of distant stars. The core concept is always the same: how narrow a slice of the spectrum the system can isolate, and the practical consequences of making that slice wider or thinner.

