UV spectroscopy is an analytical technique that measures how much ultraviolet and visible light a substance absorbs. By shining a beam of light through a sample and recording which wavelengths are absorbed, the technique reveals information about the sample’s chemical structure, identity, and concentration. It covers wavelengths from about 190 to 800 nanometers, with photons in the near-ultraviolet region (200 to 400 nm) carrying energies up to about 143 kcal/mole and those in the visible region (400 to 800 nm) carrying roughly 36 to 72 kcal/mole.
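Those energy figures follow directly from the photon-energy relation E = hc/λ, scaled to a per-mole basis. Here is a minimal Python sketch that reproduces them:

```python
# Convert a photon wavelength (nm) to energy in kcal/mol: E = h*c/lambda, times Avogadro's number.
H = 6.62607015e-34   # Planck constant, J*s
C = 2.99792458e8     # speed of light, m/s
N_A = 6.02214076e23  # Avogadro's number, 1/mol
J_PER_KCAL = 4184.0  # joules per kilocalorie

def photon_energy_kcal_per_mol(wavelength_nm: float) -> float:
    energy_joules = H * C / (wavelength_nm * 1e-9)  # energy of one photon
    return energy_joules * N_A / J_PER_KCAL         # per mole of photons

for nm in (200, 400, 800):
    print(f"{nm} nm -> {photon_energy_kcal_per_mol(nm):.0f} kcal/mol")
# 200 nm -> 143 kcal/mol, 400 nm -> 71 kcal/mol, 800 nm -> 36 kcal/mol
```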
How UV Absorption Works
Every molecule has electrons sitting in defined energy levels. When UV or visible light hits a sample, photons with exactly the right energy can bump an electron from its normal resting orbital (the highest occupied molecular orbital, or HOMO) up to an empty, higher-energy orbital (the lowest unoccupied molecular orbital, or LUMO). The molecule is then in what chemists call an excited state. The instrument detects which wavelengths of light were absorbed during this process, producing an absorption spectrum: essentially a graph of how strongly the sample absorbs at each wavelength.
Not all electron jumps require the same amount of energy. Of the several possible types of transitions, only two fall within the energy range that standard instruments can measure. The first involves electrons in double or triple bonds jumping to higher-energy antibonding orbitals (a π→π* transition). A carbon-carbon double bond like the one in ethylene, for example, absorbs strongly at 171 nm. The second type involves “lone pair” electrons (those sitting on atoms like oxygen or nitrogen rather than shared in a bond) jumping to antibonding orbitals. A carbon-oxygen double bond like the one in acetaldehyde absorbs at 290 nm through this kind of transition (labeled n→π*). The specific wavelength where a molecule absorbs most strongly is called its absorption maximum (λmax), and it acts like a fingerprint for that particular chemical group.
Key Parts of a UV Spectrophotometer
A UV spectrophotometer has four main components: a light source, a monochromator, a sample holder, and a detector.
Two light sources cover the full wavelength range. A deuterium lamp produces UV light from 190 to 350 nm, while a halogen lamp handles the visible and near-infrared range from 330 to 3,200 nm. The instrument switches between these automatically so the entire spectrum can be scanned in one run.
The monochromator is the component that isolates individual wavelengths from the broad-spectrum light. It works by bouncing light off a diffraction grating, a finely grooved surface that spreads white light into its component wavelengths the way a prism does. Adjustable slits control how narrow a slice of wavelengths reaches the sample, which determines the instrument’s resolution. High-end instruments use a double monochromator with two gratings and two sets of slits for sharper wavelength selection and less stray light.
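The relation governing how a grating separates wavelengths is the grating equation, d·sin(θ) = m·λ, where d is the groove spacing and m the diffraction order. The sketch below assumes an illustrative groove density of 1,200 grooves/mm, a value chosen for the example rather than taken from any particular instrument:

```python
import math

# Grating equation: d * sin(theta) = m * lambda, where d is the groove spacing,
# m the diffraction order, and theta the angle at which wavelength lambda emerges.
GROOVES_PER_MM = 1200          # illustrative groove density (assumption)
d_nm = 1e6 / GROOVES_PER_MM    # groove spacing in nm (~833 nm)

def first_order_angle_deg(wavelength_nm: float) -> float:
    return math.degrees(math.asin(wavelength_nm / d_nm))  # m = 1

# Nearby wavelengths leave the grating at measurably different angles,
# which is what lets the exit slit select a narrow band.
for nm in (250, 260, 500):
    print(f"{nm} nm diffracts at {first_order_angle_deg(nm):.2f} degrees")
```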
After passing through the sample (typically held in a small transparent container called a cuvette), the remaining light hits a detector. Photomultiplier tubes and silicon photodiodes are common choices for UV and visible wavelengths. These work by converting incoming photons into an electrical signal, exploiting the photoelectric effect. The instrument compares the light intensity before and after the sample to calculate absorbance.
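That before-and-after comparison is a simple logarithm: absorbance is defined as A = log10(I0/I), where I0 is the intensity through a blank and I the intensity through the sample. A minimal sketch:

```python
import math

def absorbance(incident_intensity: float, transmitted_intensity: float) -> float:
    # A = log10(I0 / I); transmittance T = I / I0, so equivalently A = -log10(T).
    return math.log10(incident_intensity / transmitted_intensity)

# If the sample lets through 10% of the light, A = 1; 1% gives A = 2.
print(absorbance(100.0, 10.0))  # 1.0
print(absorbance(100.0, 1.0))   # 2.0
```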
Beer-Lambert Law: Measuring Concentration
The core equation behind quantitative UV spectroscopy is the Beer-Lambert Law:
A = ε × l × c
A is the absorbance the instrument reads (a unitless number). The variable ε (epsilon) is the molar absorptivity, a constant in L·mol⁻¹·cm⁻¹ that describes how strongly a particular substance absorbs at a given wavelength. The variable l is the path length, or how far the light travels through the sample, in centimeters. And c is the concentration of the substance in moles per liter.
The practical takeaway is straightforward: absorbance is directly proportional to concentration. Double the amount of a substance in solution, and the absorbance reading doubles. This linear relationship is what makes UV spectroscopy so useful for measuring how much of something is present. In practice, you measure several samples of known concentration, plot absorbance versus concentration to create a calibration curve, and then read unknown concentrations off that line.
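Here is a minimal sketch of that workflow in Python, using NumPy's polyfit for the line fit; the standard concentrations and absorbance readings are hypothetical values invented for the example:

```python
import numpy as np

# Hypothetical calibration standards: concentration (mol/L) vs. measured absorbance.
concentrations = np.array([0.0, 0.2e-4, 0.4e-4, 0.6e-4, 0.8e-4])
absorbances    = np.array([0.00, 0.18, 0.37, 0.55, 0.74])

# Fit A = slope * c + intercept; per Beer-Lambert, the slope approximates epsilon * l.
slope, intercept = np.polyfit(concentrations, absorbances, 1)

# Read an unknown off the calibration line: c = (A - intercept) / slope.
unknown_absorbance = 0.45
unknown_concentration = (unknown_absorbance - intercept) / slope
print(f"estimated concentration: {unknown_concentration:.2e} mol/L")
```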
Common Chromophores and Their Absorption
A chromophore is the part of a molecule responsible for absorbing UV or visible light. Different chromophores absorb at characteristic wavelengths, which helps identify what functional groups are present in a sample. Some of the most commonly encountered chromophores include:
- Carbon-carbon double bonds (C=C): Absorb around 171 nm with strong intensity. Triple bonds absorb near 180 nm.
- Carbonyl groups (C=O): Show two distinct absorptions. The lone-pair transition appears around 290 nm but is quite weak, while the bond-electron transition appears near 180 nm and is much stronger.
- Nitro groups (NO2): Similar to carbonyls, with a weak lone-pair absorption near 275 nm and a stronger one around 200 nm.
- Carbon-halogen bonds (C-X): Absorb through lone-pair transitions. Methyl bromide absorbs at 205 nm, while methyl iodide absorbs at 255 nm, shifting to longer wavelengths as the halogen gets heavier and its electrons become easier to excite.
When a molecule contains a more extended system of alternating single and double bonds (a conjugated system), the absorption maximum shifts to longer wavelengths, sometimes into the visible range. This is why many brightly colored compounds, from plant pigments to synthetic dyes, absorb visible light and can be studied with the same instrument.
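To make the fingerprint idea concrete, the sketch below wraps the approximate absorption maxima listed above in a small lookup table; a real assignment would also weigh band intensity and chemical context:

```python
# Approximate absorption maxima (nm) for the chromophores listed above.
CHROMOPHORE_MAXIMA = {
    "C=C (pi -> pi*)": 171,
    "C=O (n -> pi*)": 290,
    "C=O (pi -> pi*)": 180,
    "NO2 (n -> pi*)": 275,
    "C-Br (n -> sigma*)": 205,
    "C-I (n -> sigma*)": 255,
}

def candidate_chromophores(observed_nm: float, tolerance_nm: float = 10) -> list[str]:
    """Return chromophores whose maximum lies within tolerance of an observed peak."""
    return [name for name, nm in CHROMOPHORE_MAXIMA.items()
            if abs(nm - observed_nm) <= tolerance_nm]

print(candidate_chromophores(288))  # ['C=O (n -> pi*)'] -- the weak band near 290 nm
```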
Solvent Selection Matters
The solvent you dissolve your sample in has to be transparent at the wavelengths you’re measuring, or it will create background absorption that obscures the data. Every solvent has a UV cutoff wavelength below which it starts absorbing significantly. Water is transparent down to about 180 nm, making it one of the best choices for UV work. Ethanol’s cutoff is around 205 nm, meaning it works well for most applications but can interfere with measurements at very short wavelengths. Hexane, another common choice, is similarly transparent through most of the UV range and is preferred for nonpolar samples.
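A simple way to encode this check is a table of cutoffs. The sketch below uses the values quoted above; hexane’s cutoff is not given in the text, so the commonly cited figure of roughly 195 nm is an assumption here:

```python
# UV cutoff wavelengths (nm) below which each solvent absorbs significantly.
SOLVENT_CUTOFFS_NM = {
    "water": 180,    # from the text
    "ethanol": 205,  # from the text
    "hexane": 195,   # commonly cited value; assumption, not from the text
}

def solvent_is_transparent(solvent: str, measurement_nm: float) -> bool:
    """True if the solvent's cutoff lies safely below the measurement wavelength."""
    return SOLVENT_CUTOFFS_NM[solvent] < measurement_nm

print(solvent_is_transparent("ethanol", 260))  # True: fine for work at 260 nm
print(solvent_is_transparent("ethanol", 200))  # False: the cutoff interferes
```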
Pharmaceutical and Quality Control Uses
UV spectroscopy remains one of the most widely used techniques in pharmaceutical testing because of its simplicity, speed, and low cost. Drug manufacturers use it to verify the identity and measure the concentration of active ingredients in tablets, capsules, and other dosage forms. A typical quality control test involves dissolving a tablet, running a UV scan, and comparing the measured concentration against the labeled amount. Published validation studies routinely recover 99% or more of the expected drug content using UV methods, confirming the technique’s accuracy for routine analysis.
Dissolution testing is another major application: placing a tablet in simulated stomach or intestinal fluid and measuring, at timed intervals, how much drug has dissolved. Because UV readings take seconds and the instruments are relatively inexpensive, they’re ideal for the high-throughput environment of pharmaceutical manufacturing, where hundreds of samples may need testing in a single day.
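As a rough sketch of the arithmetic behind a dissolution run (all readings and the 100%-release absorbance are hypothetical), timed absorbance values can be converted to percent dissolved with a one-point calibration:

```python
# Hypothetical dissolution run: absorbance readings at timed intervals.
readings = {5: 0.12, 15: 0.34, 30: 0.58, 60: 0.71}  # minutes -> absorbance

# One-point calibration: the absorbance a fully dissolved tablet would produce
# in this vessel volume and cuvette (hypothetical value for the example).
ABSORBANCE_AT_100_PERCENT = 0.75

for minutes, a in readings.items():
    percent_dissolved = 100.0 * a / ABSORBANCE_AT_100_PERCENT
    print(f"{minutes:>3} min: {percent_dissolved:5.1f}% dissolved")
```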
DNA and Protein Quantification
In molecular biology labs, UV spectroscopy is the standard method for measuring DNA, RNA, and protein concentrations. Nucleic acids absorb strongly at 260 nm, while proteins absorb at 280 nm. A single reading at the right wavelength, combined with Beer-Lambert calculations, gives you the concentration in seconds.
Purity assessment relies on simple absorbance ratios. The 260/280 ratio indicates whether a DNA sample is contaminated with protein: a value of approximately 1.8 is generally accepted as pure DNA. If the ratio is noticeably lower, protein contamination is likely. A second ratio, 260/230, flags contamination from organic solvents and salts left over from the extraction process. Pure DNA typically falls between 2.0 and 2.2 for this ratio. One important caveat: because UV absorbance can’t distinguish DNA from RNA, the presence of RNA in a DNA sample can inflate both the concentration reading and the 260/280 ratio, so this possibility needs to be considered when interpreting results.
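A minimal sketch of these calculations, assuming the widely used conversion of 50 ng/µL per A260 unit for double-stranded DNA at a 1 cm path length (the readings themselves are hypothetical):

```python
# Widely used conversion: A260 of 1.0 corresponds to ~50 ng/uL dsDNA at a 1 cm path.
DSDNA_NG_PER_UL_PER_A260 = 50.0

def dsdna_concentration(a260: float, path_length_cm: float = 1.0) -> float:
    """Estimate dsDNA concentration in ng/uL from an A260 reading."""
    return a260 * DSDNA_NG_PER_UL_PER_A260 / path_length_cm

def purity_flags(a260: float, a280: float, a230: float) -> dict:
    # Thresholds from the text: ~1.8 for 260/280, 2.0-2.2 for 260/230.
    r_260_280 = a260 / a280
    r_260_230 = a260 / a230
    return {
        "260/280": round(r_260_280, 2),
        "260/230": round(r_260_230, 2),
        "protein_contamination_suspected": r_260_280 < 1.8,
        "solvent_or_salt_suspected": not (2.0 <= r_260_230 <= 2.2),
    }

# Hypothetical readings:
print(dsdna_concentration(0.5))       # 25.0 ng/uL
print(purity_flags(0.5, 0.29, 0.24))  # ratios ~1.72 (low) and ~2.08 (in range)
```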
Modern microvolume spectrophotometers have pushed sample requirements down dramatically. Instruments now routinely analyze samples of just 1 to 4 microliters, with detection limits as low as 0.15 nanograms per microliter for double-stranded DNA at the most sensitive path-length setting. Some semiautomated systems can process and store up to six analyses per minute, making them practical for high-throughput genomics workflows where hundreds of samples need quantification before downstream processing.