What Is Optical Density and How Is It Measured?

Optical density (OD) is a measure of how much light a sample blocks. It’s calculated as the base-10 logarithm of the ratio between the light going into a sample and the light coming out the other side. The higher the optical density, the less light passes through. An OD of 1 means only 10% of the light makes it through, while an OD of 2 means only 1% does.
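
To make that scale concrete, here’s a minimal Python sketch (purely illustrative) that converts an OD value into the fraction of light transmitted and blocked, using the base-10 relationship described above:

```python
import math

def transmittance(od: float) -> float:
    """Fraction of incident light that passes through a sample with the given OD."""
    return 10 ** -od

def optical_density(fraction_transmitted: float) -> float:
    """OD corresponding to a measured fraction of transmitted light."""
    return -math.log10(fraction_transmitted)

for od in (0, 1, 2, 3):
    t = transmittance(od)
    print(f"OD {od}: {t:.1%} transmitted, {1 - t:.1%} blocked")

print(optical_density(0.10))  # a sample passing 10% of the light reads OD 1.0
```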

How Optical Density Works

A spectrophotometer, the instrument used to measure OD, shines a beam of light at a known intensity through a sample and measures how much light reaches a detector on the other side. The instrument then takes the base-10 logarithm of the ratio of the input light intensity to the transmitted light intensity. That logarithm is the optical density.
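
As a rough sketch of that calculation, with made-up detector readings (real instruments handle the blank reference internally), the measurement reduces to a base-10 logarithm of an intensity ratio:

```python
import math

def od_from_intensities(incident: float, transmitted: float) -> float:
    """OD = log10(I_in / I_out); both readings in the same arbitrary detector units."""
    return math.log10(incident / transmitted)

# Hypothetical detector readings: light through a blank (reference) and through the sample.
i_blank = 50_000    # defines "100% transmission" for this instrument
i_sample = 12_400

print(f"OD = {od_from_intensities(i_blank, i_sample):.3f}")  # about 0.61
```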

The reason for using a logarithmic scale rather than a simple percentage is practical. When you double the concentration of a substance in solution, the transmitted light doesn’t drop by half; it falls off exponentially as concentration increases. The logarithmic scale converts that exponential relationship into a straight line, making it far easier to work with. If you plot OD against concentration, you get a neat, linear graph (at least within a useful range).

This relationship comes from the Beer-Lambert Law, which combines two principles: light intensity decreases exponentially as the concentration of the absorbing substance increases, and it also decreases exponentially as the distance the light travels through the substance increases. In practice, that means OD depends on three things: what’s in your sample, how much of it there is, and how thick the sample is.
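
In symbols, the law is usually written as A = ε · c · l, where ε is the substance’s molar absorptivity, c its concentration, and l the path length. The short sketch below uses made-up values for a hypothetical dye to show how OD scales linearly with concentration at a fixed path length:

```python
def beer_lambert_od(molar_absorptivity: float, concentration: float, path_length_cm: float) -> float:
    """OD predicted by the Beer-Lambert Law: A = epsilon * c * l."""
    return molar_absorptivity * concentration * path_length_cm

EPSILON = 15_000  # L/(mol*cm), a made-up molar absorptivity for a hypothetical dye

for conc in (1e-5, 2e-5, 4e-5):  # mol/L
    od = beer_lambert_od(EPSILON, conc, 1.0)  # 1 cm cuvette
    print(f"{conc:.0e} mol/L -> OD {od:.2f}")
# Doubling the concentration doubles the OD, which is the straight-line behavior
# that makes the logarithmic scale so convenient.
```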

Optical Density vs. Absorbance

People often use “optical density” and “absorbance” interchangeably, but they’re not quite the same thing. Absorbance specifically refers to light that gets absorbed by molecules in the sample, converting light energy into other forms. Optical density is broader. It captures any reduction in transmitted light, whether that reduction comes from absorption, scattering, or both.

This distinction matters most in microbiology. When you measure the OD of a bacterial culture, you’re not really measuring how much light the bacteria absorb. You’re measuring how much light the cells scatter away from the detector. The bacteria deflect light in all directions as it passes through the cloudy suspension, and this creates the appearance of increased absorbance on the readout. The instrument reports it the same way, but the underlying physics is different. In a clear, colored solution (like a dye dissolved in water), OD and absorbance are essentially the same. In a turbid sample full of particles, they diverge.

Measuring Bacterial Growth With OD600

One of the most common uses of optical density is tracking how fast bacteria multiply. Researchers shine light at a wavelength of 600 nanometers (in the orange-red part of the visible spectrum) through a bacterial culture and record the OD value over time. This measurement, called OD600, is a staple of microbiology labs because it’s fast, cheap, simple, and can be automated on plate readers running dozens of samples at once.

The 600 nm wavelength is chosen deliberately. At that wavelength, most growth media and common biological molecules don’t absorb much light on their own, so nearly all the signal comes from the bacterial cells scattering light. As the cells divide and the culture gets cloudier, OD600 climbs.
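
Because OD600 rises in proportion to cell density during exponential growth, two readings taken in log phase are enough for a rough doubling-time estimate. Here’s a small sketch with invented readings; it assumes both measurements fall in the instrument’s linear range:

```python
import math

def doubling_time(od_start: float, od_end: float, minutes_elapsed: float) -> float:
    """Doubling time during exponential growth, estimated from two OD600 readings."""
    doublings = math.log2(od_end / od_start)
    return minutes_elapsed / doublings

# Hypothetical readings taken 60 minutes apart, both in exponential phase:
print(f"Doubling time: {doubling_time(0.15, 0.60, 60):.0f} min")  # 2 doublings -> 30 min
```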

There’s an important caveat, though. OD600 doesn’t directly tell you how many cells are in the culture. The relationship between OD and actual cell count is only linear within a limited range, and it varies between instruments because scattering depends on the geometry of the light source, sample holder, and detector. To convert OD into a real cell count, labs calibrate their readings against a reference method, typically by diluting a sample, spreading it on plates, and counting the colonies that grow. This calibration step is essential for comparing results across different labs or different instruments.
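
One common way to do that calibration is a straight-line fit of plate counts against OD600 readings from the same samples. The numbers below are invented for illustration; a real calibration would use your own instrument, strain, and medium:

```python
# Paired measurements from a hypothetical calibration series:
# OD600 readings and the colony counts (CFU/mL) from plating the same samples.
od600 = [0.10, 0.25, 0.50, 0.75, 1.00]
cfu_per_ml = [0.9e8, 2.1e8, 4.2e8, 6.0e8, 8.1e8]

# Least-squares slope through the origin: CFU/mL per unit of OD600.
slope = sum(x * y for x, y in zip(od600, cfu_per_ml)) / sum(x * x for x in od600)
print(f"Calibration factor: {slope:.2e} CFU/mL per OD600 unit")

# Convert a new reading (within the linear range) into an estimated cell density.
new_reading = 0.42
print(f"OD600 {new_reading} is roughly {slope * new_reading:.2e} CFU/mL")
```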

Optical Density in Medical Testing

OD readings play a critical role in clinical lab tests, particularly in a type of assay called ELISA (enzyme-linked immunosorbent assay). These tests are used to detect antibodies or proteins in blood samples, and they work by triggering a color change that’s proportional to the amount of the target molecule present. The more target molecule in the sample, the deeper the color, and the higher the OD value the instrument records.

Clinicians interpret these results using a cutoff value. For example, in a common test for a blood-clotting disorder called heparin-induced thrombocytopenia, an OD above 0.40 is considered positive. In one study evaluating this threshold, the test correctly identified 93.8% of confirmed cases and correctly ruled out 96.6% of unlikely cases. Higher OD values generally indicate stronger reactions, so some labs use tiered cutoffs to assess not just whether a result is positive but how strongly positive it is. In that same study, an OD of 0.427 was identified as the best threshold for predicting blood clot complications, with 65.4% sensitivity and 88.4% specificity for that more specific question.
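
Translating an OD reading into a report is then just a comparison against the cutoffs. The sketch below uses the 0.40 positive cutoff mentioned above; the strength tiers are arbitrary placeholders, since real tier boundaries vary by lab and assay:

```python
def interpret_elisa_od(od: float, positive_cutoff: float = 0.40) -> str:
    """Classify an ELISA OD reading against a cutoff, with illustrative strength tiers."""
    if od < positive_cutoff:
        return "negative"
    if od < 1.0:
        return "weak positive"   # tier boundaries here are hypothetical
    if od < 2.0:
        return "positive"
    return "strong positive"

for reading in (0.12, 0.43, 1.35, 2.60):
    print(f"OD {reading:.2f}: {interpret_elisa_od(reading)}")
```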

What Affects Accuracy

Several factors can throw off an OD reading. Path length (the thickness of the sample the light passes through) directly affects the result, which is why standard lab cuvettes have a fixed 1 cm path length. Using a different container without correcting for it will give you misleading numbers.
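
If you do have to measure in a non-standard vessel (a microplate well, for instance), the usual correction is to scale the reading to its 1 cm equivalent, since OD is proportional to path length. A minimal sketch, assuming you know the actual path length of the liquid:

```python
def normalize_to_1cm(measured_od: float, path_length_cm: float) -> float:
    """Scale an OD reading to the standard 1 cm path length (OD is proportional to path length)."""
    return measured_od / path_length_cm

# Hypothetical microplate well with about 0.6 cm of liquid depth:
print(f"1 cm equivalent OD: {normalize_to_1cm(0.30, 0.6):.2f}")  # -> 0.50
```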

Concentration also matters, but not in the way you might expect. The linear relationship between OD and concentration only holds up to a point. At very high concentrations, the sample absorbs or scatters so much light that the detector can’t reliably distinguish small differences. Most spectrophotometers give reliable results in the OD range of roughly 0.1 to 1.0. Beyond that, samples typically need to be diluted before measurement.
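
In practice that means diluting concentrated samples until the reading falls inside the reliable window, then multiplying back by the dilution factor. A small sketch of that bookkeeping, using the rough 0.1 to 1.0 window mentioned above:

```python
def estimate_true_od(measured_od: float, dilution_factor: float,
                     reliable_range=(0.1, 1.0)) -> float:
    """Back-calculate the undiluted OD from a reading taken on a diluted sample."""
    low, high = reliable_range
    if not (low <= measured_od <= high):
        raise ValueError(f"Reading {measured_od} is outside the reliable range {reliable_range}; "
                         "adjust the dilution and re-measure.")
    return measured_od * dilution_factor

# A 1:10 dilution read at OD 0.45 implies the undiluted culture is around OD 4.5.
print(estimate_true_od(0.45, 10))
```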

For imaging-based OD measurements (used in pathology to quantify staining in tissue samples), precision is harder to achieve. Uneven illumination across a microscope field, optical artifacts, and the three-dimensional structure of cells all introduce error. Careful correction for these factors can bring measurement variability down to about 5%, compared to roughly 3.5% for flow cytometry analyzing the same specimen.

Common Wavelengths and Their Uses

The wavelength of light you choose depends entirely on what you’re trying to measure. Each application has a sweet spot:

  • 600 to 700 nm for bacterial cultures, where you want to minimize interference from the growth medium and measure scattering from cells.
  • 680 nm for green microalgae cultures, because this wavelength coincides with the absorption peak of chlorophyll, giving a strong signal tied to algal biomass.
  • 260 nm (ultraviolet) for DNA quantification, since nucleic acids absorb UV light strongly at this wavelength.
  • 280 nm for protein concentration, where aromatic amino acids in proteins absorb UV light.

Choosing the right wavelength maximizes the sensitivity of your measurement and minimizes background noise from other substances in the sample. A wavelength that works perfectly for bacteria would be a poor choice for measuring DNA concentration, and vice versa.

The OD Scale

Optical density is unitless. It’s a pure ratio expressed on a logarithmic scale. An OD of 0 means all the light passes through (a perfectly clear sample). An OD of 1 means 90% of the light is blocked. An OD of 2 means 99% is blocked. An OD of 3 means 99.9% is blocked.

In practice, you rarely see values above 2.5 or so, because at that point so little light reaches the detector that measurement noise overwhelms the signal. In radiography, the practical range of densities runs from about 0.25 up to about 2.5. In biology and chemistry, most useful measurements fall between 0.1 and 2.0, with the most reliable readings in the lower half of that range.