Optical density (OD) is calculated using a simple logarithmic formula: OD = log₁₀(I₀/I), where I₀ is the intensity of light hitting your sample and I is the intensity of light that passes through it. If you’re working with percent transmittance, the formula becomes OD = 2 − log₁₀(%T). Most spectrophotometers do this math automatically, but understanding the calculation helps you troubleshoot readings and interpret your results correctly.
The Core Formula
Optical density measures how much light a sample blocks. When light passes through a solution, some of it gets absorbed. The more concentrated your sample, the more light it absorbs and the higher the OD value. An OD of 0 means all light passed through. An OD of 1 means only 10% of the light made it through. An OD of 2 means just 1% got through.
The relationship between these values comes from the Beer-Lambert Law:
- OD = log₁₀(I₀ / I), where I₀ is the incident light intensity and I is the transmitted light intensity
- OD = 2 − log₁₀(%T), when you’re starting from a percent transmittance reading
- OD = ε × c × d, when you want to relate absorbance to concentration
In that last version, ε is the molar extinction coefficient (a constant specific to each substance at a given wavelength, in units of M⁻¹cm⁻¹), c is the molar concentration of the substance, and d is the path length of light through the sample in centimeters. A standard cuvette has a nominal path length of 10 mm (1 cm), typically held to a manufacturing tolerance of about ±0.05 mm.
Converting Transmittance to OD
If your instrument gives you a transmittance reading instead of absorbance, the conversion is straightforward. Transmittance (T) is simply the fraction of light that passes through, expressed as a decimal between 0 and 1, or as a percentage between 0% and 100%.
For a decimal transmittance value: OD = −log₁₀(T). For percent transmittance: OD = 2 − log₁₀(%T). So a sample that transmits 50% of the light has an OD of 2 − log₁₀(50) = 2 − 1.699 = 0.301. A sample transmitting 10% gives OD = 2 − log₁₀(10) = 2 − 1 = 1.0.
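The conversions above can be sketched in a few lines of Python (the function names are my own, for illustration):

```python
import math

def od_from_intensity(i0, i):
    """OD = log10(I0 / I): incident vs. transmitted light intensity."""
    return math.log10(i0 / i)

def od_from_transmittance(t):
    """OD = -log10(T) for a decimal transmittance between 0 and 1."""
    return -math.log10(t)

def od_from_percent_t(percent_t):
    """OD = 2 - log10(%T) for a percent transmittance reading."""
    return 2 - math.log10(percent_t)

# The worked examples from the text:
print(round(od_from_percent_t(50), 3))  # 0.301
print(od_from_percent_t(10))            # 1.0
print(od_from_intensity(100, 10))       # 1.0
```

The two transmittance forms are the same calculation: since %T = 100 × T, the formula 2 − log₁₀(%T) is just −log₁₀(T) with the factor of 100 pulled out as the 2.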
Setting Up a Blank
Every OD measurement requires a blank, which is your solvent or buffer without the substance you’re measuring. The blank accounts for any light that gets absorbed or scattered by the cuvette walls, the solvent itself, or other components in your solution. You’re zeroing out everything except the thing you care about.
The process works the same across spectrophotometer models. First, set your desired wavelength. Then fill a clean cuvette about three-quarters full with your blank solution (solvent only) and place it in the sample compartment. Adjust the instrument to read 100% transmittance or 0.00 absorbance. Remove the blank and insert your sample cuvette. The reading you get now reflects only the absorbance of your target substance. Every time you change the wavelength, you need to re-blank the instrument, because the solvent and cuvette absorb different amounts of light at different wavelengths.
Fingerprints on the cuvette will scatter light and inflate your OD reading, so always handle cuvettes by their frosted sides or top edges.
Calculating DNA and RNA Concentration
One of the most common OD calculations in molecular biology uses absorbance at 260 nm (OD260) to determine nucleic acid concentration. The conversion factors are well established:
- Double-stranded DNA: 1 OD260 unit = 50 µg/mL
- Single-stranded DNA: 1 OD260 unit = 33 µg/mL
- Single-stranded RNA: 1 OD260 unit = 40 µg/mL
So if your dsDNA sample reads an OD260 of 0.35, the concentration is 0.35 × 50 = 17.5 µg/mL. These conversion factors assume a 1 cm path length, so if you’re using a microplate reader or a micro-volume instrument with a different path length, you’ll need to correct for that before multiplying.
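As a sketch, with the conversion factors from the list above and an optional path length correction (the helper name and dictionary are my own):

```python
# Conversion factors quoted above (µg/mL per OD260 unit, 1 cm path).
FACTORS_UG_PER_ML = {"dsDNA": 50.0, "ssDNA": 33.0, "ssRNA": 40.0}

def nucleic_acid_conc(od260, kind="dsDNA", path_cm=1.0):
    """Concentration in µg/mL, normalizing the reading to a 1 cm path."""
    return (od260 / path_cm) * FACTORS_UG_PER_ML[kind]

print(nucleic_acid_conc(0.35))  # 17.5, the dsDNA example above

# The same sample read over a shorter 0.56 cm path would show a
# proportionally lower raw OD (~0.196) but yield the same answer:
print(round(nucleic_acid_conc(0.196, path_cm=0.56), 3))  # 17.5
```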
Purity Ratios
OD readings at multiple wavelengths tell you whether your sample is clean. The OD260/OD280 ratio is the standard purity check. Pure DNA gives a ratio of approximately 1.8. Pure RNA runs slightly higher, around 2.0. If the ratio drops to 1.6 or below, your sample likely contains protein, phenol, or other contaminants that absorb more strongly at 280 nm.
A secondary check is the OD260/OD230 ratio. Pure DNA falls in the range of 2.0 to 2.2. Low values here point to contamination from salts, carbohydrates, EDTA, or residual chemicals left over from the extraction process. Checking both ratios together gives you a much more complete picture of sample quality than either one alone.
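A quick sketch of both checks together (thresholds taken from the text; the function name and example numbers are mine):

```python
def purity_check(od260, od280, od230):
    """Return both purity ratios plus any contamination flags."""
    r280 = od260 / od280
    r230 = od260 / od230
    flags = []
    if r280 <= 1.6:
        flags.append("260/280 low: possible protein or phenol contamination")
    if r230 < 2.0:
        flags.append("260/230 low: possible salt, carbohydrate, or EDTA carryover")
    return r280, r230, flags

# A clean-looking DNA prep: 260/280 = 1.8, 260/230 ≈ 2.09, no flags.
r280, r230, flags = purity_check(od260=0.90, od280=0.50, od230=0.43)
print(round(r280, 2), round(r230, 2), flags)
```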
Estimating Bacterial Cell Density With OD600
In microbiology, OD measured at 600 nm is the standard way to track bacterial growth. The wavelength is chosen because most bacterial culture media don’t absorb much light at 600 nm, so the reading reflects light scattering by the cells themselves rather than absorbance by the broth.
A common rough conversion for E. coli is that an OD600 of 1.0 corresponds to roughly 1 × 10⁹ cells per mL (calibration studies in standard 96-well plates put the actual value closer to 1.6 × 10⁹). But this number varies significantly between species, because it depends on how much light each cell scatters. Smaller bacteria like S. epidermidis pack more cells into the same OD: about 3.4 × 10¹⁰ cells/mL at OD600 = 1.0. Larger species like P. putida give roughly 5 × 10⁸ cells/mL at the same reading, because each bigger cell scatters more light.
The relationship between OD and cell count is only truly linear at low densities, typically below an OD of about 0.1. Above that, the curve bends because cells start shading each other from the light beam. If you need accurate counts at higher densities, either dilute your sample into the linear range before reading, or use a calibration curve specific to your organism and instrument. Research has shown that a fourth-degree polynomial fits the OD-to-cell-count relationship far better than a simple linear equation. The calibration is instrument-specific, so a curve built on one plate reader won’t necessarily be accurate on another.
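A minimal sketch of the linear estimate, using the per-species figures quoted above (the range check, function name, and dilution handling are my own additions):

```python
# Approximate cells/mL at OD600 = 1.0, from the figures quoted above.
CELLS_AT_OD1 = {
    "E. coli": 1.0e9,
    "S. epidermidis": 3.4e10,
    "P. putida": 5.0e8,
}

def cells_per_ml(od600, species="E. coli", dilution=1.0):
    """Linear estimate, valid only when the *measured* OD sits in the
    linear range (roughly below 0.1); scale back up by the dilution."""
    if od600 > 0.1:
        raise ValueError("dilute into the linear range (OD600 < ~0.1) "
                         "or use an organism- and instrument-specific "
                         "calibration curve")
    return od600 * CELLS_AT_OD1[species] * dilution

# A culture diluted 1:20 that reads OD600 = 0.05, i.e. ~1e9 cells/mL:
print(cells_per_ml(0.05, dilution=20))
```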
Path Length Correction for Microplates
Standard cuvettes have a fixed 1 cm path length, but microplate wells do not. The path length in a microplate depends on the volume of liquid in the well, which means 200 µL and 100 µL of the same sample will give different OD readings in the same plate. Raw absorbance values from a microplate cannot be plugged directly into Beer-Lambert calculations without correcting for this.
The correction works by measuring the actual liquid depth in each well. Many plate readers do this automatically by reading the absorbance of the assay buffer at two near-infrared wavelengths (975 nm and 900 nm). The difference between these readings, compared against the same measurement in a standard 1 cm cuvette (called the K-factor), gives the true path length of liquid in each well. The software then scales the raw absorbance up or down so it matches what a 1 cm cuvette would have produced.
After path length correction, the absorbance values are independent of assay volume. Without it, adding more or less liquid to a well changes the apparent concentration even though the actual concentration hasn’t changed. If your plate reader doesn’t have automatic path length correction, you can calculate it manually by dividing the raw absorbance by the estimated path length in centimeters. For a standard flat-bottom 96-well plate, 200 µL of liquid gives a path length of roughly 0.56 cm.
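Assuming you can read each well at the two near-IR wavelengths, the correction might look like the following sketch (function names and example numbers are mine, not any plate reader's API):

```python
def corrected_absorbance(raw_abs, a975, a900, k_factor):
    """Scale a raw microplate reading to its 1 cm cuvette equivalent.
    (a975 - a900) is the water-peak difference in the well; k_factor
    is the same difference measured over a 1 cm path."""
    path_cm = (a975 - a900) / k_factor
    return raw_abs / path_cm

def corrected_by_volume(raw_abs, path_cm=0.56):
    """Manual fallback: divide by an estimated path length, e.g.
    roughly 0.56 cm for 200 µL in a flat-bottom 96-well plate."""
    return raw_abs / path_cm

# A well whose water-peak difference is 0.28 against a cuvette
# K-factor of 0.50 has a 0.56 cm path, so a raw reading of 0.28
# scales up to the 0.50 a 1 cm cuvette would have shown:
print(round(corrected_absorbance(0.28, a975=0.50, a900=0.22, k_factor=0.50), 2))
```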
Rearranging the Formula for Concentration
In most practical situations, you already have an OD reading and you need to find the concentration. Rearranging the Beer-Lambert equation gives you: c = OD / (ε × d). With a 1 cm cuvette, this simplifies to c = OD / ε. You just need the extinction coefficient for your substance at the wavelength you measured, which is available in published reference tables for thousands of compounds.
This only works within the linear range of your instrument. Most spectrophotometers are reliable between OD values of 0.1 and 1.0. Below 0.1, noise in the detector becomes a significant fraction of the reading. Above 1.0, so little light reaches the detector that small measurement errors get amplified. Above 2.0, readings become unreliable for most instruments. If your sample reads too high, dilute it, re-measure, and multiply the resulting concentration by your dilution factor.
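Putting the pieces together, a concentration helper might look like this sketch (the range warning mirrors the limits above; the function name, dilution handling, and example ε are illustrative):

```python
def concentration(od, epsilon, path_cm=1.0, dilution=1.0):
    """c = OD / (ε · d), in mol/L when ε is in M⁻¹cm⁻¹ and d in cm.
    Multiply by the dilution factor if the sample was diluted."""
    if not 0.1 <= od <= 1.0:
        print("warning: OD outside the reliable 0.1-1.0 range; "
              "consider diluting and re-measuring")
    return od / (epsilon * path_cm) * dilution

# e.g. a reading of 0.622 with ε = 6,220 M⁻¹cm⁻¹ in a 1 cm cuvette:
print(concentration(0.622, 6220))  # ≈ 1e-4 M, i.e. 100 µM
```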

