What Is the A260/A280 Ratio and What Does It Mean?

The A260/A280 ratio is a standard measure used in molecular biology laboratories to quickly assess the purity of a sample, typically DNA or RNA. This single number, generated by a spectrophotometer, provides an immediate indication of how free a nucleic acid preparation is of common contaminants. By comparing the amount of light absorbed at two specific wavelengths, scientists can estimate the quality of a genetic sample before proceeding with sensitive experiments.

Understanding the Components of the Measurement

The ratio is calculated from two distinct absorbance readings, corresponding to the wavelengths where different molecules absorb ultraviolet light most strongly. The A260 refers to the absorbance reading taken at 260 nanometers (nm). This specific wavelength is the peak absorption point for nucleic acids, including both DNA and RNA, due to the structure of the nucleotide bases. The A260 value is primarily used to determine the concentration of the genetic material in the sample.
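
As an illustration, the sketch below converts a blank-corrected A260 reading into an approximate concentration using the widely cited conversion factors (about 50 ng/µL per absorbance unit for double-stranded DNA at a 10 mm path length). The function name, dictionary, and example values are hypothetical, not part of any instrument's API.

```python
# Minimal sketch: estimating nucleic acid concentration from an A260 reading.
# Conversion factors are the common approximations for a 10 mm path length;
# names and values here are illustrative only.

# Approximate concentration (ng/uL) per 1.0 absorbance unit at 260 nm
CONVERSION_FACTORS = {
    "dsDNA": 50.0,
    "ssDNA": 33.0,
    "RNA": 40.0,
}

def concentration_ng_per_ul(a260: float, sample_type: str = "dsDNA",
                            dilution_factor: float = 1.0) -> float:
    """Estimate concentration from a blank-corrected A260 reading."""
    return a260 * CONVERSION_FACTORS[sample_type] * dilution_factor

# Example: A260 of 0.5 for undiluted dsDNA -> ~25 ng/uL
print(concentration_ng_per_ul(0.5, "dsDNA"))  # 25.0
```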

The A280 is the absorbance reading at 280 nm, which indicates the presence of common protein contaminants. Aromatic amino acids, such as tryptophan and tyrosine, absorb UV light strongly at this wavelength. Phenol, a chemical used in some nucleic acid extraction methods, also exhibits strong absorption near 280 nm.

The A260/A280 ratio compares the signal from the desired molecule (nucleic acid at 260 nm) to the signal from contaminants (protein and phenol at 280 nm). A pure nucleic acid sample should have a high ratio because the A260 reading will be significantly greater than the A280 reading. Any factor that increases the A280 value relative to A260 suggests contamination.
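
A minimal sketch of the ratio calculation itself, assuming blank-corrected readings are available; the function and its guard against a non-positive denominator are illustrative, not drawn from any particular instrument's software.

```python
def a260_a280_ratio(a260: float, a280: float, blank_260: float = 0.0,
                    blank_280: float = 0.0) -> float:
    """Compute the purity ratio from blank-corrected absorbance readings."""
    corrected_260 = a260 - blank_260
    corrected_280 = a280 - blank_280
    if corrected_280 <= 0:
        raise ValueError("A280 must be positive after blank correction")
    return corrected_260 / corrected_280

# A clean dsDNA prep: the A260 signal dominates the A280 signal
print(round(a260_a280_ratio(1.0, 0.55), 2))  # 1.82
```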

Interpreting the Numerical Ratio

The numerical value of the A260/A280 ratio indicates sample purity, though the ideal number varies between DNA and RNA. For purified double-stranded DNA, the expected ratio typically falls around 1.8. A pure RNA sample is expected to show a slightly higher ratio, usually around 2.0 to 2.1.

A ratio lower than the ideal value often signals protein or phenol contamination. For instance, a DNA sample ratio of 1.5 suggests that protein is absorbing light at 280 nm, increasing the ratio’s denominator. This low ratio indicates the sample is not sufficiently pure for many molecular biology applications.

Conversely, a ratio higher than the ideal, such as 2.3 for a DNA sample, also indicates a purity problem. This high value may be caused by residual chaotropic salts or an unusually high pH in the sample buffer. Residual RNA can also result in a higher A260/A280 reading, as RNA naturally has a higher ratio than DNA.
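
These interpretation rules can be summarized in a small helper. The thresholds below are illustrative approximations of the typical targets described above (about 1.8 for DNA, 2.0 to 2.1 for RNA); acceptable ranges vary by laboratory and application, so treat this as a sketch rather than a validated QC rule.

```python
# Illustrative thresholds only; acceptable ranges vary by lab and application.
TARGET_RATIOS = {"dsDNA": (1.7, 2.0), "RNA": (2.0, 2.2)}

def assess_purity(ratio: float, sample_type: str = "dsDNA") -> str:
    """Map a measured A260/A280 ratio onto a rough purity verdict."""
    low, high = TARGET_RATIOS[sample_type]
    if ratio < low:
        return "low ratio: possible protein or phenol contamination"
    if ratio > high:
        return "high ratio: possible RNA carryover, residual salts, or basic pH"
    return "ratio within the expected range"

print(assess_purity(1.5, "dsDNA"))  # low ratio: possible protein or phenol contamination
print(assess_purity(2.3, "dsDNA"))  # high ratio: possible RNA carryover, ...
```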

Why Nucleic Acid Purity Is Essential

The purity of a nucleic acid sample, indicated by the A260/A280 ratio, determines the success of subsequent molecular biology experiments. Contaminants like proteins or organic solvents interfere with the enzymes used in downstream reactions. For example, residual proteins can act as inhibitors, preventing enzymes like DNA polymerase from properly replicating the template DNA in a Polymerase Chain Reaction (PCR).

A poor ratio often leads to inaccurate quantification of the genetic material, which can cause significant experimental errors. If contaminants artificially inflate the A260 reading, a scientist may overestimate the true DNA concentration, leading to too little sample being used in the next step. This under-loading can result in faint or failed signals in sequencing or cloning experiments.
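
A worked example with hypothetical numbers shows how an inflated A260 propagates into under-loading: overestimating the concentration means pipetting too small a volume for the target mass.

```python
# Worked example (hypothetical numbers): how an inflated A260 leads to under-loading.
true_conc = 40.0       # actual DNA concentration, ng/uL
measured_conc = 60.0   # concentration inferred from a contaminant-inflated A260
target_mass = 200.0    # ng of DNA the protocol calls for

volume_pipetted = target_mass / measured_conc     # volume chosen from the bad estimate
actual_mass_loaded = volume_pipetted * true_conc  # DNA actually delivered

print(f"Pipetted {volume_pipetted:.2f} uL, loading only {actual_mass_loaded:.0f} ng "
      f"of the intended {target_mass:.0f} ng")
# Pipetted 3.33 uL, loading only 133 ng of the intended 200 ng
```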

Inhibitory substances affect the efficiency of enzymatic processes, such as reverse transcription, which converts RNA into complementary DNA. Failure to achieve an acceptable purity ratio leads to unreliable or uninterpretable results, wasting time and costly reagents. Ensuring high purity is a quality control step before moving to sensitive applications like gene expression analysis or next-generation sequencing.

Factors That Influence the Ratio

Several factors can cause the measured A260/A280 ratio to deviate from the expected range. Residual chemicals from the nucleic acid extraction process, such as phenol, ethanol, or guanidine salts, are common contaminants that skew the readings. Phenol absorbs strongly near 280 nm and directly lowers the ratio, while residual guanidine salts can also distort absorbance readings.

Solution pH

The pH of the solution in which the nucleic acid is dissolved exerts a substantial influence on the ratio measurement. An acidic solution, such as unbuffered water, can lower the measured ratio by as much as 0.2 to 0.3 units, making a pure sample appear contaminated. Conversely, a basic solution can artificially increase the reading.

Sample Dilution and Turbidity

The presence of particulate matter or turbidity in the sample scatters light and inflates absorbance readings across the entire spectrum. Furthermore, measuring extremely dilute samples, typically below 10 ng/µL, results in inaccurate ratio values as the instrument's detection limit is approached. For accurate results, the sample should be measured in a buffered solution with a stable, slightly basic pH.
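
These caveats can be captured in a simple pre-measurement check. The 10 ng/µL floor comes from the text above; the flags, threshold constant, and function shape are assumptions for illustration, not a standard from any instrument vendor.

```python
# Sketch of a pre-measurement sanity check, assuming a ~10 ng/uL reliability floor
# as described above; the warnings and structure are illustrative.

MIN_RELIABLE_CONC = 10.0  # ng/uL; ratios below this become unreliable

def ratio_is_trustworthy(concentration: float, buffered: bool,
                         turbid: bool) -> tuple[bool, list[str]]:
    """Flag conditions under which the A260/A280 ratio should not be trusted."""
    warnings = []
    if concentration < MIN_RELIABLE_CONC:
        warnings.append("sample too dilute for a reliable ratio")
    if not buffered:
        warnings.append("unbuffered solvent: pH may skew the ratio by 0.2-0.3")
    if turbid:
        warnings.append("turbidity inflates absorbance across the spectrum")
    return (not warnings, warnings)

ok, notes = ratio_is_trustworthy(concentration=8.0, buffered=False, turbid=False)
print(ok, notes)
# False ['sample too dilute for a reliable ratio', 'unbuffered solvent: ...']
```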