A spectrophotometer is a scientific instrument that measures the intensity of light as it passes through a sample solution. By passing a specific wavelength of light through a liquid, the instrument detects how much of that light is absorbed or transmitted by the chemical components within the sample. Its core function is to provide a precise, objective measurement of a substance’s interaction with light, which relates directly to the substance’s concentration in the solution.
Understanding Absorbance and Transmittance
The spectrophotometer presents its primary readings as transmittance and absorbance. Transmittance (T) represents the fraction or percentage of the original light that successfully passes through the sample and reaches the detector. A reading of 100% Transmittance means all the light passed through, while 0% Transmittance means all the light was blocked or absorbed by the solution.
Absorbance (A) is the preferred unit for most quantitative scientific analysis because of its proportional relationship to concentration. Absorbance measures the amount of light the sample has stopped or absorbed, and it is mathematically derived from the transmittance reading as A = -log10(T), where T is expressed as a fraction. The relationship between the two is therefore inverse and logarithmic: as absorbed light increases, transmitted light decreases.
A solution with high Absorbance, such as 2.0, corresponds to a low Transmittance of only 1% of the original light. Conversely, an Absorbance of 0.0 means 100% of the light was transmitted.
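As a minimal sketch of this conversion, the short Python snippet below implements the formula above and verifies the two examples just given. The helper function names are illustrative, not part of any instrument’s software:

```python
import math

def absorbance_to_percent_T(absorbance):
    """Convert absorbance to percent transmittance: %T = 100 * 10^(-A)."""
    return 100 * 10 ** (-absorbance)

def percent_T_to_absorbance(percent_T):
    """Convert percent transmittance to absorbance: A = -log10(%T / 100)."""
    return -math.log10(percent_T / 100)

# Verify the examples from the text:
print(absorbance_to_percent_T(2.0))  # 1.0   -> A = 2.0 corresponds to 1% T
print(absorbance_to_percent_T(0.0))  # 100.0 -> A = 0.0 corresponds to 100% T
print(percent_T_to_absorbance(50))   # ~0.301 -> half the light transmitted
```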
Preparing the Spectrophotometer (The Blank)
The spectrophotometer must be calibrated using a reference sample known as the “blank.” The purpose of the blank is to account for and remove any background light absorption that is not caused by the substance being analyzed. This background noise can come from the solvent used to dissolve the sample, the material of the cuvette holding the liquid, or other reagents mixed into the solution.
The blank solution contains every component of the actual sample except the chemical compound of interest. For example, if a protein is dissolved in a salt buffer, the blank would be the salt buffer alone. This reference solution is placed into the instrument, and the spectrophotometer is then set to zero Absorbance or 100% Transmittance.
This procedural step tells the machine to ignore any light blockage caused by the solvent or container, setting a clean baseline. When the true sample is measured immediately afterward, the resulting Absorbance value is purely a result of the compound being studied. Skipping this blanking step leads to artificially inflated and inaccurate readings.
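If an instrument reports raw absorbance values rather than zeroing itself, the same correction can be applied by hand. The sketch below uses hypothetical readings purely to illustrate the arithmetic that blanking performs automatically:

```python
# Hypothetical raw readings from an instrument that was not blanked first.
blank_absorbance = 0.05   # buffer + cuvette alone
sample_absorbance = 0.48  # buffer + cuvette + compound of interest

# Subtracting the blank isolates the absorbance due to the compound itself,
# which is exactly what setting the blank to zero accomplishes in the machine.
corrected_absorbance = sample_absorbance - blank_absorbance
print(f"Corrected absorbance: {corrected_absorbance:.2f}")  # 0.43
```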
Converting Absorbance into Concentration
The quantitative relationship between a substance’s Absorbance and its concentration is governed by the Beer-Lambert Law, written as A = εlc, where ε is the molar absorptivity of the compound, l is the path length of the light through the cuvette, and c is the concentration. Because ε and l are constant for a given compound and instrument setup, the law dictates that the amount of light absorbed by a solution is directly proportional to the concentration of the light-absorbing molecules within it.
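When ε is already known, for example from published tables, concentration follows directly by rearranging the law. A minimal sketch with illustrative values; the molar absorptivity below is an assumed placeholder, not a measured constant:

```python
def concentration_from_absorbance(absorbance, molar_absorptivity, path_length_cm=1.0):
    """Rearranged Beer-Lambert Law: c = A / (epsilon * l)."""
    return absorbance / (molar_absorptivity * path_length_cm)

# Illustrative numbers: A = 0.50 measured in a standard 1 cm cuvette,
# with an assumed molar absorptivity of 6,220 L/(mol*cm).
c = concentration_from_absorbance(0.50, 6220)
print(f"Concentration: {c:.2e} mol/L")  # ~8.04e-05 mol/L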
To translate a measured Absorbance value into a specific concentration, scientists rely on creating a “standard curve.” This involves preparing a series of solutions, each containing a known concentration of the substance being analyzed. The Absorbance of each of these standard solutions is then measured by the spectrophotometer.
These pairs of data points—known concentration on one axis and measured Absorbance on the other—are plotted on a graph, which should yield a straight line. This line, called the standard or calibration curve, acts as a visual and mathematical reference. Once the curve is established, the Absorbance of an unknown sample is measured, and that value is simply traced over to the standard curve.
By finding where the unknown’s Absorbance meets the line and dropping down to the concentration axis, the actual concentration of the unknown sample can be determined. This process relies on a simple graphical interpretation to quantify the substance in the solution.
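The same standard-curve logic can be carried out numerically instead of graphically: fit a straight line to the known standards, then invert the line for the unknown. A minimal sketch using NumPy’s least-squares polynomial fit; the concentrations and absorbances below are made-up example data, not real measurements:

```python
import numpy as np

# Known standards: concentrations (mg/mL) and their measured absorbances.
concentrations = np.array([0.0, 0.2, 0.4, 0.6, 0.8, 1.0])
absorbances = np.array([0.00, 0.11, 0.22, 0.32, 0.44, 0.54])

# Fit a straight line (degree 1): absorbance = slope * concentration + intercept.
slope, intercept = np.polyfit(concentrations, absorbances, 1)

# Invert the fitted line to find the concentration of an unknown sample.
unknown_absorbance = 0.27
unknown_concentration = (unknown_absorbance - intercept) / slope
print(f"Slope: {slope:.3f}, intercept: {intercept:.3f}")
print(f"Unknown concentration: {unknown_concentration:.3f} mg/mL")
```

Fitting a line to all the standards, rather than reading a single point off a graph, averages out small measurement errors in the individual standards and makes the interpolation reproducible.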

