Defining Absorbance and Transmittance
Spectroscopy uses light to analyze the composition of chemical samples dissolved in a solution. When light passes through a sample, some of it is absorbed or scattered by the molecules present. Scientists measure this interaction using Absorbance and Transmittance.
Absorbance quantifies the amount of light absorbed by the sample and is measured on a logarithmic scale. This measurement is reported without units and increases as less light passes through the sample.
Transmittance is a measure of the light that successfully passes through the sample and reaches the detector. This value can be expressed as a fraction (T), ranging from 0 to 1, or as a percentage (%T), ranging from 0% to 100%.
The relationship is inverse but not proportional: high Absorbance means low Transmittance, and the two are linked logarithmically rather than linearly. Understanding the mathematical connection is necessary to accurately interpret and compare data.
The Formula Connecting Absorbance and Transmittance
The relationship between Absorbance ($A$) and Transmittance ($T$) is logarithmic, not linear. Absorbance is defined as the base 10 logarithm of the ratio of the incident light intensity ($I_0$) to the transmitted light intensity ($I$). The primary formula is $A = -\log_{10}(T)$.
$T$ is expressed as the fraction $I/I_0$, meaning $T$ is always between 0 and 1. Since the logarithm of a number between 0 and 1 is negative, the negative sign ensures Absorbance is reported as a positive value. This positive Absorbance increases as the concentration of molecules in the sample increases.
To find Transmittance from a known Absorbance value, the equation is algebraically rearranged by taking the antilog, that is, raising 10 to the power of $-A$. This yields the formula $T = 10^{-A}$, which is the core relationship for the calculation.
Once the fractional Transmittance ($T$) is determined, the percent Transmittance ($\%T$) is calculated by multiplying $T$ by 100. The full formula is $\%T = T \times 100$.
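The two formulas above can be sketched as a pair of conversion functions. This is a minimal illustration, not part of the original text; the function names are my own.

```python
import math

def absorbance_to_percent_transmittance(a):
    """%T = 10^(-A) * 100, from T = 10^(-A) and %T = T * 100."""
    return 10 ** (-a) * 100

def percent_transmittance_to_absorbance(percent_t):
    """A = -log10(T), where T = %T / 100."""
    return -math.log10(percent_t / 100)
```

The two functions are exact inverses of each other, so converting in one direction and back recovers the original value (up to floating-point rounding).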
Converting Absorbance to Percent Transmittance
The calculation from Absorbance to Percent Transmittance is a two-step process. This conversion is necessary when data needs to be reported in the percentage format. Using a hypothetical Absorbance reading of $A = 0.5$, the conversion begins by determining the fractional Transmittance ($T$).
The first step involves applying the formula, $T = 10^{-A}$. For the example, this means calculating $T = 10^{-0.5}$. Most scientific calculators have a dedicated $10^x$ function for this calculation.
Performing the calculation yields $T \approx 0.316$. This number represents the fraction of light that passed through the sample. The second step is to convert this fraction into $\%T$.
The conversion is performed by multiplying $T$ by 100, following the formula $\%T = T \times 100$. Taking $T=0.316$, the result is a percent Transmittance of $31.6\%$.
A sample with an Absorbance of $0.5$ allows $31.6\%$ of the light to pass through it. Consider a sample with $A = 2.0$. Applying $T = 10^{-2.0}$ yields $T = 0.01$. Multiplying by 100 shows that the percent Transmittance is $1.0\%$. This demonstrates the logarithmic nature of the relationship: each unit increase in Absorbance cuts the transmitted light by a factor of ten.
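The logarithmic compression described above can be seen by tabulating $\%T$ for evenly spaced Absorbance values. A short sketch:

```python
# Equal steps in Absorbance produce factor-of-ten drops in %T
# (from %T = 10^(-A) * 100).
for a in [0.0, 0.5, 1.0, 2.0, 3.0]:
    percent_t = 10 ** (-a) * 100
    print(f"A = {a:.1f}  ->  %T = {percent_t:.1f}%")
```

Running this prints the worked values from the text ($A = 0.5$ gives $31.6\%$, $A = 2.0$ gives $1.0\%$), along with the endpoints $A = 0$ ($100\%$) and $A = 3$ ($0.1\%$).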
Real-World Uses for the Conversion
Although modern spectrophotometers often display both values, understanding the calculation is useful for data comparison and analytical requirements. Absorbance is the preferred measurement in laboratory settings due to its linear relationship with concentration. This linearity, described by the Beer-Lambert Law, allows scientists to correlate an Absorbance reading to the amount of substance present, simplifying quantitative analysis.
The conversion to Percent Transmittance is valuable when reporting data for quality control or visual assessment. A reading of $75\%T$ is more intuitive than an Absorbance reading of $0.125$, especially for non-scientific audiences. For instance, a product specification might require a solution to transmit at least $85\%$ of light, making $\%T$ the standard metric for reporting compliance.
The conversion also helps when comparing data sets derived from older instruments that reported only one type of measurement. Converting all readings to a common format ensures the integrity and comparability of historical data with modern results.