Particle size is measured using techniques that range from passing material through stacked sieves to tracking how particles scatter laser light. The best method depends on the size range you’re working with: sieve analysis handles particles down to about 37 micrometers, laser diffraction covers 0.2 to 2,000 micrometers, and electron microscopy can resolve features as small as 0.1 nanometers. Each technique measures a slightly different property, so understanding what you’re actually measuring matters as much as the number you get back.
Sieve Analysis for Coarse Particles
Sieve analysis is the simplest and oldest approach. You stack a series of wire mesh screens with progressively smaller openings, place your sample on top, and shake. Particles fall through each screen until they reach one with openings too small to pass. What remains on each sieve tells you the fraction of your sample within that size range.
Sieve sizes follow standardized mesh numbers. For fine sieves, the mesh number is the number of openings per linear inch: a No. 325 mesh sieve has openings of 44 micrometers, and a No. 400 mesh (the finest commonly available) has openings of 37 micrometers. Coarser sieves are designated instead by the size of their opening in inches. This method works well for dry powders, granules, sand, and similar materials, but it cannot characterize anything finer than about 37 micrometers. It also measures only one dimension of each particle, since an elongated particle can slip through a sieve opening that’s smaller than its longest axis.
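To show how the retained fractions become a size distribution, here is a minimal sketch of the bookkeeping: mass retained on each sieve is converted to percent retained and cumulative percent passing. The sieve openings and masses are illustrative, not measured data.

```python
# Hypothetical sieve-analysis calculation. Keys are sieve openings in µm
# (coarsest first, with the collection pan as opening 0); values are grams
# of sample retained on that sieve.

def sieve_results(retained_g):
    """Return (opening, % retained, cumulative % passing) per sieve."""
    total = sum(retained_g.values())
    passing = 100.0
    rows = []
    for opening, mass in retained_g.items():
        pct = 100.0 * mass / total          # fraction caught on this sieve
        passing -= pct                      # fraction finer than this opening
        rows.append((opening, pct, passing))
    return rows

sample = {500: 5.0, 250: 20.0, 125: 50.0, 63: 20.0, 0: 5.0}  # grams
for opening, pct, passing in sieve_results(sample):
    print(f"{opening:>4} µm: {pct:5.1f}% retained, {passing:5.1f}% passing")
```

The cumulative "percent passing" column is what gets plotted as the familiar S-shaped distribution curve.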
Laser Diffraction for a Wide Size Range
Laser diffraction is the most widely used technique in industrial and pharmaceutical settings. A laser beam passes through a cloud of dispersed particles, and the particles scatter light at angles that depend on their size. Smaller particles scatter light at wider angles; larger particles scatter at narrower angles. Detectors arranged around the sample capture the scattered light pattern, and software converts that pattern into a size distribution.
The math behind this conversion relies on one of two optical models. For particles larger than roughly 10 micrometers (more than ten times a typical laser wavelength), the Fraunhofer approximation works well and doesn’t require any knowledge of the particle’s optical properties. For smaller particles, instruments apply Mie scattering theory, which accounts for how light refracts through and gets absorbed by the particle. Mie theory is more accurate but requires you to know the refractive index of both the particles and the medium they’re dispersed in. Instruments like the Malvern Mastersizer series use Mie theory to cover particle sizes from 0.2 to 2,000 micrometers in a single measurement. ISO 13320, the international standard governing laser diffraction methods (most recently updated in 2020), sets out requirements for repeatability and reproducibility to ensure consistent results across instruments and operators.
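The model choice above can be sketched as a simple rule-of-thumb check. The 10-micrometer cutoff follows the text, but the function itself is hypothetical, not an instrument API, and real software applies more nuanced criteria.

```python
# Rule-of-thumb optical model selection for laser diffraction.
# The 10 µm cutoff is the rough threshold from the text; below it,
# Mie theory needs the particle's refractive index to be supplied.

def choose_model(diameter_um, refractive_index=None):
    if diameter_um >= 10.0:
        # Coarse particles: Fraunhofer needs no optical properties.
        return "Fraunhofer"
    if refractive_index is None:
        raise ValueError("Mie theory needs the particle refractive index")
    return "Mie"

print(choose_model(50))                          # Fraunhofer
print(choose_model(1.0, refractive_index=1.59))  # Mie (e.g. polystyrene)
```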
Dynamic Light Scattering for Submicron Particles
Dynamic light scattering (DLS) measures particles too small for laser diffraction to handle reliably, typically in the nanometer to low-micrometer range. Instead of measuring how a laser beam bends around particles, DLS monitors the intensity fluctuations of scattered light caused by Brownian motion, the random jiggling of tiny particles as they’re bumped around by molecules in the surrounding liquid.
Smaller particles move faster, and larger particles move slower. The instrument measures these speed differences as a diffusion coefficient and uses the Stokes-Einstein equation to convert it into a hydrodynamic diameter. This calculation requires knowing the temperature and the viscosity of the liquid, both of which are straightforward to control or measure. DLS handles a broad concentration range (roughly 10⁸ to 10¹² particles per milliliter), making sample preparation relatively flexible. The trade-off is that DLS is intensity-weighted, meaning a small number of large particles can dominate the signal. Adding even a tiny amount of 1,000-nanometer particles to a sample can shift the average size reading by about 40 nanometers.
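The Stokes-Einstein conversion is short enough to show directly. This sketch computes a hydrodynamic diameter from a diffusion coefficient; the example diffusion value is illustrative.

```python
import math

# Stokes-Einstein: d_H = k_B * T / (3 * pi * viscosity * D)

K_B = 1.380649e-23  # Boltzmann constant, J/K

def hydrodynamic_diameter(D, T=298.15, viscosity=8.9e-4):
    """D: diffusion coefficient in m²/s; T in kelvin;
    viscosity in Pa·s (water at 25 °C is about 8.9e-4)."""
    return K_B * T / (3.0 * math.pi * viscosity * D)

# A particle diffusing at ~4.9e-12 m²/s in room-temperature water:
d = hydrodynamic_diameter(4.9e-12)
print(f"{d * 1e9:.0f} nm")  # ~100 nm
```

Note that this yields the hydrodynamic diameter, which includes any hydration layer or surface coating moving with the particle, so it is typically slightly larger than the core size seen by microscopy.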
Nanoparticle Tracking Analysis
Nanoparticle tracking analysis (NTA) takes a different approach to the same Brownian-motion principle. Instead of averaging scattered light across the whole sample, NTA uses a camera to track individual particles moving through a thin illuminated volume. Software follows each particle’s path, calculates its speed, and assigns it a size. Because each particle is measured individually, NTA produces high-resolution size distributions that can distinguish between populations of similar-sized particles.
This individual-tracking approach gives NTA a major advantage over DLS for polydisperse samples, those containing a broad mix of sizes. Where DLS would blur two nearby peaks into one, NTA can resolve them clearly. NTA is also far less sensitive to contamination by a few large particles, since those particles are counted individually rather than allowed to dominate the signal. The limitation is concentration. NTA requires between 10⁷ and 10⁹ particles per milliliter, a much narrower window than the broad range DLS tolerates. Samples outside that range need dilution or concentration before measurement.
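A common pre-measurement step is checking whether a sample falls inside the NTA window and, if not, how much to dilute it. The 10⁷–10⁹ particles/mL window comes from the text; the helper function and target value are assumptions for illustration.

```python
# Dilution check for NTA sample preparation. Concentrations are
# particles per milliliter; the target of 1e8 is an assumed midpoint.

def nta_dilution_factor(conc_per_ml, target=1e8, low=1e7, high=1e9):
    """Return the dilution factor needed before measurement,
    or 1.0 if the sample is already measurable as-is."""
    if low <= conc_per_ml <= high:
        return 1.0
    if conc_per_ml > high:
        return conc_per_ml / target
    raise ValueError("sample too dilute; concentrate before measuring")

print(nta_dilution_factor(5e10))  # 500x dilution brings 5e10 down to 1e8
print(nta_dilution_factor(2e8))   # already in range, no dilution needed
```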
Electron Microscopy for Direct Imaging
When you need to see the actual shape and size of individual particles, electron microscopy is the gold standard. Two types dominate this space. Scanning electron microscopy (SEM) bounces a focused electron beam off the surface of a sample, producing detailed three-dimensional images of surface features down to about 10 to 30 nanometers. Transmission electron microscopy (TEM) sends electrons through an ultra-thin sample and can resolve features as small as 0.1 nanometers, enough to see individual atomic lattice planes.
The obvious advantage is that you’re looking at real particles, not inferring size from how they scatter light or how fast they move. You can see shape, surface texture, and internal structure. The disadvantages are equally clear: sample preparation is time-consuming (especially for TEM, which requires samples thin enough for electrons to pass through), imaging is slow, and you typically measure only hundreds or thousands of particles rather than millions. This makes electron microscopy excellent for validation and quality control but impractical for routine, high-throughput sizing.
Interpreting Size Distribution Results
No matter which technique you use, particle size results are almost always reported as a distribution rather than a single number. Real-world samples are polydisperse, containing particles of many different sizes. Three values capture the essentials of any distribution: D10, D50, and D90. The D10 value is the size below which 10% of your sample falls. D50 (also called the median) splits the distribution in half. D90 is the size below which 90% of your sample falls. So a report reading D10 = 83 µm, D50 = 330 µm, D90 = 1,600 µm tells you that the middle of the distribution sits at 330 micrometers, the fine fraction extends below 83 micrometers, and the coarse tail reaches up past 1,600 micrometers.
To describe how wide or narrow the distribution is, you can calculate the span: (D90 minus D10) divided by D50. A small span means most particles are similar in size. A large span means the sample contains a broad range. For the example above, the span is (1,600 – 83) / 330 = 4.6, which represents a very wide distribution. Tight distributions with spans below 1.0 are typical for well-controlled manufacturing processes.
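The D-value and span calculations above can be sketched from a cumulative distribution using linear interpolation. The (size, cumulative %) data points below are illustrative, chosen to reproduce the worked example.

```python
import bisect

def d_value(sizes_um, cum_pct, target):
    """Interpolate the size at a given cumulative percent undersize.
    sizes_um and cum_pct must be sorted ascending."""
    i = bisect.bisect_left(cum_pct, target)
    x0, x1 = sizes_um[i - 1], sizes_um[i]
    y0, y1 = cum_pct[i - 1], cum_pct[i]
    return x0 + (x1 - x0) * (target - y0) / (y1 - y0)

sizes = [20, 83, 150, 330, 800, 1600, 2500]   # µm
cum   = [2, 10, 25, 50, 75, 90, 99]           # % undersize

d10, d50, d90 = (d_value(sizes, cum, t) for t in (10, 50, 90))
span = (d90 - d10) / d50
print(d10, d50, d90, round(span, 1))  # 83.0 330.0 1600.0 4.6
```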
Sample Stability Affects Your Results
Particles in suspension can clump together (agglomerate) or settle out before you measure them, and either problem will skew your results toward larger sizes. Zeta potential, a measure of the electrical charge on particle surfaces, is a useful predictor of how stable your suspension is. Particles with a zeta potential between 0 and ±5 millivolts will rapidly clump together. Values between ±10 and ±30 millivolts indicate borderline stability. You need ±30 to ±40 millivolts for moderate stability, ±40 to ±60 for good stability, and above ±61 millivolts for excellent stability.
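The stability bands above can be collapsed into a small lookup. Note the published bands leave gaps (e.g. between ±5 and ±10 mV); this sketch assigns each gap to the adjacent band, which is an assumption, not part of the guideline.

```python
# Suspension stability classification from zeta potential (mV),
# following the bands in the text. Sign is irrelevant: only the
# magnitude of the surface charge matters for stability.

def stability(zeta_mv):
    z = abs(zeta_mv)
    if z <= 5:
        return "rapid coagulation"
    if z <= 30:
        return "incipient instability"
    if z <= 40:
        return "moderate stability"
    if z <= 60:
        return "good stability"
    return "excellent stability"

print(stability(-12))  # incipient instability
print(stability(45))   # good stability
```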
If your suspension is unstable, your size measurement reflects agglomerates rather than primary particles. Sonication, surfactants, or pH adjustments can help break up clumps and stabilize the dispersion before measurement. This is why laser diffraction instruments typically include a built-in ultrasonic probe or stirrer in the sample chamber.
Choosing the Right Technique
- Coarse powders and granules (37 µm and up): Sieve analysis is cheap, simple, and requires no specialized equipment beyond calibrated sieves and a shaker.
- Broad range with high throughput (0.2 to 2,000 µm): Laser diffraction is the industry workhorse. Fast, reproducible, and governed by international standards.
- Submicron particles in liquid (1 nm to a few µm): DLS is quick and handles a wide concentration range. Best for samples where you expect a narrow distribution.
- Polydisperse nanoparticles (30 nm to 1 µm): NTA gives better resolution than DLS for mixed populations, but demands tighter concentration control.
- Validation or shape analysis (0.1 nm and up): Electron microscopy provides direct images but is slow and measures limited numbers of particles.
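The decision list above can be mirrored as a rough chooser. The ranges are the approximate working ranges from the text; real selection also depends on sample form, throughput needs, and whether shape information matters, so treat this as a starting point only.

```python
# Rough technique suggestion by target particle size (in µm).
# Ranges follow the bullet list above; overlaps are intentional,
# since several techniques often apply to the same sample.

def suggest_techniques(size_um):
    options = []
    if size_um >= 37:
        options.append("sieve analysis")
    if 0.2 <= size_um <= 2000:
        options.append("laser diffraction")
    if 0.001 <= size_um <= 5:            # ~1 nm to a few µm
        options.append("DLS")
    if 0.03 <= size_um <= 1:             # 30 nm to 1 µm
        options.append("NTA")
    options.append("electron microscopy")  # direct imaging across the range
    return options

print(suggest_techniques(0.1))  # ['DLS', 'NTA', 'electron microscopy']
```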
Keep in mind that different techniques measure different equivalent diameters. Sieve analysis measures a geometric opening size, laser diffraction reports a volume-equivalent sphere diameter, and DLS reports a hydrodynamic diameter that includes any surface coatings or hydration layers. Two techniques applied to the same sample will often produce slightly different numbers, and neither is wrong. They’re simply measuring different physical properties. For this reason, it’s important to specify which technique was used whenever you report a particle size value.

