DQE, or detective quantum efficiency, is the single most important measure of how well a digital X-ray detector converts incoming X-ray photons into a useful image. Expressed as a percentage, it describes how much of the information carried by X-rays actually ends up as signal in the final image rather than being lost to noise. A perfect detector would have a DQE of 100%, meaning every X-ray photon contributes perfectly to image quality. Real detectors fall well short of that, with current clinical systems ranging roughly from the low 30s to the mid-60s percent, and performance varying by as much as three- to four-fold between devices.
What DQE Actually Measures
Every X-ray that hits a detector carries information. Some of that information makes it into the image as useful signal, and some gets drowned out by noise. DQE captures this tradeoff in a single number: it’s the ratio of the squared signal-to-noise ratio (SNR) of the output image to that of the incoming beam. Because X-ray photons arrive with Poisson statistics, the input SNR squared equals the number of photons that struck the detector. A higher DQE means the detector is more efficient at turning those photons into a clear, readable image.
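The zero-frequency definition can be sketched in a few lines. This is a minimal illustration, assuming a Poisson-limited input beam (so the input SNR squared equals the photon count); the photon count and output SNR below are made-up numbers, not measurements of any real detector.

```python
# Zero-frequency DQE from photon counts, assuming a Poisson-limited
# input beam: SNR_in^2 = N, the number of incident photons.
# All numbers here are illustrative, not measured detector data.

def dqe_zero_freq(snr_out: float, n_photons: float) -> float:
    """DQE(0) = SNR_out^2 / SNR_in^2 = SNR_out^2 / N for a Poisson beam."""
    return snr_out**2 / n_photons

# 10,000 photons give an ideal input SNR of sqrt(10,000) = 100.
# A detector that delivers an image SNR of 80 therefore has:
print(dqe_zero_freq(snr_out=80.0, n_photons=10_000))  # 0.64, i.e. 64%
```

Note that DQE compares SNRs squared, so an output SNR of 80% of the ideal corresponds to a DQE of 64%, not 80%.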
This matters directly for patient care. A detector with high DQE can produce a diagnostic-quality image with fewer X-rays, which means lower radiation dose to the patient. Conversely, a low-DQE detector wastes much of the radiation it receives, requiring either a higher dose to compensate or accepting a noisier, lower-quality image.
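The dose implication follows directly from the definition: since output SNR squared equals DQE times the photon count, and photon count scales with dose, the dose needed to reach a given image SNR scales inversely with DQE. A hedged sketch, using illustrative DQE values rather than any specific device's specifications:

```python
# Relative dose needed to match a reference detector's image SNR.
# Since SNR_out^2 = DQE * N and N is proportional to dose, matching
# SNR requires dose in inverse proportion to DQE. Values are illustrative.

def relative_dose(dqe_ref: float, dqe_other: float) -> float:
    """Dose multiplier for the second detector to match the first's SNR."""
    return dqe_ref / dqe_other

# A 32% DQE detector needs twice the dose of a 64% DQE detector:
print(relative_dose(dqe_ref=0.64, dqe_other=0.32))  # 2.0
```

This is why the three- to four-fold spread in DQE across clinical systems translates into a comparable spread in the dose required for equivalent image quality.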
Why DQE Changes With Detail Size
DQE isn’t a single fixed number for any detector. It changes depending on the level of detail you’re trying to capture, described in physics terms as “spatial frequency.” At low spatial frequencies (large structures like bones or organs), detectors generally perform at their best. As spatial frequency increases (smaller details like fine bone texture or microcalcifications), DQE drops, often dramatically.
In one study of a mammography flat-panel detector, DQE peaked around 0.65 (65%) at low spatial frequencies but fell to roughly 0.13 to 0.17 (13 to 17%) at the highest frequencies the detector could resolve. This steep decline is driven largely by the detector’s ability to preserve sharpness at fine detail levels, a property called the modulation transfer function (MTF). When sharpness degrades at high frequencies, the useful signal shrinks relative to noise, and DQE drops accordingly.
This is why DQE is typically reported as a curve across spatial frequencies rather than a single value, though the value near zero frequency is often quoted as a shorthand for overall efficiency.
The Three Ingredients of DQE
Physicists calculate DQE from three measurable properties of the detector:
- Modulation transfer function (MTF): How well the detector preserves sharpness. A detector that blurs fine details has a low MTF at high spatial frequencies, which drags down DQE.
- Noise power spectrum (NPS): A map of how noise is distributed across different detail sizes. More noise at any given frequency reduces DQE.
- Input signal-to-noise ratio: The quality of the X-ray beam hitting the detector, determined by the number of photons and their energy.
DQE essentially asks: given the quality of the X-ray beam coming in, how much of that quality survives through the detector to appear in the image? A detector with excellent sharpness (high MTF) and low noise (low NPS) will have a high DQE.
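These three ingredients combine in the standard frequency-dependent relation: DQE(f) = MTF(f)² / (q · NNPS(f)), where q is the incident photon fluence and NNPS is the noise power spectrum normalized by the mean signal squared. The sketch below assumes that normalized form; the MTF shape, NNPS level, and fluence are toy values chosen only to produce a plausible-looking curve, not measured data.

```python
import numpy as np

# Frequency-dependent DQE from the three measured ingredients,
# assuming a signal-normalized NPS (NNPS = NPS / mean_signal^2, mm^2)
# and photon fluence q (photons/mm^2). All inputs are toy values.

def dqe(mtf: np.ndarray, nnps: np.ndarray, fluence: float) -> np.ndarray:
    """DQE(f) = MTF(f)^2 / (q * NNPS(f))."""
    return mtf**2 / (fluence * nnps)

f = np.linspace(0.05, 5.0, 100)      # spatial frequency, lp/mm
mtf = np.exp(-f / 3.0)               # toy MTF falling with frequency
nnps = np.full_like(f, 6.25e-6)      # toy flat ("white") NNPS, mm^2
q = 2.5e5                            # toy fluence, photons/mm^2

d = dqe(mtf, nnps, q)
print(f"DQE near zero frequency: {d[0]:.2f}")
# With a flat NNPS, the curve simply tracks MTF^2 downward, which is
# why a sagging MTF drags DQE down steeply at high frequencies.
```

Because the MTF enters squared, a detector whose sharpness falls to half at some frequency loses three quarters of its DQE there, all else being equal.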
Direct vs. Indirect Detectors
Modern digital X-ray detectors come in two main designs, and their DQE performance differs in important ways.
Direct conversion detectors use a material (typically amorphous selenium) that converts X-rays straight into electrical charge. This produces very sharp images because there’s no intermediate step to blur the signal. Studies show direct detectors achieve MTF values very close to the theoretical ideal set by their pixel size, and their noise stays consistent across all spatial frequencies.
Indirect conversion detectors use a two-step process: X-rays first hit a scintillator material that produces visible light, and then a light-sensitive panel converts that light into an electrical signal. The extra optical step introduces some blurring, which reduces MTF and DQE at higher spatial frequencies. However, indirect detectors can achieve higher X-ray absorption, which boosts their DQE at low frequencies.
In a head-to-head comparison, an indirect flat-panel system (GE Revolution XQ/i) measured 64% DQE at low spatial frequencies, while a direct system (Hologic DR-1000) reached 38% at the same frequencies. At higher frequencies (2.5 line pairs per millimeter), both systems converged to around 20% DQE. The indirect system’s advantage at low frequencies came from its scintillator’s superior ability to absorb X-rays, while the direct system’s sharper MTF helped it hold its own at finer detail levels.
Scintillator Material Matters
For indirect detectors, the choice of scintillator material significantly affects DQE. The two most common scintillators are cesium iodide (CsI) and gadolinium oxysulfide (GOS).
CsI is grown in tiny columnar crystals that act like fiber optic channels, guiding light toward the sensor with less sideways spread. This preserves sharpness, especially in thicker layers. GOS is a powder-based material that scatters light more broadly, which limits how thick you can make it before blurring becomes a problem. Because thicker scintillators absorb more X-rays, CsI’s columnar structure gives it a more favorable tradeoff between absorption and sharpness.
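The absorption side of this tradeoff follows the Beer-Lambert law: the stopped fraction grows as 1 − exp(−μt) with thickness t. A minimal sketch; the attenuation coefficient below is a hypothetical placeholder, not material data for CsI or GOS, and the blur penalty that limits GOS thickness is noted only in the comments.

```python
import math

# Toy model of the scintillator thickness tradeoff: absorption rises as
# 1 - exp(-mu * t), while light spread (blur) also grows with thickness.
# CsI's columnar crystals suppress that spread, so it tolerates thicker
# layers. The coefficient mu here is a hypothetical placeholder.

def absorbed_fraction(mu_per_mm: float, thickness_mm: float) -> float:
    """Beer-Lambert fraction of incident X-rays stopped in the layer."""
    return 1.0 - math.exp(-mu_per_mm * thickness_mm)

for t_mm in (0.2, 0.5, 1.0):
    print(f"t = {t_mm} mm -> absorbs {absorbed_fraction(2.0, t_mm):.0%}")
```

The diminishing returns are visible in the output: each doubling of thickness buys less additional absorption, while the blur cost keeps rising, which is why a light-guiding structure like CsI's columns shifts the optimum toward thicker layers.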
Research comparing the two materials confirmed that a 1 mm CsI scintillator achieved the highest spatial resolution and DQE across all frequencies tested. At higher X-ray energies (the kind used for chest or abdominal imaging at 100 to 140 kVp), both materials struggle with absorption, and indirect flat-panel detectors typically achieve less than 50% DQE under these conditions. This is one reason detector manufacturers continue experimenting with thicker scintillators and alternative irradiation geometries to squeeze out better performance.
The Quantum Sink Problem
Inside any imaging chain, there are stages where the number of information carriers (whether X-ray photons, light photons, or electrical charges) drops to a minimum. These bottlenecks are called “quantum sinks,” and they set a ceiling on the detector’s DQE.
The most obvious quantum sink is X-ray absorption itself. If a detector only absorbs 60% of incoming X-rays, 40% of the available information is lost immediately, and no amount of downstream processing can recover it. But secondary quantum sinks can occur later in the chain, particularly in indirect detectors where X-ray photons are converted to light. If too few light photons are produced per absorbed X-ray, this optical stage becomes an additional bottleneck.
At low spatial frequencies, both the X-ray absorption stage and the optical stage can limit DQE. At higher spatial frequencies, the optical quantum sink tends to dominate. Research has shown that increasing the system’s overall light output (its “gain”) by a factor of nine or more can eliminate these secondary quantum sinks up to moderate spatial frequencies, effectively letting the detector reach its theoretical DQE limit set by X-ray absorption alone.
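The bookkeeping behind quantum sinks can be made concrete with a "quantum accounting" table: track the mean number of information carriers per incident X-ray at each stage of an indirect chain, and the stage with the fewest carriers is the sink. The gains below are illustrative assumptions for a hypothetical detector, not measured values for any real system.

```python
# Quantum accounting sketch for a hypothetical indirect detector chain.
# Each entry is (stage name, mean carriers per incident X-ray).
# The stage with the fewest carriers is the quantum sink.

stages = [
    ("incident X-rays",          1.0),                    # per incident photon
    ("absorbed in scintillator", 0.6),                    # assume 60% absorption
    ("light photons produced",   0.6 * 500),              # assume 500 per X-ray
    ("light photons collected",  0.6 * 500 * 0.5),        # assume 50% coupling
    ("electrons detected",       0.6 * 500 * 0.5 * 0.7),  # assume 70% sensor QE
]

sink_name, sink_carriers = min(stages, key=lambda s: s[1])
for name, n in stages:
    print(f"{name:26s} {n:8.1f}")
print(f"quantum sink: {sink_name}")
```

With these assumed gains the optical stages keep hundreds of carriers in flight, so the X-ray absorption stage remains the sole sink and sets the DQE ceiling at 60%. Cut the light yield or coupling far enough that a downstream count approaches one carrier per X-ray, and that stage becomes a secondary sink that pushes DQE below the absorption limit.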
Lab DQE vs. Real-World Performance
Standard DQE measurements are made under controlled laboratory conditions with a narrow X-ray beam and idealized geometry. Clinical imaging introduces factors like scatter radiation from the patient’s body, focal spot blurring, and magnification effects that degrade the image in ways not captured by conventional DQE testing.
To address this gap, researchers developed a metric called “effective DQE” (eDQE) that accounts for real-world imaging conditions. The results are sobering: at 120 kVp (a common setting for chest imaging), eDQE near zero frequency measured only 8 to 9%, roughly five times lower than the standard DQE measured on the same system. This means detectors perform substantially worse in practice than their lab specifications suggest, and it underscores why simply comparing DQE numbers from spec sheets can be misleading when evaluating clinical performance.
The practical takeaway is that a detector’s DQE is necessary but not sufficient for predicting real image quality. Factors like anti-scatter grids, tube focal spot size, and imaging geometry all interact with the detector’s intrinsic efficiency to determine what the radiologist actually sees on screen.

