A visual field test, formally known as perimetry, assesses the entire scope of a person’s vision, including both central and peripheral areas. Because the result depends on the patient’s responses, it is a subjective (psychophysical) test rather than an objective one, which is why reliability checks are built in. The examination maps light sensitivity across the visual field, creating a functional picture of sight, and the detailed printout records how well an individual detects light at various points in their field of view. Interpreting the printout requires understanding the testing method, validating patient performance, and decoding the numerical and graphical representations of visual function.
How the Visual Field Test Is Performed
Standard Automated Perimetry uses a bowl-shaped instrument, such as the Humphrey Field Analyzer, to perform static perimetry. The patient rests their head and looks at a central fixation light while the machine projects stationary light spots of varying intensities onto the inner surface of the bowl.
The goal of threshold testing is to determine the visual threshold: the dimmest light stimulus a patient can detect at a specific location about 50% of the time. Sensitivity is measured in decibels (dB), a logarithmic scale where a higher number indicates greater sensitivity to dimmer light. Testing typically uses 24-2 or 30-2 protocols, examining the central 24 or 30 degrees of vision, respectively. The patient signals detection by pressing a handheld button, and algorithms like SITA (the Swedish Interactive Thresholding Algorithm) adjust the brightness of subsequent stimuli to quickly map the threshold values.
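The bracketing idea behind threshold testing can be sketched in Python. This is a deliberately simplified 4-2 dB staircase with a deterministic simulated patient, not the actual SITA algorithm, which is a Bayesian strategy that also uses prior models of normal and glaucomatous fields to shorten testing time:

```python
def staircase_threshold(true_threshold_db, start_db=25):
    """Simplified 4-2 dB staircase (illustrative only, not SITA).

    Higher dB = dimmer stimulus. After a 'seen' response the next
    stimulus is made dimmer (dB goes up); after 'not seen' it is
    made brighter. The step halves from 4 to 2 dB at the first
    reversal, and the last-seen level is returned at the second.
    """
    level, step = start_db, 4
    prev_seen, reversals, last_seen = None, 0, None
    while True:
        # Simulated patient: any stimulus at least as bright as the
        # true threshold (i.e., level <= threshold in dB) is seen.
        seen = level <= true_threshold_db
        if seen:
            last_seen = level
        if prev_seen is not None and seen != prev_seen:
            reversals += 1
            if reversals == 2:
                return last_seen
            step = 2  # halve the step after the first reversal
        prev_seen = seen
        level += step if seen else -step
```

With a simulated threshold of 27 dB and a 25 dB starting stimulus, the staircase brackets upward, reverses at 29 dB, and settles on 27 dB within three presentations, which is the efficiency the halving step is designed to buy.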
Checking the Test Reliability Metrics
Evaluating reliability metrics is the first step in reading the printout, as they confirm patient cooperation and test accuracy. If these values fall outside acceptable limits, the test results must be viewed with skepticism. Three primary indices gauge performance: Fixation Losses, False Positives, and False Negatives.
Fixation Losses (FL)
Fixation Losses measure how often the patient’s gaze drifts from the central target. The machine monitors this by occasionally presenting a light within the natural blind spot, where light should never be seen. If the patient responds to a light in this area, a loss is recorded, suggesting improper fixation. A percentage below 20% is generally acceptable.
False Positives (FP) and False Negatives (FN)
False Positives occur when the patient presses the response button when no light stimulus was presented, indicating anticipation. This results in artificially high sensitivity values, making the visual field appear better than it is; a rate above roughly 15% (for SITA strategies) to 33% (for older strategies) is concerning. Conversely, False Negatives are recorded when a patient fails to respond to a light that should have been easily visible. A high False Negative rate, typically above 33%, suggests inattention, fatigue, or significant variability.
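The three reliability checks can be combined into a simple screening function. The cutoffs below are the commonly quoted ones from the section above (FL < 20%, FP < 15%, FN < 33%), but the exact limits vary by testing strategy and clinic, so the defaults are illustrative:

```python
def assess_reliability(fixation_losses, fl_trials,
                       false_pos_rate, false_neg_rate,
                       fl_limit=0.20, fp_limit=0.15, fn_limit=0.33):
    """Return a list of reliability concerns; empty means all three
    indices fall within the (illustrative) accepted limits."""
    flags = []
    if fl_trials and fixation_losses / fl_trials >= fl_limit:
        flags.append("fixation losses high: gaze may have wandered")
    if false_pos_rate >= fp_limit:
        flags.append("false positives high: field may look better than it is")
    if false_neg_rate >= fn_limit:
        flags.append("false negatives high: inattention or fatigue")
    return flags
```

For example, a test with 1 fixation loss in 10 blind-spot checks, 5% false positives, and 10% false negatives returns an empty list, while 3/10 fixation losses with 20% false positives and 40% false negatives trips all three flags.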
Decoding the Printout Maps and Values
The printout translates the patient’s light-detection performance into a series of maps and numerical tables.
Threshold Map
The raw data is displayed in the Threshold Map, listing the measured decibel (dB) value numerically for each tested point. Higher dB numbers, such as 30 dB, represent better visual sensitivity, while lower numbers indicate poorer sensitivity; a value near 0 dB means the patient detected only the machine’s brightest stimulus, and points missed entirely are typically printed as "<0".
Grayscale Map
The Grayscale Map converts the numerical data into a quick visual summary. Areas of high sensitivity are shown in lighter shades, while reduced sensitivity appears progressively darker, with the deepest black representing a scotoma. Although useful for rapid assessment, this map should not be used for definitive diagnosis because it averages data and can distort the true extent of a defect.
Total Deviation Map
The Total Deviation Map compares the patient’s measured sensitivity at every point to the average expected sensitivity for a healthy person of the same age. The map displays the difference in decibels. A negative number indicates the patient’s vision is worse than the age-matched average. This comparison highlights any generalized depression of the entire visual field, which can be caused by factors like cataracts.
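The arithmetic behind the Total Deviation Map is a point-by-point subtraction. The sensitivities and normative values below are invented for illustration, not actual Humphrey normative data:

```python
# Hypothetical measured sensitivities (dB) at four test points,
# and illustrative age-matched normal values for the same points.
measured = [29, 30, 14, 27]
age_norm = [31, 31, 30, 29]

# Total deviation: measured minus expected, point by point.
total_deviation = [m - n for m, n in zip(measured, age_norm)]
# Negative values mean the point is less sensitive than the
# age-matched average; here the third point (-16 dB) stands out.
```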
Pattern Deviation Map
The Pattern Deviation Map is the most informative plot for identifying specific disease-related loss. It mathematically removes any generalized depression seen on the Total Deviation Map, isolating only localized, focal defects. For example, if a patient has reduced vision everywhere due to a cataract, this map corrects for the overall dimming. This allows clinicians to clearly see sharp, localized dips in sensitivity that suggest a condition like glaucoma.
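The "correction for overall dimming" can be sketched numerically. Real perimeters estimate the general height of the field from a high percentile (roughly the 85th, i.e., one of the most sensitive points) of the total-deviation values, so that a deep scotoma cannot drag the correction down; the data below are invented:

```python
# Hypothetical total-deviation values (dB): a diffuse -4 to -6 dB
# depression (as from a cataract) plus one deep focal defect.
total_deviation = [-5, -4, -18, -5, -4, -6, -5, -4]

# General height: roughly the 85th-percentile total-deviation value.
general_height = sorted(total_deviation)[int(0.85 * (len(total_deviation) - 1))]

# Pattern deviation: subtract the general height from every point,
# stripping the uniform depression and leaving only focal loss.
pattern_deviation = [td - general_height for td in total_deviation]
```

After the correction, the diffusely depressed points sit near 0 dB while the focal defect remains at -14 dB, which is exactly the separation that makes this map useful for spotting glaucomatous loss behind a cataract.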
Interpreting Global Indices and Loss Patterns
The final section summarizes the test results into Global Indices, which provide an overall measure of the field status and are valuable for tracking changes over time.
Mean Deviation (MD)
The Mean Deviation (MD) is the average of all differences found on the Total Deviation Map, reflecting the overall severity of visual field loss. A negative MD value indicates an average reduction in sensitivity compared to normal, with increasingly negative numbers signifying widespread loss. Because the MD reflects the average sensitivity across the entire field, it is strongly affected by non-specific causes of vision loss, such as media opacity.
Pattern Standard Deviation (PSD)
The Pattern Standard Deviation (PSD) measures the degree of irregularity in the visual field, indicating whether the loss is localized or diffuse. A high PSD suggests a non-uniform field with sharp peaks and valleys, characteristic of focal damage like the arcuate scotomas seen in glaucoma. A low PSD suggests a smooth field, which is either entirely normal or uniformly depressed.
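The complementary roles of MD and PSD can be made concrete with a small sketch. These are simplified, unweighted versions of the indices with invented values; the actual Humphrey indices weight each point by the variance of normal values at that location, so printed figures will differ slightly:

```python
import statistics

# Two hypothetical fields with the same average loss:
total_deviation = [-2, -1, -16, -2, -1, -2]   # one deep focal defect
diffuse_loss = [-4, -4, -4, -4, -4, -4]        # uniform depression (e.g., cataract)

md_focal = statistics.mean(total_deviation)    # -4.0
psd_focal = statistics.pstdev(total_deviation)  # high: sharp peaks and valleys

md_diffuse = statistics.mean(diffuse_loss)     # also -4.0
psd_diffuse = statistics.pstdev(diffuse_loss)  # 0.0: smooth, uniform field
```

Both fields have an MD of -4.0 dB, yet the PSD separates them cleanly: the focal field scores high (about 5.4 here), while the uniformly depressed field scores zero, illustrating why PSD is the index that flags localized, glaucoma-type damage.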
Characteristic patterns of visual field loss often correlate with specific neurological or ocular diseases. For instance, a defect that arcs around the central fixation point and respects the horizontal midline is an arcuate scotoma, strongly associated with glaucoma. Conversely, a hemianopia, in which vision is lost in an entire half of the visual field and the defect respects the vertical midline, indicates a neurological event affecting the visual pathway, such as a stroke.

