Fitts’s Law is a predictive model of human movement that says the time it takes to reach a target depends on two things: how far away the target is and how big it is. Smaller or more distant targets take longer to hit because you need to slow down to be precise. This simple relationship, first demonstrated in 1954, has become one of the most widely applied principles in user interface design.
Where the Law Came From
In 1954, psychologist Paul Fitts published a paper applying information theory to the human motor system. He ran three experiments: one where participants tapped a stylus back and forth between two metal plates, one where they transferred plastic washers between pins, and one where they moved small pins from one set of holes to another. In each experiment, he varied the size of the targets and the distance between them, then measured how quickly people could perform the task.
The pattern was remarkably consistent across all three tasks. As targets got smaller or farther apart, people slowed down in a predictable, logarithmic way. Fitts framed this in terms of information theory: hitting a small, distant target requires transmitting more “bits” of precision through your motor system, just as sending a complex message requires more bandwidth. The human body, he argued, has a measurable information capacity for movement.
The Formula
Fitts’s Law is typically expressed as:
Movement Time = a + b × log₂(D/W + 1)
Here, D is the distance to the target, W is the width of the target, and “a” and “b” are constants that vary depending on the person and the device being used. The logarithmic term is called the Index of Difficulty, measured in bits. A target that is far away and narrow has a high index of difficulty. A target that is close and wide has a low one.
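The formula is easy to evaluate directly. Here is a minimal sketch in Python; the constants a = 0.1 s and b = 0.2 s/bit are purely illustrative values, not measured ones, since in practice they are fit from experimental data for a particular person and device:

```python
import math

def index_of_difficulty(distance: float, width: float) -> float:
    """Shannon formulation: ID = log2(D/W + 1), in bits."""
    return math.log2(distance / width + 1)

def movement_time(distance: float, width: float, a: float, b: float) -> float:
    """Predicted movement time MT = a + b * ID.

    a and b are device- and user-specific constants fit from
    experimental data (the values used below are hypothetical).
    """
    return a + b * index_of_difficulty(distance, width)

# A wide, nearby target vs. a narrow, distant one (units are arbitrary
# but must match between distance and width):
easy = movement_time(distance=100, width=100, a=0.1, b=0.2)  # ID = 1 bit
hard = movement_time(distance=700, width=10, a=0.1, b=0.2)   # ID ≈ 6.15 bits
```

Note that doubling both D and W leaves the predicted time unchanged: difficulty depends only on their ratio.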
This version of the formula, known as the Shannon formulation, was proposed by researcher Scott MacKenzie and has largely replaced Fitts’s original equation in modern research. It’s more theoretically grounded and better behaved at extreme values, which is why the ISO standard for evaluating input devices adopted it.
The constants “a” and “b” are determined experimentally. They change based on whether someone is using a mouse, a trackpad, a stylus, or their finger on a touchscreen. This is what makes the law useful for comparing devices: you can calculate a metric called throughput (Index of Difficulty divided by Movement Time, expressed in bits per second) that captures both speed and accuracy in a single number.
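The throughput calculation described above can be sketched as follows; the trial data here is hypothetical, standing in for measurements you would collect from a real pointing experiment:

```python
import math

def throughput(distance: float, width: float, mt_seconds: float) -> float:
    """Throughput in bits/s: Index of Difficulty divided by movement time."""
    id_bits = math.log2(distance / width + 1)
    return id_bits / mt_seconds

# Hypothetical trials: (distance, width, measured time in seconds),
# e.g. from one participant using a mouse.
mouse_trials = [(256, 32, 0.71), (512, 32, 0.88), (512, 16, 1.02)]
mouse_tp = sum(throughput(d, w, t) for d, w, t in mouse_trials) / len(mouse_trials)
```

Averaging throughput across many trials and participants, rather than comparing raw completion times, is what lets you put a mouse and a trackpad on the same scale even when the tasks differ in difficulty.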
Why Targets Are Harder to Hit When They’re Small
The core mechanism behind Fitts’s Law is the speed-accuracy tradeoff. When you reach for something, your initial movement gets you close but rarely lands perfectly. The smaller the target, the more likely you are to overshoot or undershoot, requiring a corrective follow-up movement. Research has shown that these secondary, corrective movements are the main way people achieve precision. It’s not that you plan more carefully before moving toward a small target. You move, miss slightly, and correct. Higher difficulty means more corrections, which means more time.
This plays out identically whether you’re tapping a touchscreen, clicking a mouse, or moving your eyes. Studies on rapid eye movements found the same pattern: when targets were smaller or farther apart, people made more corrective eye movements, and the planning of those corrections overlapped with the next primary movement. The tradeoff between speed and accuracy is built into how the motor system works at a fundamental level.
How It Shapes Interface Design
Fitts’s Law gives designers a quantitative reason to make important buttons big and place them where they’re easy to reach. But its most powerful insight is subtler than “make buttons bigger.”
Screen edges and corners are special. When you’re using a mouse, the cursor stops at the edge of the screen no matter how fast you’re moving. You physically cannot overshoot. In Fitts’s Law terms, any target placed on a screen edge has effectively infinite width in one direction, which dramatically reduces movement time. The four corners of the screen are the fastest targets of all because they’re infinite in two directions.
This is why macOS places its application menu bar at the very top edge of the screen. You can fling the cursor upward without worrying about precision, and it will land in the menu area every time. Windows historically placed the Start button in the bottom-left corner for the same reason. A single-row toolbar that bleeds all the way to the screen edge will be significantly faster to use than one with even a single pixel of non-clickable space between the icons and the edge, because that one pixel destroys the infinite-width advantage.
This edge-pinning benefit applies to mice and trackballs but not to touchscreens, where your finger can move beyond the screen’s physical boundary. On a phone or tablet, there’s no cursor to pin.
Touch Targets on Mobile Devices
On touchscreens, Fitts’s Law matters in a different way. Your finger is an imprecise pointing instrument. Research from the MIT Touch Lab found that the average fingertip is 1.6 to 2 centimeters wide, and the average thumb pad is about 2.5 centimeters. When a touch target is smaller than the contact area of your finger, accuracy drops fast.
The practical minimum for a touch target is 1 cm × 1 cm (roughly 0.4 × 0.4 inches). This is a physical measurement, not a pixel count, because the same number of pixels can represent very different physical sizes depending on screen density. Targets below this threshold cause real usability problems. Instagram, for example, drew criticism for dismiss buttons that were only about 2 millimeters wide. Color swatches on some retail sites were as small as 1 millimeter, a size that worked fine with a mouse cursor but was nearly impossible to tap accurately on a tablet.
One centimeter is a minimum, not an ideal. Any interactive element that users need frequently or need to hit quickly should be larger. Apps that generate variable-sized targets based on data, like a health-tracking app showing nursing sessions as thin bars on a timeline, can accidentally create targets so small they’re unusable when the underlying data value is low.
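Because the minimum is physical rather than pixel-based, the pixel count has to be derived from the screen's density. A small sketch of that conversion (the density values below are just examples, not references to specific devices):

```python
import math

def min_target_px(ppi: float, min_cm: float = 1.0) -> int:
    """Convert a physical touch-target minimum to pixels for a given
    screen density in pixels per inch (1 inch = 2.54 cm)."""
    return math.ceil(min_cm / 2.54 * ppi)

# The same 1 cm target needs very different pixel counts
# on screens of different density:
min_target_px(160)  # → 63 px on a low-density screen
min_target_px(460)  # → 182 px on a high-density phone display
```

This is why platform guidelines express target sizes in density-independent units: a fixed pixel value that is comfortable on one screen can fall well below the 1 cm threshold on another.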
Throughput and Device Comparison
One of the most practical applications of Fitts’s Law is comparing input devices on equal footing. The ISO 9241-411 standard (originally ISO 9241-9) defines a standardized pointing task, a multi-directional test where users select circular targets arranged in a ring, and uses throughput as the primary performance metric.
Throughput combines speed and accuracy into a single number measured in bits per second. A device that lets users be both fast and accurate scores higher. This makes it possible to objectively compare a standard mouse, a trackpad, a head-tracking device, or an eye-tracking system using the same scale. The formula adjusts for how accurately people actually perform, not just how quickly, so a device that encourages sloppy, fast movements doesn’t get an artificially high score.
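The accuracy adjustment works by replacing the nominal target width with an "effective width" derived from where users actually landed. A minimal sketch of that computation, assuming the standard scaling factor of 4.133 (which corresponds to a width containing about 96% of hits for normally distributed endpoints); the endpoint and timing data are hypothetical:

```python
import math
import statistics

def effective_throughput(distance: float,
                         endpoints: list[float],
                         times: list[float]) -> float:
    """ISO-style throughput using effective width.

    Accuracy is folded in by replacing the nominal target width with
    We = 4.133 * SD of the observed selection coordinates along the
    movement axis, then computing TP = IDe / mean movement time.
    """
    we = 4.133 * statistics.stdev(endpoints)
    ide = math.log2(distance / we + 1)
    return ide / statistics.mean(times)

# Hypothetical trial data: selections scattered around a target at 500
# units, each trial taking about a second.
tp = effective_throughput(500, [498, 503, 495, 505, 499], [0.98, 1.0, 1.02])
```

A user who moves fast but scatters widely gets a larger effective width, a lower effective Index of Difficulty, and therefore a lower throughput, which is exactly the correction the paragraph above describes.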
This standardized approach has been used to evaluate assistive technologies, compare children’s performance with different input devices, and benchmark new interaction methods against established ones. The law’s strength is that it distills the complex, messy reality of human pointing into a framework simple enough to support fair comparisons.