Lag exists because data takes time to travel between your device and a server, and every step along that journey adds delay. Even under perfect conditions, the speed of light creates a hard floor on how fast information can move. Layer on top of that the processing time of routers, your game server’s update cycle, your monitor’s refresh rate, and the quirks of internet protocols, and you get the milliseconds of delay that separate your button press from what you see on screen.
The Speed of Light Is the Starting Point
Light travels fast, but not instantly. In a fiber optic cable, a signal moves at roughly two-thirds the speed of light in a vacuum, which means crossing the United States coast to coast takes about 40 milliseconds one way. A round trip doubles that. If you’re playing on a server in another continent, the physics alone can put you at 80 to 150 ms before any other factor enters the picture. This is called propagation delay, and no amount of better hardware can eliminate it. It is a consequence of living in a universe with a speed limit.
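As a back-of-envelope check, that floor can be computed directly from distance and signal speed. The 8,000 km figure below is an assumed real-world fiber route length (physical cables wander well beyond the roughly 4,000 km straight line between the coasts), not a measured path:

```python
# Propagation delay in fiber: signals travel at ~2/3 the vacuum speed of light.
C_VACUUM_KM_S = 299_792          # speed of light in vacuum, km/s
FIBER_SPEED_KM_S = C_VACUUM_KM_S * 2 / 3

def propagation_delay_ms(distance_km: float, round_trip: bool = False) -> float:
    """One-way (or round-trip) propagation delay in milliseconds."""
    one_way = distance_km / FIBER_SPEED_KM_S * 1000
    return one_way * 2 if round_trip else one_way

# An assumed ~8,000 km fiber route across the US:
print(round(propagation_delay_ms(8000), 1))                   # ~40 ms one way
print(round(propagation_delay_ms(8000, round_trip=True), 1))  # ~80 ms round trip
```

No faster hardware changes this number; only a shorter path (a closer server) does.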
What Routers and Switches Add
Your data doesn’t fly in a straight line from your computer to its destination. It hops through a chain of routers and switches, and each one introduces two kinds of delay: processing time and queuing time.
Processing delay is the time a router spends inspecting each packet, figuring out where to send it, and applying any security or priority rules. Queuing delay happens when packets arrive faster than the device can forward them. They pile up in a buffer, waiting their turn. If multiple streams of traffic are all headed for the same output port at the same time, packets sit in line. Priority traffic (like a video call flagged as high importance) can push lower-priority packets even further back in the queue. During peak usage, these queues grow, and so does your lag.
Head-of-line blocking makes this worse in some router designs. If the first packet in a queue is waiting for a busy output, every packet behind it is stuck too, even if those packets are headed somewhere completely open. Modern routers use a technique called virtual output queuing to reduce this problem by maintaining separate lines for each destination, but the fundamental issue of too much traffic competing for limited capacity never fully disappears.
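The queuing effect can be sketched with a toy model: a burst of packets arriving at one output port at the same instant, each waiting for everything ahead of it to serialize onto the link.

```python
# Toy FIFO queue at a router output port. Worst-case assumption: the whole
# burst arrives at once, so each packet waits for the entire backlog ahead.
def queuing_delays_ms(packet_sizes_bytes, link_rate_mbps):
    """Waiting time for each packet in a FIFO output queue."""
    bytes_per_ms = link_rate_mbps * 1e6 / 8 / 1000  # link capacity per millisecond
    delays, backlog = [], 0.0
    for size in packet_sizes_bytes:
        delays.append(backlog / bytes_per_ms)  # wait for everything ahead
        backlog += size
    return delays

# Ten 1,500-byte packets bursting into a 10 Mbps port: the last waits 10.8 ms.
print([round(d, 2) for d in queuing_delays_ms([1500] * 10, 10)])
```

Double the burst or halve the link rate and the tail of the queue waits proportionally longer, which is exactly what peak-hour congestion feels like.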
Server Tick Rate and Game Processing
In a multiplayer game, the server doesn’t process the world continuously. It updates in discrete steps called ticks. A typical competitive shooter runs at 64 or 128 ticks per second. At 64 ticks, the server has a 15.6 ms budget per tick to read every player’s input, simulate physics, resolve hit detection, and send updated positions back to all connected players. At 128 ticks, that budget shrinks to 7.8 ms.
If the server can’t finish all its work within that window, the next tick starts late and a chain reaction begins. Players experience this as rubber-banding (snapping back to a previous position), shots that don’t register, or characters teleporting instead of moving smoothly. Even if your internet connection is fast, a server struggling to keep up with its tick rate will feel laggy to everyone connected to it.
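A minimal fixed-timestep loop illustrates the tick budget. The function names here are illustrative, not taken from any real engine:

```python
import time

TICK_RATE = 64
TICK_BUDGET = 1.0 / TICK_RATE  # ~15.6 ms of work allowed per tick at 64 Hz

def run_server(num_ticks, simulate_tick):
    """Fixed-timestep loop: do one tick of work, then sleep away whatever
    remains of the budget. Returns how many ticks blew their budget."""
    late_ticks = 0
    for _ in range(num_ticks):
        start = time.perf_counter()
        simulate_tick()  # read inputs, run physics, resolve hits, send snapshots
        elapsed = time.perf_counter() - start
        if elapsed > TICK_BUDGET:
            late_ticks += 1  # overran the budget: the next tick starts late
        else:
            time.sleep(TICK_BUDGET - elapsed)
    return late_ticks

# One second of simulated server time with trivially cheap ticks:
print(run_server(TICK_RATE, lambda: None))  # 0 late ticks expected
```

Swap the trivial tick for one that sleeps 20 ms and every tick overruns, which is the server-side condition players experience as rubber-banding.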
Your Monitor and Input Devices
Lag doesn’t stop at your network connection. Your display adds its own delay, split into two parts. Input lag is the total time between a frame’s signal arriving at the monitor and the image actually appearing on screen. Response time is how quickly individual pixels change color. A slow response time shows up as motion blur rather than traditional “lag,” but both affect how responsive the game feels.
At 60 Hz, your monitor can only show a new frame every 16.7 ms. At 144 Hz, a new frame arrives every 6.9 ms. If pixel response time is slower than the gap between frames, the display literally can’t finish drawing one frame before the next one is due. This is one reason competitive players chase high refresh rate monitors with fast response times: not because it fixes network lag, but because it removes one layer of delay between the game state and your eyes.
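The frame-time arithmetic above is simple enough to sketch directly:

```python
def frame_interval_ms(refresh_hz: float) -> float:
    """Time between successive frames at a given refresh rate."""
    return 1000.0 / refresh_hz

def response_keeps_up(response_time_ms: float, refresh_hz: float) -> bool:
    """Can the pixels finish transitioning before the next frame is due?"""
    return response_time_ms <= frame_interval_ms(refresh_hz)

print(round(frame_interval_ms(60), 1), round(frame_interval_ms(144), 1))  # 16.7 6.9
print(response_keeps_up(5, 144))   # a 5 ms panel keeps up at 144 Hz
print(response_keeps_up(10, 144))  # a 10 ms panel cannot
```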
How Protocols Shape Delay
The rules your data follows when traveling across the internet also matter. TCP, the protocol used for web browsing and file downloads, prioritizes reliability. Before sending any application data, it performs a three-way handshake (SYN, SYN-ACK, ACK), which costs a full round trip between client and server. It checks for lost packets, resends them, and ensures everything arrives in order. When the application tries to send data faster than the connection allows, TCP queues the excess in a socket buffer, and end-to-end delay can spike dramatically.
UDP, the protocol most real-time games prefer, skips all of that. It can start transmitting data in a single network trip. It doesn’t resend lost packets or enforce ordering. In controlled experiments, isolated packets sent over UDP showed a median one-way latency about 4 microseconds faster than TCP. That gap is tiny on its own, but the real advantage is what happens under load: UDP doesn’t queue data the same way, so latency stays relatively stable even when the connection is busy. The tradeoff is that lost packets are simply gone, which is why games sometimes show a player teleporting instead of moving smoothly when packet loss is high.
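Because UDP has no handshake, a sender can transmit the moment a socket exists. A minimal loopback sketch using Python’s standard socket module (the sequence-number framing is an assumed convention games layer on top, not part of UDP itself):

```python
import socket

def send_state(sock, addr, seq, payload):
    # Prefix a sequence number so the receiver can detect loss and
    # reordering itself; UDP offers no such guarantees.
    sock.sendto(seq.to_bytes(4, "big") + payload, addr)

# Loopback demo: receiver and sender in one process.
recv = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
recv.bind(("127.0.0.1", 0))          # port 0: let the OS pick a free port
addr = recv.getsockname()

send = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
send_state(send, addr, 7, b"x=10,y=20")  # no handshake: fire and forget

data, _ = recv.recvfrom(2048)
print(int.from_bytes(data[:4], "big"), data[4:])  # 7 b'x=10,y=20'
send.close()
recv.close()
```

If that datagram had been dropped in transit, nothing would retransmit it; the receiver would simply see sequence 7 missing, which is the loss the game must paper over.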
Packet Loss and Jitter
Raw ping only tells part of the story. Two connections with the same average latency can feel completely different if one has stable packet delivery and the other doesn’t. Jitter is the variation in how long each packet takes to arrive. High jitter means some packets show up quickly and others are delayed, which forces the receiving end to either wait (adding lag) or play back data with gaps (causing stuttering).
When jitter gets bad enough, packets arrive out of order or get discarded entirely. Network devices sometimes try to compensate by buffering incoming data to smooth things out, but that buffering itself adds latency. This is why a Wi-Fi connection with 30 ms ping but high jitter can feel worse than a wired connection at 50 ms with rock-steady delivery.
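One simple way to quantify jitter is the average gap between consecutive packet latencies (RTP’s jitter estimator is a smoothed variant of the same idea). A sketch, with illustrative sample data:

```python
from statistics import mean

def jitter_ms(latencies_ms):
    """Mean absolute difference between consecutive packet latencies."""
    diffs = [abs(b - a) for a, b in zip(latencies_ms, latencies_ms[1:])]
    return mean(diffs)

stable   = [50, 51, 50, 49, 50, 51]   # wired: ~50 ms, rock-steady delivery
unstable = [30, 5, 80, 12, 60, 25]    # wi-fi: ~35 ms average, wild swings
print(round(jitter_ms(stable), 1), round(jitter_ms(unstable), 1))  # 1.0 50.2
```

The second connection has the lower average latency yet feels far worse, because the receiver must either buffer (adding lag) or stutter.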
What the Ping Numbers Actually Mean
Ping, measured in milliseconds, is the round-trip time for a small packet to reach a server and come back. Here’s how those numbers translate to experience in fast-paced games:
- 0 to 20 ms: Exceptional. Actions feel instant. This is the target for competitive play.
- 20 to 50 ms: Good. Smooth for both casual and competitive gaming, with no perceptible delay.
- 50 to 100 ms: Acceptable. Most games are playable, but you may notice slight delays in fast-paced situations.
- 100 to 150 ms: Tolerable. Delay becomes noticeable, especially when reaction time matters. Shots may feel like they miss when they shouldn’t.
- Above 150 ms: Poor. Significant rubber-banding, failed hit registration, and potential disconnections make real-time games frustrating.
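Expressed as a lookup, the bands above might look like this (thresholds taken directly from the list; the function name is illustrative):

```python
def rate_ping(ping_ms: float) -> str:
    """Map a round-trip time to the experience bands listed above."""
    if ping_ms <= 20:
        return "exceptional"
    if ping_ms <= 50:
        return "good"
    if ping_ms <= 100:
        return "acceptable"
    if ping_ms <= 150:
        return "tolerable"
    return "poor"

print(rate_ping(35), rate_ping(120))  # good tolerable
```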
How Servers and Software Fight Lag
Game developers can’t eliminate lag, but they have effective ways to hide it. The most important technique is client-side prediction: your game doesn’t wait for the server to confirm what happened when you press a key. Instead, it immediately simulates the expected result locally. When the server’s response arrives, the game quietly reconciles any differences. In cross-regional testing, client-side prediction reduced the perceived physics delay from over 570 ms to under 50 ms, a reduction of more than 90 percent.
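A stripped-down sketch of prediction and reconciliation, with 1-D movement standing in for real physics and all names hypothetical:

```python
def apply(pos, inp):
    return pos + inp  # 1-D movement stands in for the real physics step

class PredictingClient:
    def __init__(self):
        self.pos = 0
        self.pending = []     # inputs the server hasn't confirmed yet
        self.next_seq = 0

    def press(self, inp):
        self.pending.append((self.next_seq, inp))
        self.next_seq += 1
        self.pos = apply(self.pos, inp)  # predict immediately, no waiting

    def on_snapshot(self, server_pos, last_acked_seq):
        # Reconcile: start from the authoritative position, then replay
        # every input the server hasn't processed yet.
        self.pending = [(s, i) for s, i in self.pending if s > last_acked_seq]
        self.pos = server_pos
        for _, inp in self.pending:
            self.pos = apply(self.pos, inp)

c = PredictingClient()
c.press(1); c.press(1); c.press(1)             # moves to 3 instantly on screen
c.on_snapshot(server_pos=1, last_acked_seq=0)  # server has only seen input 0
print(c.pos)  # 3: authoritative position 1 plus replayed inputs 1 and 2
```

The player never sees the round trip: the screen position was already correct, and the reconciliation was invisible because prediction and authority agreed.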
On the infrastructure side, edge servers and content delivery networks place computing resources physically closer to players. Research from the National Science Foundation found that 82% of users can reach a nearby edge server within 20 ms, compared to just 22 to 52% who get similar latency from major cloud providers. Edge servers offer lower latency to roughly 92 to 97% of users compared to traditional cloud data centers. In some regions, moving processing from a distant cloud to a nearby edge device saves 100 to 200 ms. This is why many multiplayer games now run regional servers rather than hosting everything from a single location.
Other tricks include interpolation, where the game smoothly animates between the last two known positions of other players rather than waiting for every update, and lag compensation on the server side, where the server rewinds time to check whether your shot would have hit based on what your screen showed at the moment you fired. None of these solutions removes the underlying delay. They just make it less visible, which is why a well-designed game at 80 ms ping can feel better than a poorly designed one at 40 ms.
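Interpolation between the last two known snapshots can be sketched in a few lines (linear blending is the simplest choice; real games may layer more elaborate smoothing on top):

```python
def interpolate(p0, t0, p1, t1, render_time):
    """Linearly blend between the two most recent known positions.
    The client renders slightly in the past so it always has two
    snapshots on hand to blend between."""
    if t1 == t0:
        return p1
    alpha = (render_time - t0) / (t1 - t0)
    alpha = max(0.0, min(1.0, alpha))  # clamp: never extrapolate past a snapshot
    return p0 + (p1 - p0) * alpha

# Snapshots at t=100 ms (x=10) and t=150 ms (x=20); render at t=125 ms:
print(interpolate(10, 100, 20, 150, 125))  # 15.0
```

The clamp is the tradeoff in miniature: if updates stop arriving, the player freezes at the last known position instead of gliding off into a guessed one.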

