“The connectivity sort of festoons around the continent—it’s all structured to backhaul that traffic to Europe where traffic gets exchanged. And that works. The big penalty you pay there is the distance. That latency for that traffic to go back and forth.”
This is how Principal Analyst Erik Kreifeldt describes how internet traffic moves through internet exchange points, connecting people across continents.
And when data must travel far and wide to reach a user, you see that L word. Latency.
So. What is latency?
All networks experience some amount of latency—and there are technically different kinds. Think of it as the delay. It’s the time between hitting enter and actually getting to your desired destination on the internet. In networking speak, it’s the time it takes for a data packet to go from one node to another. For you, it’s the furious few seconds you wait while refreshing HBO Go on a Sunday night, anticipating a new Game of Thrones episode at any moment.
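To make that node-to-node delay concrete, here's a minimal sketch in Python that approximates latency by timing a TCP handshake: the handshake requires a round trip, so the elapsed time is a rough stand-in for the latency between you and the server. (The hostname is just an example, and this measures connection setup time rather than a pure network round trip, so treat the number as an estimate, not a benchmark.)

```python
import socket
import time

def measure_latency_ms(host: str, port: int, timeout: float = 2.0) -> float:
    """Approximate latency by timing how long a TCP connection takes.

    The TCP handshake involves a round trip to the host, so the time
    from "start connecting" to "connected" roughly tracks the latency
    described above (plus some OS and DNS overhead).
    """
    start = time.perf_counter()
    with socket.create_connection((host, port), timeout=timeout):
        pass  # connection established; we only wanted the timing
    return (time.perf_counter() - start) * 1000  # milliseconds

if __name__ == "__main__":
    # Illustrative target, not an endorsement of any particular server.
    print(f"{measure_latency_ms('example.com', 443):.1f} ms")
```

Run it against a nearby server and a far-away one, and you'll see the distance penalty Kreifeldt describes: the farther the traffic has to travel, the bigger the number.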
If the internet gods are on your side, you might not notice the space between requesting a site and seeing the page load in your browser. But if the data packet in question needs to take a particularly circuitous route—or if data demand is particularly high—you're more likely to notice a pronounced delay before reaching your intended destination on the internet.
And there are more than a few factors that determine how much latency you experience, ranging from the type of internet connection you use, to spikes in traffic, to wireless interference.