14 April 2021

10 Reasons to Peer: 3. Peering lowers latency


There are many applications and technology areas in which latency plays an important, or even decisive, role. When we surveyed our customers at DE-CIX Frankfurt last year, 81 percent of participants said that latency is the most important criterion when concluding new interconnection contracts.

In this third article in our “reasons to peer” series, we look at how peering lowers latency.

The shorter the trip, the better the latency

Latency is the delay between a user’s action and the response to that action from a website or an application – in networking terms, the total time it takes for a data packet to make a round trip. It is measured in milliseconds, and Internet quality depends on it. For a website, for example, even a 2-second delay in loading time is enough to increase the bounce rate by more than 100 percent!
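
To make these milliseconds tangible, here is a minimal Python sketch that approximates latency by timing a TCP handshake: the connect call only completes after a full round trip to the server. The target hostnames are placeholders; substitute any server you want to measure.

```python
import socket
import time

def tcp_rtt_ms(host: str, port: int = 443, timeout: float = 2.0) -> float:
    """Approximate network latency in milliseconds.

    The connect() call returns only after the TCP handshake
    (SYN / SYN-ACK) has made a full round trip to the server.
    """
    start = time.perf_counter()
    with socket.create_connection((host, port), timeout=timeout):
        pass  # connection established; we only need the elapsed time
    return (time.perf_counter() - start) * 1000

if __name__ == "__main__":
    # Example targets only; replace with any reachable hosts.
    for target in ("www.de-cix.net", "example.com"):
        print(f"{target}: {tcp_rtt_ms(target):.1f} ms")
```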

Peering paths outperform transit paths for 91 percent of Autonomous Systems (ASes), meaning that peering typically offers the shortest path for data to travel – and therefore lower latency.

Control your traffic streams

Peering gives you control over where your network exchanges traffic with other important networks. You decide where to hand over the traffic (which city, which Internet Exchange), and you control your backhaul and your peering port usage. As the other network has the same control, you and your peering partner together achieve controlled end-to-end handling of your valuable traffic streams.
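
As a rough illustration of this decision – not DE-CIX tooling, just a hypothetical sketch with made-up names and numbers – the following Python snippet models choosing a handover location based on measured round-trip time and peering port utilization:

```python
from dataclasses import dataclass

@dataclass
class HandoverOption:
    """One possible place to exchange traffic with a peer (hypothetical data)."""
    ix_location: str         # city / Internet Exchange where the session would live
    rtt_ms: float            # measured round-trip time to the peer via this IX
    port_utilization: float  # current utilization of our peering port (0.0 - 1.0)

def pick_handover(options: list[HandoverOption], max_util: float = 0.8) -> HandoverOption:
    """Prefer the lowest-latency handover point whose port still has headroom."""
    viable = [o for o in options if o.port_utilization < max_util] or options
    return min(viable, key=lambda o: o.rtt_ms)

# Illustrative figures only.
options = [
    HandoverOption("Frankfurt", rtt_ms=4.2, port_utilization=0.55),
    HandoverOption("New York", rtt_ms=38.0, port_utilization=0.30),
]
print(pick_handover(options).ix_location)  # -> Frankfurt
```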

Catch up with the first two articles in the series: