What is Network Round-Trip Delay?
Round-trip delay (RTD), also known as round-trip time (RTT) or ping time, is the time it takes for a signal to reach its destination across a network plus the time it takes for an acknowledgment signal to return to the originator. This measurement includes all delays caused by routing, the transmission medium, and network traffic. The return signal does not have to follow the same path as the initial signal.
How do you measure Network Round-Trip Delay?
Using the standard ping tools found on most computers, network round-trip delay can be tested easily from the command line. The ping time (sometimes called the ping rate) is typically measured in milliseconds (ms). A ping is commonly reported as either round-trip time (RTT) or time to first byte (TTFB). RTT is the total time it takes for a data packet to travel from the client to the server and back, and it is the standard reporting measurement. TTFB is the time it takes for the first byte of the client's data to reach the server, which should be roughly half the RTT on a symmetric path. RTT can also be approximated programmatically, as in the sketch below.
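The following Python sketch is one way to approximate RTT without the ping utility: it times a TCP handshake, which completes once the server's SYN-ACK arrives, so the elapsed time is close to one round trip. The target host and port (example.com on 443) are illustrative placeholders, not part of any standard tool.

```python
import socket
import time


def tcp_rtt_ms(host: str, port: int = 443, timeout: float = 3.0) -> float:
    """Approximate network round-trip time by timing a TCP handshake.

    connect() returns once the server's SYN-ACK arrives, so the elapsed
    time is close to one round trip on the path to the server.
    """
    start = time.perf_counter()
    with socket.create_connection((host, port), timeout=timeout):
        elapsed = time.perf_counter() - start
    return elapsed * 1000.0  # seconds -> milliseconds


if __name__ == "__main__":
    # example.com:443 is only a placeholder; use any reachable host and open port.
    samples = [tcp_rtt_ms("example.com") for _ in range(5)]
    print("RTT samples (ms):", [round(s, 1) for s in samples])
    print(f"average RTT: {sum(samples) / len(samples):.1f} ms")
```

Because this measures a TCP handshake rather than an ICMP echo, the numbers can differ slightly from ping, but they should be of the same order of magnitude on the same path.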
How does Round-Trip Delay (RTD) impact application performance?
Round-trip delay is closely related to network latency. RTD is the measurement from which network latency and network jitter are calculated, and because of this it shares the same impacts that latency and jitter have on quality of service (QoS). As RTD increases, so does network latency, and as the variance between RTDs in a transmission increases, so does the jitter.
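As a rough illustration of that relationship, the sketch below takes a list of RTT samples in milliseconds, reports their mean as a latency estimate, and uses the mean absolute difference between consecutive samples as a simple jitter estimate. This is one common convention, not the only one; other definitions of jitter, such as the RFC 3550 smoothed estimator, are also in use.

```python
def latency_and_jitter(rtt_samples_ms: list[float]) -> tuple[float, float]:
    """Return (mean RTT, mean absolute difference between consecutive RTTs)."""
    latency = sum(rtt_samples_ms) / len(rtt_samples_ms)
    diffs = [abs(b - a) for a, b in zip(rtt_samples_ms, rtt_samples_ms[1:])]
    jitter = sum(diffs) / len(diffs) if diffs else 0.0
    return latency, jitter


# Hypothetical RTT samples in milliseconds (e.g. collected with the sketch above).
samples = [24.1, 25.3, 23.8, 31.0, 24.6]
latency, jitter = latency_and_jitter(samples)
print(f"latency ≈ {latency:.1f} ms, jitter ≈ {jitter:.1f} ms")
```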
What are the factors that impact Round-Trip Delay (RTD)?
- Transmission Medium—Network traffic speed is limited by the physical connections it travels through. Because a signal can only travel as fast as its medium of transport allows, understanding the slowest and fastest mediums along a signal's route becomes crucial in controlling RTD. For example, fiber optic cable is roughly 100x faster than coax cable, and over greater distances mostly fiber optic is used. However, the connection into most homes is some variation of metal cable. This means that at these points of medium change, we can expect transmission delays as the signal slows down when passing from fiber optic onto the copper medium. Further, inside the home or office, the signal may then be sent to computers via Wi-Fi, changing medium once again.
- Local Area Network Traffic—Local network traffic causes congestion, which can bottleneck the network. This directly impacts both transmitting and receiving signals from connected devices. For example, the use of streaming services by multiple devices on the same network will cause congestion, and other users trying to send and receive transmissions may experience increases in RTD.
- Server Response Time—On the opposite side of a transmission from the requesting user is the server response. How quickly a server can respond to a request will impact the RTD. In fact, this is the basis of a classic attack by cybercriminals, the denial-of-service (DoS) attack, in which a server is flooded with requests in order to overload it and either stall access or deny it completely. The general principle is that the more requests a server must attend to, the more likely its response time, and therefore RTD, will suffer (the sketch after this list separates the network round trip from server response time).
- Internet Routing And Congestion—Perhaps the aspect of round-trip delay least controlled by IT departments is the routing of a signal across the internet. A signal travels from sender to receiver and back through a route that may pass through any number of nodes along the internet. Each node has its own network traffic that the signal must pass through, which factors into the RTD calculation. Generally, the more nodes the signal travels through, the more congestion it must contend with, and ultimately the longer the round trip will take.
- Physical Distance Of Transmission—Closely related to the transmission medium, the time a signal needs to cover the physical distance to its destination is limited by the laws of physics and the speed of light. In most long-distance transmissions, fiber optics use light to send massive amounts of data efficiently, but even that is not instantaneous.
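To make the server-response factor concrete, here is a minimal sketch (Python standard library only; example.com and port 80 are placeholder values) that compares the raw TCP-handshake RTT with the time to receive an HTTP response from the same server. The gap between the two numbers is a rough indicator of how much of the delay comes from server processing rather than the network itself.

```python
import http.client
import socket
import time

HOST = "example.com"  # placeholder target; substitute a server you are allowed to test
PORT = 80


def handshake_rtt_ms(host: str, port: int) -> float:
    """Network-only estimate: time a single TCP handshake."""
    start = time.perf_counter()
    with socket.create_connection((host, port), timeout=3.0):
        return (time.perf_counter() - start) * 1000.0


def http_response_ms(host: str, port: int) -> float:
    """Roughly one RTT plus server processing: time from sending an HTTP
    HEAD request until its response headers arrive. The connection is
    opened beforehand so the handshake itself is not counted."""
    conn = http.client.HTTPConnection(host, port, timeout=3.0)
    try:
        conn.connect()                      # handshake excluded from timing
        start = time.perf_counter()
        conn.request("HEAD", "/")
        conn.getresponse()                  # returns once the headers are in
        return (time.perf_counter() - start) * 1000.0
    finally:
        conn.close()


if __name__ == "__main__":
    rtt = handshake_rtt_ms(HOST, PORT)
    response = http_response_ms(HOST, PORT)
    print(f"TCP handshake RTT:        {rtt:.1f} ms")
    print(f"HTTP HEAD response time:  {response:.1f} ms")
    print(f"approx. server-side time: {response - rtt:.1f} ms")
```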
Related Terms
Network Latency
Network latency is the time it takes a data packet to travel from its source to its destination across a network. In terms of user experience, network latency translates to how fast a user's action produces a response from the network, for example how quickly a web page loads over the internet, or how responsive an online game is to the gamer's commands.