Latency is the amount of time a message takes to traverse a system.
In a computer network, it is an expression of how much time it takes for a packet of data to get from one designated point to another. It is sometimes measured instead as the round-trip time: the time required for a packet to be returned to its sender.
Latency depends on the speed of the transmission medium (e.g., copper wire, optical fiber or radio waves) and on the delays introduced by devices along the way (e.g., routers and modems). Low latency indicates an efficient network.
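The contribution of the transmission medium can be estimated directly. As a hedged back-of-the-envelope sketch (the distance and velocity factor below are illustrative assumptions, not measurements), signals in optical fiber travel at roughly two-thirds the speed of light in a vacuum:

```python
# Estimate one-way propagation delay over an optical fiber link.
# Assumptions: signals propagate at ~2/3 the vacuum speed of light,
# a commonly cited velocity factor for fiber.

SPEED_OF_LIGHT = 3.0e8   # meters per second, in a vacuum
FIBER_FACTOR = 2 / 3     # assumed velocity factor for optical fiber

def propagation_delay_ms(distance_m: float) -> float:
    """One-way propagation delay over fiber, in milliseconds."""
    return distance_m / (SPEED_OF_LIGHT * FIBER_FACTOR) * 1000

# Example: a hypothetical 3,000 km fiber link
print(round(propagation_delay_ms(3_000_000), 1))  # 15.0
```

Even before any router or modem adds queuing delay, a 3,000 km link costs about 15 ms each way; no improvement in hardware can beat the propagation limit of the medium itself.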
Latency and throughput are the two most fundamental measures of network performance. They are closely related, but whereas latency measures the amount of time between the start of an action and its completion, throughput is the total number of such actions that occur in a given amount of time.
Sending data in large packets has a higher throughput than sending the same data in small packets both because of the smaller number of packet headers and because of reduced startup and queuing latency. If the data is streamed (i.e., sent in a continuous flow), propagation latency has little effect on throughput, but if the system waits for an acknowledgment after each packet before sending the next, the resulting high propagation latency will greatly reduce throughput.
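The effect described above can be sketched numerically. The link rate, round-trip time, and packet size below are illustrative assumptions chosen to make the contrast visible, not measurements of any real network:

```python
# A sketch of why waiting for a per-packet acknowledgment collapses
# throughput on a high-latency link. All figures are assumptions.

LINK_RATE = 100e6       # bits per second (100 Mbit/s link)
RTT = 0.050             # seconds (50 ms round-trip propagation latency)
PACKET_SIZE = 1500 * 8  # bits per packet (1500-byte packets)

def streamed_throughput() -> float:
    """Continuous flow: the link stays full, so propagation latency
    has little effect and throughput approaches the link rate."""
    return LINK_RATE

def stop_and_wait_throughput() -> float:
    """One packet per round trip: the sender pays the transmission
    time plus a full RTT waiting for each acknowledgment."""
    send_time = PACKET_SIZE / LINK_RATE
    return PACKET_SIZE / (send_time + RTT)

print(f"streamed:      {streamed_throughput() / 1e6:.1f} Mbit/s")
print(f"stop-and-wait: {stop_and_wait_throughput() / 1e6:.3f} Mbit/s")
```

Under these assumptions, stop-and-wait delivers well under 1% of the link rate: the 50 ms of idle waiting per packet dwarfs the fraction of a millisecond it takes to actually transmit the data.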
Latency is also an important consideration in other areas of computing, particularly where real time (i.e., nearly instantaneous) response is required. For example, in some Internet games, high latency (also called lag) can make it difficult to determine which player performed an action first (such as shooting an opponent or answering a question). In playing computer-based musical instruments, latencies greater than 100 milliseconds deprive players of the nearly instantaneous feedback they require.
The ping and traceroute commands are widely used to identify latency problems on networks.
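As a sketch, the system ping command can be driven from a script and its reported round-trip times collected. This assumes a Unix-style ping whose output contains lines of the form "time=11.2 ms" (the exact format and flags vary by platform; Windows, for instance, uses -n rather than -c):

```python
# Run the system ping command and extract the round-trip times it
# reports. Assumes Linux/macOS-style ping output ("time=11.2 ms").
import re
import subprocess

def parse_rtt_ms(ping_output: str) -> list[float]:
    """Extract per-packet round-trip times (ms) from ping output."""
    return [float(m) for m in re.findall(r"time=([\d.]+) ?ms", ping_output)]

def ping_rtts(host: str, count: int = 4) -> list[float]:
    """Ping a host and return the measured round-trip times in ms."""
    result = subprocess.run(
        ["ping", "-c", str(count), host],  # -c sets the packet count on Unix
        capture_output=True, text=True,
    )
    return parse_rtt_ms(result.stdout)
```

For example, ping_rtts("example.com") might return a few RTT samples in milliseconds; consistently high or wildly varying values point to a latency problem somewhere along the path, which traceroute can then localize hop by hop.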
Created September 20, 2005.