Understanding Database Latency
Latency is the amount of time you must wait to get a response to a request. It is measured at a specific component, such as a storage or network device. When you ask a computer to do something, each component involved in that request has a minimum amount of time it takes to reply with an answer, even if the answer is a null value. If your database requests a single block from storage, then the time it takes to receive that block is storage latency. If the storage is attached over an interconnect like Fibre Channel or SAS, then the interfaces and cables each add their own latency.

Latencies are often measured in milliseconds (ms). There are 1,000 milliseconds per second, and most people cannot perceive anything smaller than 1 millisecond. However, computers are wicked fast, and latency may need to be measured in microseconds (each μs is a millionth of a second) or even nanoseconds (each ns is a billionth of a second). Hard dr...
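To make these units concrete, here is a minimal sketch of measuring the latency of a single block read and expressing it in ns, μs, and ms. The file path, block size, and helper name are illustrative assumptions, not a specific database's mechanism; a throwaway temp file stands in for a database block.

```python
import os
import tempfile
import time

def read_block_latency_ns(path, block_size=4096):
    """Time a single block read from storage, returning nanoseconds."""
    fd = os.open(path, os.O_RDONLY)
    try:
        start = time.perf_counter_ns()
        os.read(fd, block_size)          # one block request, like a database read
        return time.perf_counter_ns() - start
    finally:
        os.close(fd)

# A throwaway file stands in for a database block on storage.
with tempfile.NamedTemporaryFile(delete=False) as f:
    f.write(os.urandom(4096))
    path = f.name

ns = read_block_latency_ns(path)
# 1 ms = 1,000 us = 1,000,000 ns
print(f"{ns} ns = {ns / 1_000:.1f} us = {ns / 1_000_000:.3f} ms")
os.unlink(path)
```

Note that a single sample like this is noisy; real latency measurements repeat the operation many times and report a distribution (average, percentiles), not one number.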