Why are latency and bandwidth separate measures? If latency is the delay before data arrives, and bandwidth is data over time, shouldn't one affect the other? If it takes 5s for a car to start and 10s to reach its destination, the average speed would be spread out across the whole 15s. Shouldn't it be the same here?
There are some good explanations here but I’ll add another anyway because it provides a very concrete example.
When you connect a hose to the tap and open the valve, the time until water comes out the end of your hose is the latency. Short hose, short latency; long hose, long latency.
The thickness of the hose is your bandwidth. Thick hose, higher bandwidth; thin hose, lower bandwidth.
Now, taking a real use case, if you want to fill up a swimming pool what type of hose should you use? Latency doesn’t really matter because you’re only going to incur it once when you turn on the hose. Bandwidth however will make a huge difference to how long the pool takes to fill. Double the bandwidth, halve the time.
On the other end of the scale, take a bathroom sink tap. You want to wash your hands. When you turn on the tap, you’d like water now (low latency)! You’re going to incur the latency multiple times a day and it could be a significant portion of total hand washing time if it’s not low. You don’t need much water though to wash your hands, so a relatively thin pipe (low bandwidth) is appropriate.
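The same trade-off can be put in numbers. A minimal sketch (with made-up figures, and a hypothetical `transfer_time` helper, not any real networking API): the total time for one transfer is a fixed latency cost plus the data size divided by bandwidth. For one huge transfer the latency term is a rounding error, while for many tiny transfers it dominates.

```python
def transfer_time(latency_s, size_mb, bandwidth_mb_per_s):
    """Total time for one transfer: a fixed latency cost, then data over bandwidth."""
    return latency_s + size_mb / bandwidth_mb_per_s

# "Filling the pool": one big transfer. Latency barely registers,
# and doubling bandwidth roughly halves the total time.
pool = transfer_time(0.05, 10_000, 100)        # about 100.05 s
pool_fat_pipe = transfer_time(0.05, 10_000, 200)  # about 50.05 s

# "Washing your hands": 1000 tiny transfers. Bandwidth is almost irrelevant;
# the latency is paid on every single one and dominates the total.
many_small = sum(transfer_time(0.05, 0.01, 100) for _ in range(1000))  # about 50.1 s
```

Note that in the "many small transfers" case, cutting latency in half saves far more time than doubling bandwidth would, which is exactly why the sink tap wants low latency and the pool hose wants a thick pipe.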