Why are latency and bandwidth separated? If latency is the delay of data, and bandwidth is data over time, shouldn’t one affect the other? If it takes 5s for a car to start, and 10s to reach its destination, then the average speed would be spread out across the whole 15s. Shouldn’t this be the same?


In: Technology

8 Answers

Anonymous 0 Comments

Consider conventional mail (via a post office). This is a high-latency, high-bandwidth type of communication. It takes a few days to get a shipment somewhere, but that shipment can contain almost unlimited amounts of data.

Contrast this with SMS (text messaging). This is fairly low latency – your text message arrives almost immediately – but low bandwidth (you can’t send all that much data).

Another way to consider this:

Latency is *responsiveness*. It’s how long it takes to get a reply once I send a message.

Bandwidth is *volume*. It’s how much information I can send over time.
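To put numbers on these two definitions, here's a tiny sketch (the figures are illustrative, not from the post): total delivery time is the latency plus the time the data spends squeezing through the pipe.

```python
def transfer_time(size_bits, latency_s, bandwidth_bps):
    """Seconds until a message of size_bits has fully arrived."""
    return latency_s + size_bits / bandwidth_bps

# A small message on a fast link: latency dominates.
small = transfer_time(8_000, 0.05, 1_000_000)          # ~0.058 s
# A huge message on the same link: bandwidth dominates.
big = transfer_time(8_000_000_000, 0.05, 1_000_000)    # ~8000 s
```

Which term matters depends entirely on how big the message is relative to the pipe, which is why the two numbers are quoted separately.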

Anonymous 0 Comments

There are some good explanations here but I’ll add another anyway because it provides a very concrete example.

When you connect a hose to the tap and open the valve, the time until water comes out the end of your hose is the latency. Short hose, short latency, long hose, long latency.

The thickness of the hose is your bandwidth. Thick hose, higher bandwidth, thinner hose, lower bandwidth.

Now, taking a real use case, if you want to fill up a swimming pool what type of hose should you use? Latency doesn’t really matter because you’re only going to incur it once when you turn on the hose. Bandwidth however will make a huge difference to how long the pool takes to fill. Double the bandwidth, halve the time.

On the other end of the scale, take a bathroom sink tap. You want to wash your hands. When you turn on the tap, you’d like water now (low latency)! You’re going to incur the latency multiple times a day and it could be a significant portion of total hand washing time if it’s not low. You don’t need much water though to wash your hands, so a relatively thin pipe (low bandwidth) is appropriate.

Anonymous 0 Comments

Flash has both high latency (a slow time to initial response) and low bandwidth (slow transmission of the message).

Anonymous 0 Comments

It depends on what you need. If you’re designing a car to drive 100 miles, then the speed of the car is more relevant. If you’re designing a car that only moves 5 feet, then it’ll spend most of its time starting up, so if you need it to make that trip faster, the startup time is what you need to improve.

Anonymous 0 Comments

You can also think of it with a river analogy. Measure the average speed of the water droplets: that gives you the latency from one point to another. Now count how many droplets pass a “line” on the river during a certain amount of time: that’s the bandwidth.

Anonymous 0 Comments

Let’s look at two extremes here: Morse code flashed with lights vs. IPoAC.

If you’ve got two ships near each other you can send a message by flashing Morse code via lights. The time it takes for each flash to reach the other ship is extraordinarily short, nanoseconds or microseconds at most, but because you have to manually flash the light on and off your bandwidth is quite low, so it can take minutes to send a sentence. This is ultra low latency but also ultra low bandwidth.

The other end of the spectrum is IPoAC, or Internet Protocol over Avian Carrier (my favorite implementation of IP). It involves loading all your data onto an SD card, strapping it to the leg of a carrier pigeon and sending it on its way. It’ll take hours or even days for your data to arrive, but you can send a terabyte this way pretty easily (but send it twice, because predators = packet loss). This is ultra high latency and ultra high bandwidth.

If you’re sending a short message you can do it far faster by blinking the lights, but if you’re sending a long message, the low bandwidth of the blinking lights means the carrier pigeon catches up and wins, even though it needs a couple of hours.
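You can even work out the crossover point. Assuming some hypothetical numbers (a lamp signalling at ~5 bits/s with near-zero latency, and a pigeon that takes 2 hours but delivers its whole payload at once):

```python
lamp_bps = 5                    # hand-keyed Morse lamp, assumed rate
pigeon_latency_s = 2 * 3600     # the pigeon's 2-hour flight

def lamp_time(bits):
    return bits / lamp_bps      # latency ~0, purely bandwidth-limited

def pigeon_time(bits):
    return pigeon_latency_s     # purely latency-limited; SD card holds it all

# The message size at which the two methods tie:
crossover_bits = lamp_bps * pigeon_latency_s   # 36,000 bits
```

Below 36,000 bits (about 4.5 KB) the lamp wins; above it, strap the SD card to the pigeon.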

To take your car analogy: 5 seconds to start and 10 seconds to reach the destination is fine for sending a single car. But what if I want to move a bunch? Then I can spend 10 seconds starting a truck that takes 20 seconds to get to the destination but carries 8 cars there. That’s a much faster way to move cars from A to B even though the latency is a lot higher; if you’re only moving a single car, though, it’ll be slower.

The connection you want depends on your needs.
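The car-vs-truck trade-off above can be sketched in a few lines, assuming trips happen one after another rather than in parallel (my assumption, not the original answer's):

```python
import math

def total_time(n_cars, startup_s, trip_s, capacity):
    """Total seconds to move n_cars when each trip carries up to `capacity`."""
    trips = math.ceil(n_cars / capacity)
    return trips * (startup_s + trip_s)

one_car_by_car = total_time(1, 5, 10, 1)      # 15 s: the car wins
eight_by_car = total_time(8, 5, 10, 1)        # 120 s, one at a time
eight_by_truck = total_time(8, 10, 20, 8)     # 30 s, despite higher latency
```

The higher-latency truck is four times faster for the bulk job, and slower for the single car, exactly as the answer describes.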

Anonymous 0 Comments

Sending data is actually a two-way communication. Computer 1 sends a packet of data to computer 2. Computer 2 then sends an acknowledgement back to computer 1 to say that it’s received the packet and is ready for the next. Latency is a measure of how fast this round trip happens. Bandwidth is a measure of how much data can physically be shoved down a cable in a set amount of time.

A high latency can mean computer 1 can’t send data as fast as it’s capable of, because it spends a long time waiting for acknowledgements from computer 2.
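This waiting effect can be quantified. A rough sketch with illustrative numbers: if the sender can only have one "window" of unacknowledged data outstanding, throughput is capped at window size divided by round-trip time, regardless of how fast the cable is.

```python
def max_throughput_bps(window_bits, rtt_s):
    """Best-case throughput when one window must be acked per round trip."""
    return window_bits / rtt_s

link_bps = 1_000_000_000        # a 1 Gb/s cable
window = 512 * 1024 * 8         # a 512 KiB window, in bits
rtt = 0.1                       # 100 ms round trip

achievable = min(link_bps, max_throughput_bps(window, rtt))
# ~42 Mb/s: latency, not the cable, is the bottleneck here.
```

On the same cable with a 1 ms round trip, the cap jumps to ~4.2 Gb/s and the cable itself becomes the limit again.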

Anonymous 0 Comments

A good, real-world way to understand it is to look at satellite internet, using geostationary communication satellites, 36,000 km out in space. With a big dish and an expensive plan, you can get high bandwidth out of this. Gigabytes of data sent up and down.

But if you send a message, you can’t get around the fact that it takes 0.12 seconds for the signal to get from your dish to the satellite, another 0.12 seconds for it to get back down to the base station, and the reply to your message takes 0.12 seconds to get back up to the satellite and another 0.12 to get back down. Even if all the other stuff – your local router, the satellite hardware handling your data, the terrestrial link to and from the server, and the server servicing your request – happens instantly, you still have about 0.5 seconds of lag. In practice, lag usually pushes out to a second.

But because your bandwidth is high, when that message comes it can contain lots of data. There can be a 20 Gb/s stream of data coming in, one packet after another. And, yes, that means there will be about 5 Gb of data ‘in flight’ between the ground station and you at any time. If you sent a message that said ‘stop now’, there’d be a 0.5 second delay, and another 10 Gb of data would arrive in the meantime.

This is fine if you are streaming a movie, or downloading a large file. But web pages are often built from many different parts, and lots of these parts tell your browser to download other parts. This costs you many of those second-long delays, which means a custom proxy is essential: one that renders the page at the ground station and sends all the page data in one stream. Phone services are annoying, but workable in a pinch. Gaming is right out.

So this explains how a service can have high bandwidth but high latency. The two features of a connection are quite separate.