I don’t quite understand the difference in the technology. An example I’m thinking of is that if you’re streaming a TV program online that has a ton of simultaneous viewership (say, the Super Bowl), you’re liable to run into bandwidth issues—buffering, freezes, pixelated video, etcetera, as the servers are taxed with keeping up with the demand of all the viewers.
The same thing doesn’t happen with cable TV, even though it is also a digital signal coming over a wire. No matter how many people are watching, the cable TV provider is able to deliver perfect-quality video to everyone with no buffering.
What is the difference in the tech that allows cable TV to do this but not online streaming video?
> if you’re streaming a TV program online that has a ton of simultaneous viewership (say, the Super Bowl), you’re liable to run into bandwidth issues—buffering, freezes, pixelated video, etcetera, **as the servers are taxed** with keeping up with the demand of all the viewers
This is part of the reason, but streaming services usually mitigate it by spinning up additional servers to handle spikes in demand. The other reason (and probably the main one) is your home internet connection: while it’s advertised as (let’s say) 200 Mbps, you’re not paying for a dedicated connection, i.e. one where you are guaranteed to get that speed at all times. They deliver “up to 200 Mbps”, but if lots of people in your area want to watch the game at the same time, the shared network can’t cope with the demand, and everyone gets a worse speed.
To use numbers (just as an example): if the ISP has 10 customers on your street at 200 Mbps each, it might get away with only a 1000 Mbps trunk cable, because not everyone will be using their full 200 Mbps allotment at the same time.
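To see how that sharing plays out, here’s a minimal sketch in Python using the hypothetical numbers above (10 customers, 200 Mbps advertised, a 1000 Mbps trunk), assuming the trunk is split evenly among whoever is active at once:

```python
# Hypothetical oversubscription math: 10 customers each sold
# "up to 200 Mbps", all sharing a single 1000 Mbps trunk.
TRUNK_MBPS = 1000
ADVERTISED_MBPS = 200
CUSTOMERS = 10

for active in range(1, CUSTOMERS + 1):
    # Assume fair sharing: each active customer gets an equal slice
    # of the trunk, capped at the speed they were sold.
    per_user = min(ADVERTISED_MBPS, TRUNK_MBPS / active)
    print(f"{active:2d} active customer(s) -> {per_user:6.1f} Mbps each")
```

With five or fewer customers streaming, everyone still gets their full 200 Mbps; once all ten tune in for the game, each gets only 100 Mbps, which is exactly when the “up to” fine print kicks in.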