I don’t quite understand the difference in the technology. An example I’m thinking of is that if you’re streaming a TV program online that has a ton of simultaneous viewership (say, the Super Bowl), you’re liable to run into bandwidth issues—buffering, freezes, pixelated video, etcetera, as the servers are taxed with keeping up with the demand of all the viewers.
The same thing doesn't happen with cable TV, even though it's also a digital signal coming over a wire. No matter how many people are watching, the cable TV provider is able to put out perfect-quality video to everyone with no buffer time.
What is the difference in the tech that allows cable tv to do this but not online streaming video?
Cable throughput is fixed.
Say you always know the resolution and frame rate of a video stream: that's what cable is. There is no need for elasticity.
An internet stream is variable: you can change the resolution and sometimes the frame rate. If a ton of people suddenly request the stream, it puts strain on the serving servers, whereas a cable "broadcast" is always pushing the same signal.
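If it helps, here's a tiny sketch of that elasticity in Python. The bitrate ladder is made up for illustration, not any real service's encoding profile, but it's roughly how adaptive players (HLS/DASH style) behave:

```python
# Hypothetical bitrate ladder (resolution, required Mbps). Illustrative
# numbers only, not any real service's encoding profile.
LADDER = [
    ("240p", 0.7),
    ("480p", 1.5),
    ("720p", 3.0),
    ("1080p", 6.0),
]

def pick_rendition(measured_mbps: float) -> str:
    """Pick the best rendition the measured throughput can sustain.

    This is the "elastic" part: when the network is congested, the
    player silently drops to a lower quality instead of stalling.
    Cable TV has no equivalent step; the signal is fixed.
    """
    best = LADDER[0][0]  # worst case: lowest rung
    for name, mbps in LADDER:
        if measured_mbps >= mbps * 1.2:  # keep ~20% headroom
            best = name
    return best

print(pick_rendition(8.0))   # 1080p
print(pick_rendition(2.5))   # 480p (congested network)
```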
Cable TV is broadcasting the same thing to everyone, and if you miss it, either your DVR was recording it or you missed it (though newer cable TV is basically just internet streaming). So it's more like one stream of data being copied to everyone.
Netflix can be sending "series A episode 5 at 4:00" to one person, "series A episode 5 at 5:00" to a different person, while sending "series B episode 25 at 23:00" to someone else. People can skip forward and backward and immediately switch to a different series or episode. So there are multiple different data streams sent to individuals (see the sketch below).
Services like YT have orders of magnitude more videos, with many more users at different parts of each video. ISPs are often also cable TV providers. Cable TV always gets priority to guarantee quality; ISPs sometimes drop internet bandwidth when there are too many users. Cable TV can drop bandwidth too, but it's very rare.
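A little Python sketch of that difference: broadcast is one signal with no per-viewer state, while each streaming viewer is a separate session the service has to track. The names and structure here are hypothetical, just to show the idea:

```python
# Sketch of unicast streaming as per-viewer state. Hypothetical
# structure, mirroring the Netflix example above.
from dataclasses import dataclass

@dataclass
class Session:
    title: str
    episode: int
    position_s: int  # how far into the episode this viewer is

# Broadcast: one signal, zero per-viewer state.
broadcast_signal = ("Super Bowl", "live")

# Unicast: the service tracks a separate session per viewer.
sessions = {
    "alice": Session("Series A", 5, 4 * 60),    # 4:00 in
    "bob":   Session("Series A", 5, 5 * 60),    # 5:00 in
    "carol": Session("Series B", 25, 23 * 60),  # 23:00 in
}

# Seeking or switching shows only touches that viewer's session...
sessions["alice"].position_s = 0  # Alice rewinds to the start

# ...but serving cost grows with the number of sessions.
print(f"{len(sessions)} independent streams to serve")
```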
Cable TV just broadcasts the signal out. It's running on the line 24/7, and sometimes you tune your TV in to watch it.
Imagine one guy with a megaphone just yelling out. It doesn't really matter if one person or 1,000 are listening.
The internet requires two computers to communicate back and forth; sometimes packets are dropped and need to be resent. More viewers means more work for the server, and if too many try to connect at one time, the server can get overwhelmed.
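Here's a toy model of the megaphone idea in Python. The cost numbers are invented placeholders; the point is just the shape of the scaling:

```python
# Toy scaling model: broadcast cost is flat, unicast cost grows with
# the audience. The per-viewer cost figure is an invented placeholder.
BROADCAST_COST = 1.0             # one transmitter, regardless of audience
UNICAST_COST_PER_VIEWER = 0.001  # each viewer needs their own stream

def broadcast_load(viewers: int) -> float:
    # Cable: the same signal is on the wire whether 1 or 1,000,000 watch.
    return BROADCAST_COST

def unicast_load(viewers: int) -> float:
    # Streaming: every viewer is a separate two-way session the server
    # must track, encode for, and retransmit lost packets to.
    return viewers * UNICAST_COST_PER_VIEWER

for n in (1, 1_000, 1_000_000):
    print(f"{n:>9} viewers: broadcast {broadcast_load(n):.3f}, "
          f"unicast {unicast_load(n):.3f}")
```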
Isn't a lot of modern "cable TV" now IP-based, though? So it more resembles point-to-point streaming, not broadcast. Then perhaps the answer is that since your TV provider controls the "network" (IP over coax) and runs the "servers" that do the streaming, they can be more reliable than streams coming over other networks.
I think you are getting a lot of partial answers that are possibly not answering your question.
Cable TV broadcasts continuously: each channel has its own dedicated frequency and bandwidth and is delivered over its own infrastructure. In traditional cable, a connection is made between you and the cable company, each channel occupies its own frequency and bandwidth, and the only data on that connection is the cable TV content.
It's dedicated, so it's designed to always have the bandwidth to deliver what you are watching, independent of the number of viewers. An ELI5 analogy: if you watch a movie in a theater, it doesn't matter how many people are in the room; the movie looks the same.
Streaming is sent over a shared internet connection that is not dedicated, and a stream is being sent just to you. You requested the data, and you can pause, fast forward, or rewind. The bandwidth depends on the network fabric between you and the provider. To extend the theater analogy: every time somebody watches a stream, it's as if the theater had to set up a private room with a VCR and play the movie just for you. There are limits to how many rooms can be created.
There are hybrid delivery models that blend the two as well.
TLDR: Cable TV is sent over dedicated bandwidth and is agnostic to the number of viewers. Streaming is sent over shared bandwidth and has a practical maximum number of viewers.
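To put rough numbers on "dedicated bandwidth," here's some back-of-the-envelope arithmetic in Python. The figures (6 MHz RF channels, ~38.8 Mbps per 256-QAM channel, 750 MHz of usable spectrum) are typical North American cable-plant assumptions, not exact:

```python
# Back-of-the-envelope cable capacity. Assumed figures: 6 MHz RF
# channels carrying ~38.8 Mbps each under 256-QAM (typical North
# American cable plant; treat these as illustrative assumptions).
CHANNEL_WIDTH_MHZ = 6
MBPS_PER_QAM_CHANNEL = 38.8
USABLE_SPECTRUM_MHZ = 750  # assumed usable downstream spectrum

channels = USABLE_SPECTRUM_MHZ // CHANNEL_WIDTH_MHZ
total_mbps = channels * MBPS_PER_QAM_CHANNEL
print(f"{channels} RF channels, ~{total_mbps:.0f} Mbps total")

# The key property: this capacity is carved up per *channel*, not per
# *viewer*. Adding viewers doesn't consume any more of it.
for viewers in (1, 10_000, 5_000_000):
    print(f"{viewers:>9} viewers -> still {total_mbps:.0f} Mbps on the wire")
```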
> if you’re streaming a TV program online that has a ton of simultaneous viewership (say, the Super Bowl), you’re liable to run into bandwidth issues—buffering, freezes, pixelated video, etcetera, **as the servers are taxed** with keeping up with the demand of all the viewers
This is part of the reason, but providers usually mitigate it by setting up additional servers in case of an increase in demand. The other reason (and probably the main one) is your home internet connection: while it's advertised as (let's say) 200 Mbps, you're not paying for a dedicated connection, i.e., one where you're guaranteed that speed. They deliver "up to 200 Mbps," but if multiple people want to watch the game, the network can't cope with the demand and everyone gets a worse speed.
To use numbers (just as an example): say they have 10 clients on your street at 200 Mbps each. They might get away with only a 1000 Mbps trunk cable, because not everyone will be using their entire 200 Mbps allotment at the same time.
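Here's that arithmetic spelled out in a quick Python sketch, using the same made-up numbers:

```python
# Oversubscription arithmetic, using the invented numbers from the
# example above (10 clients at 200 Mbps on a 1000 Mbps trunk).
TRUNK_MBPS = 1000
CLIENTS = 10
ADVERTISED_MBPS = 200

ratio = (CLIENTS * ADVERTISED_MBPS) / TRUNK_MBPS
print(f"Oversubscription ratio: {ratio:.0f}:1")  # 2:1

# What each client actually gets if `active` of them stream at full
# rate simultaneously and the trunk is shared fairly:
for active in (1, 3, 5, 10):
    per_client = min(ADVERTISED_MBPS, TRUNK_MBPS / active)
    print(f"{active:>2} active streamers -> {per_client:.0f} Mbps each")
```

With these numbers, everyone still gets their full 200 Mbps until more than 5 clients stream at once; at 10 active streamers, each gets only 100 Mbps. That's the "everyone watches the game at the same time" problem.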