Why do online broadcasts suffer in quality when many people watch them, but TV broadcasts are unaffected?

167 views

For example, with a paid online soccer broadcast, if a lot of people are watching, the quality drops and it starts buffering and lagging. But that never happens with a soccer TV broadcast, no matter how many people watch it. Why is that?


4 Answers

Anonymous 0 Comments

When you broadcast a TV show, the signal is basically sent to everyone, and everyone with a TV can choose to listen to the signal or ignore it. This is easiest to picture with an over-the-air broadcast, but cable works in a similar way. When you run a TV station, you're broadcasting from an antenna, so anyone that signal reaches can turn on their TV and tune to that channel to watch it.

The way online video works is that, if you're watching something, the video is being sent to you specifically. If 1,000 people are watching the same online stream, then the streaming provider is sending out 1,000 separate video streams. As more people start watching, it becomes harder for the service to send video to all of them, so it may have to start sending a lower-quality version of the video to keep up.
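To put rough numbers on that, here is a back-of-the-envelope sketch in Python. The 5 Mbit/s per viewer is just an assumed example bitrate for an HD stream, not a figure from any real service:

```python
# Rough upstream bandwidth a streaming service needs under unicast,
# assuming ~5 Mbit/s per viewer (an illustrative HD bitrate, not a real figure).
BITRATE_MBPS = 5

for viewers in (1, 1_000, 100_000, 1_000_000):
    total_gbps = viewers * BITRATE_MBPS / 1_000
    print(f"{viewers:>9,} viewers -> ~{total_gbps:,.0f} Gbit/s leaving the service")
```

A million viewers at that bitrate would need on the order of 5,000 Gbit/s of outgoing capacity, while a TV transmitter's cost stays the same whether one person or a million tune in.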

Anonymous 0 Comments

TV is based on broadcast technology: you cast a single signal broadly to anyone with the capability to receive it, so the transmitter sends that one signal no matter how many people watch. The internet is inherently unicast: the server sends the signal to one specific client. If two people are watching the same channel, the server sends out two signals, one to each client. That means that when many people are watching, the servers need a lot of bandwidth.
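As a minimal sketch of what unicast means for the server, here is roughly what the sending loop looks like in Python; the client list and how the connections were accepted are hypothetical details left out:

```python
import socket

def fan_out(clients: list[socket.socket], chunk: bytes) -> None:
    # Unicast: the same bytes are sent once per connected viewer,
    # so 1,000 viewers means 1,000 separate sends of the same data.
    for client in clients:
        client.sendall(chunk)
```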

There are ways of solving this. The internet was designed with multicast functionality, a feature where anyone can subscribe to a stream and the routers make sure a copy reaches every subscriber. But it was hard to implement, so for a long time it did not really work. Only fairly recently have ISPs started implementing it, driven by the problems of carrying TV over unicast, and it is still very limited and usually does not work between ISPs. On top of that, most people today use video on demand extensively, if only because they want to pause the live transmission for a bit or because they arrived a few minutes late for the kickoff. Multicast (and broadcast) only work when everyone is watching exactly the same thing, so when people are at different points in the video it has to fall back to unicast anyway. So the benefits of multicast are not easy to realize.
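For what it's worth, IP multicast is something an ordinary program can ask for. Below is a sketch of the receiving side using Python's standard socket module; the group address 239.0.0.1 and the port are arbitrary examples, and it only works where the network between sender and receiver actually routes multicast:

```python
import socket
import struct

GROUP, PORT = "239.0.0.1", 5004   # example multicast group, not a real service

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
sock.bind(("", PORT))

# Tell the network to start delivering this group's packets to us;
# routers along the way duplicate the single upstream copy as needed.
mreq = struct.pack("4sl", socket.inet_aton(GROUP), socket.INADDR_ANY)
sock.setsockopt(socket.IPPROTO_IP, socket.IP_ADD_MEMBERSHIP, mreq)

data, _ = sock.recvfrom(65536)    # one packet of the shared stream
```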

There are content delivery networks (CDNs) that implement something similar. They place servers at the ISPs, fairly close to the clients. The client automatically finds the closest of these servers and gets the stream from it. When multiple people watch the same broadcast through the same server, only one copy of the stream has to be sent to that server, which saves upstream bandwidth. In addition, these servers usually cache the data for some time, which lets video-on-demand clients retrieve it from the local server as well. This is still not a perfect solution, but it is fairly good.
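The caching part can be pictured as a tiny lookup table on the edge server: the first request for a video segment goes upstream to the origin, and every later viewer is served from the local copy. This is only a toy sketch; fetch_from_origin is a made-up placeholder for the real upstream request:

```python
def fetch_from_origin(segment_id: str) -> bytes:
    # Placeholder for the real request back to the streaming service's origin.
    return f"video data for {segment_id}".encode()

cache: dict[str, bytes] = {}

def get_segment(segment_id: str) -> bytes:
    # First request pulls the segment upstream once ...
    if segment_id not in cache:
        cache[segment_id] = fetch_from_origin(segment_id)
    # ... every later request for the same segment is answered locally.
    return cache[segment_id]
```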

Anonymous 0 Comments

Online broadcasts aren’t actually broadcasts

Real TV is a one-to-many broadcast. The TV station sends out the signal at full power, and nothing changes whether a million people tune in or no one does, because it's sending the signal to *everyone* regardless of whether they tune to that channel.

Online broadcasts are many one-to-one transmissions. The server has no way to just punt the bits out into the ether for anyone to find; that's not how the internet works. If you want to watch the broadcast, your computer has to talk to the server and request the packets. The server has to send the same packets over each and every connection, so if there are 1,000 people watching instead of just 1, the server needs to do about 1,000x as much work.

Generally the servers are pretty strong, but for really popular events or relatively low-end streaming services, the viewership can start to overwhelm what the server can handle, so everyone starts getting fewer packets (laggy/stuttering video) or smaller packets (lower resolution) so that the server can keep getting *something* to *everyone*.
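That trade-off, sending something cheaper rather than stalling, can be sketched like this; the renditions and the 20% headroom factor are made-up illustrative numbers, not how any particular player actually decides:

```python
RENDITIONS_MBPS = [8.0, 5.0, 2.5, 1.0]    # e.g. 1080p, 720p, 480p, 240p versions

def pick_bitrate(available_mbps: float) -> float:
    # Choose the best rendition that still leaves some headroom,
    # stepping down in quality instead of letting the video stall.
    for rate in RENDITIONS_MBPS:
        if rate <= available_mbps * 0.8:
            return rate
    return RENDITIONS_MBPS[-1]             # worst case: lowest quality, no stall

print(pick_bitrate(7.0))   # -> 5.0: drop from the 8 Mbit/s tier to 5 Mbit/s
```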

Anonymous 0 Comments

Imagine money being thrown from a helicopter. You stand in the crowd and catch it; that's the TV broadcast. Now imagine it had to be brought in by truck. A big crowd and a jam will form, and you will likely get less, and later. You can send multiple trucks, but that costs more. On the other hand, the truck driver can physically hand you money in another currency if you specifically ask, or even act as a meeting point for you to exchange money with somebody else.

In other words, catching TV signals is easier. You only need a one device, a TV. However with internet connection it’s more difficult. Besides the receiving device, you need a more complicated infrastructure and its upkeep to process and re-transmit signal (e.g. look at the density of cell towers). With it comes the extra benefit internet provides: unlike with TV, you not only receive the signal, but also can send one yourself, and not just to the provider of whatever you are getting, but to anybody else (and receive back from them). It is a great empowering improvement, but complexity of its implementation results in you not getting a free omnipresent source – you have to pay for the internet yourself, or find a free wifi (meaning somebody else pays and graciously shares), while the connection capacity will still be limited. If too many people start using this source at the same time, expect you will get less and later.