What’s actually happening is that each pixel is being told, one by one, what colour it should be and when, and there are many worker processes doing this. Since your screen is made up of millions of these tiny LEDs, you don’t want millions of processors each handling one pixel per frame – that would be pointless and expensive. Instead, the graphics processor splits each frame into as many chunks as it can handle simultaneously and works through the whole frame that way. Because that process takes a small but non-zero amount of time, and our eyes are good at perceiving even slight changes, we sometimes notice it – and going from a black screen with no data to a screen full of data is the most noticeable case. Technically, every frame you render on a TV replaces the entire screen’s data with whatever colours are needed; those rapid colour changes just aren’t as easy to spot, especially when the whole screen doesn’t change at once.
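A toy sketch of the “split the frame into chunks” idea (not how a real GPU actually schedules work – the frame size and the number of 8 workers here are just made-up illustration values):

```python
from concurrent.futures import ThreadPoolExecutor

WIDTH, HEIGHT = 1920, 1080   # pixels in one frame
WORKERS = 8                  # toy number of parallel "pixel fillers"

# The frame buffer: one colour value per pixel, starting as black (0).
frame = [[0] * WIDTH for _ in range(HEIGHT)]

def fill_rows(rows, colour):
    """One worker paints its share of the rows with the new colour."""
    for y in rows:
        for x in range(WIDTH):
            frame[y][x] = colour

def draw_frame(colour):
    # Split the 1080 rows into 8 chunks, so each worker handles ~135 rows,
    # instead of having one worker (or a million) per pixel.
    chunk = HEIGHT // WORKERS
    with ThreadPoolExecutor(max_workers=WORKERS) as pool:
        for i in range(WORKERS):
            rows = range(i * chunk, HEIGHT if i == WORKERS - 1 else (i + 1) * chunk)
            pool.submit(fill_rows, rows, colour)

draw_frame(colour=255)   # go from an all-black frame to an all-white one
```

The point is only that the whole frame gets repainted in passes by a fixed pool of workers, which takes a tiny but non-zero amount of time each frame.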
You mean [in a glitchy sort of way](https://youtu.be/fw5QnRevqX4)?
Modern TV signals are actually sent as compressed digital video, a bit like the video files you have on your computer.
The compression works by mostly sending only the difference between frames. So if your signal cuts out and you lose a bunch of frames, when the signal comes back the decoder is adding those differences to a bad, stale frame – whatever was left on screen – until a complete frame arrives again.
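A minimal sketch of that difference idea, with frames as tiny lists of numbers (real codecs work on blocks and motion prediction, not raw per-pixel subtraction, so this is only an analogy):

```python
def encode_delta(prev, curr):
    """Send only how each pixel changed since the previous frame."""
    return [c - p for p, c in zip(prev, curr)]

def apply_delta(base, delta):
    """The receiver rebuilds a frame by adding the differences to
    whatever frame it currently has on screen."""
    return [b + d for b, d in zip(base, delta)]

frames = [
    [10, 10, 10, 10],   # frame 0 (sent whole, as a complete frame)
    [10, 20, 10, 10],   # frame 1
    [10, 20, 30, 10],   # frame 2
    [10, 20, 30, 40],   # frame 3
]

# Receiver with perfect reception: each delta lands on the right base frame.
screen = frames[0]
for n in range(1, 4):
    screen = apply_delta(screen, encode_delta(frames[n - 1], frames[n]))
print(screen)   # [10, 20, 30, 40] -- matches frame 3

# Receiver that lost frame 1's delta: frame 2's delta gets added to the
# stale frame 0, so the picture comes out wrong (glitchy) until the next
# complete frame is received.
screen = frames[0]
screen = apply_delta(screen, encode_delta(frames[1], frames[2]))
print(screen)   # [10, 10, 30, 10] -- a corrupted frame
```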
Your TV manufacturer could have just shown black until it had received a complete good frame, but they decided it’s better to show something even if it’s corrupt.