Analog television signals use brief voltage dips to mark the horizontal and vertical sync of the image. It's cheap and easy, using simple '80s and '90s circuits, to detect those sync pulses and then precisely time a semiconductor switch that chooses whether the original analog video signal or a digitally generated analog color signal gets displayed at any given part of the screen.
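As a rough illustration, here is a toy software sketch of that switch logic (not real circuitry; the threshold, timings, and voltage levels are all made-up numbers for the example):

```python
# Toy simulation: detect a horizontal-sync voltage dip in a sampled
# analog scan line, then switch the output between the original signal
# and a generated overlay signal for a timed window after the pulse.

SYNC_THRESHOLD = 0.1   # volts: samples below this count as a sync dip
OVERLAY_START = 20     # samples after sync before the overlay window opens
OVERLAY_END = 40       # samples after sync when the window closes

def mix_line(original, overlay):
    """Return the output samples for one scan line."""
    output = []
    sync_at = None
    for i, v in enumerate(original):
        if sync_at is None and v < SYNC_THRESHOLD:
            sync_at = i  # sync pulse detected: start counting from here
        in_window = (sync_at is not None
                     and OVERLAY_START <= i - sync_at < OVERLAY_END)
        # The "switch": pick which analog source drives the screen here
        output.append(overlay[i] if in_window else v)
    return output

# One fake scan line: sync dip at sample 5, flat 0.5 V picture otherwise
line = [0.5] * 5 + [0.0] * 3 + [0.5] * 52
text = [1.0] * 60  # bright overlay signal (e.g. on-screen text)

out = mix_line(line, text)
```

In hardware the "window" would be set by timing circuits rather than sample counting, but the idea is the same: the sync pulse gives you a fixed reference point, and everything after it is just timing.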
Everything is analog at the end of the day – sound, light, electric impulses, etc.
“Digitizing” something is a metaphysical fantasy. It's an idea used to transmit information with less error and/or greater throughput, but it isn't physically real. It's a different perspective on how to interpret information.
In real life there's fuzziness and ambiguity in everything. The way a “digital” system overlays one set of information on top of another is fundamentally, on the inside, the same as the way a CRT overlays one analog signal on another. From a purely physical standpoint they are the same thing, just at different levels of abstraction.