On the “best” TV you could buy in 2005, television shows looked “high quality” or “clear”. Yet now, even if you watch a show uploaded directly by the producers (so it hasn’t been degraded by repeated compression from file transfers), you can tell it’s dated just from the picture quality alone, apart from cultural cues such as fashion, vehicles, etc.
tl;dr: I was watching T.C.A.P. w/ Chris Hansen on TV when it aired; watching it now, I can tell how old it is just from the quality of the recording.
There are many reasons for this.
First, most people’s expectations of TV quality have shifted considerably as sets have improved. Even if you had the best set available in 2005 (I did), it would look like garbage next to a cheap Walmart special today.
Second, even with state-of-the-art equipment, anything recorded in 2005 was captured at much lower quality than it would be today. It isn’t just pixel count (though pixels matter): it’s also things like gamut (the range of colours) and dynamic range (how wide a range of brightness can be recorded). Think of 16-bit vs 32-bit colour, for example. A lot of content was shot on film, and to get it onto TV you needed to scan that film; nobody is going to re-scan 2003 content with 2023 technology because it isn’t worth it.
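To put rough numbers on the bit-depth point, here’s a toy Python calculation. The per-channel framing and the specific bit depths are just illustrative (modern video is usually described as 8-bit vs 10-bit per channel), not taken from any particular broadcast standard:

```python
# Illustrative only: how many distinct values each extra bit of colour depth buys.
# Levels per colour channel = 2 ** bits_per_channel.
for bits in (8, 10, 12):
    levels = 2 ** bits        # distinct brightness steps per channel
    colours = levels ** 3     # combinations across R, G and B
    print(f"{bits}-bit per channel: {levels:>5} levels, ~{colours:.2e} colours")
```

More levels means finer gradations of colour and brightness, which is part of why newer recordings look smoother and more lifelike even at the same resolution.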
Finally, some modern sets aren’t very good at “upscaling” lower quality content. As a result, a higher-resolution set without good upscaling can actually make low quality content look worse.
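For intuition, here’s a toy sketch of the crudest possible upscaler, nearest-neighbour, in plain Python. A real TV’s scaler is far more sophisticated, but this shows why simply stretching low-resolution content onto a sharper panel exposes its flaws: every source pixel just gets repeated, so blockiness becomes easier to see.

```python
# Toy nearest-neighbour upscaling: repeat each pixel `factor` times in both directions.
def upscale_nearest(image, factor):
    return [
        [row[x // factor] for x in range(len(row) * factor)]
        for row in image
        for _ in range(factor)
    ]

low_res = [
    [10, 200],
    [200, 10],
]
for row in upscale_nearest(low_res, 3):
    print(row)  # the 2x2 "image" becomes a blocky 6x6 one
```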