How come in 2005 television picture quality looked “so good,” but in 2023, watching shows recorded back in 2005, the quality looks horrible?

On the “best” TV you could buy in 2005, television shows appeared “high quality” or “clear.” Now, even if you watch a show uploaded directly by the producers (circumventing repeated compression from file transfers), you can tell it’s outdated just from the picture quality, aside from cultural cues such as fashion, vehicles, etc.

tl;dr: I watched T.C.A.P. w/ Chris Hansen on TV when it aired; watching it now, I can tell how old it is just from the quality of the recording.

37 Answers

Anonymous

In addition to the other excellent points, upscaling algorithms in TVs can also affect how good older movies and TV shows look. If an image was designed to be displayed on the first NTSC TVs, with 480 visible scan lines at roughly 30 frames per second, different methods for upscaling it to 4K will end up with different results. For instance, some algorithms might do a great job with sporting events (lots of green, a fast-moving ball) but a crappy job with an old Discovery Channel underwater documentary.

Upscaling actually “creates” new pixels by interpolating what colors should fill the blank spots left between the original pixels when you stretch the original image to cover a much larger canvas. The simplest way to do that would be to just repeat the color of the nearest original pixel to fill the gaps, but that would look blocky. The solution is to create smooth transitions between colors on the fly, and also to use intelligent processing that can recognize, “hey, that’s probably grass, here’s a gradient pattern that looks better at high resolution.” Sometimes that image processing makes bad choices, and you end up with a pixelated or blotchy result instead of a believably enhanced image.
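To make the difference concrete, here is a minimal sketch in Python (the tiny sample image and the 4x scale factor are made up for illustration) of the two strategies described above: nearest-neighbor upscaling, which just repeats the closest original pixel, and bilinear upscaling, which blends the four surrounding original pixels into a smooth transition. Real TV upscalers use far more sophisticated kernels and, increasingly, machine-learning models, but the basic idea of inventing in-between pixels is the same.

```python
import numpy as np

def upscale_nearest(img: np.ndarray, factor: int) -> np.ndarray:
    """Repeat each original pixel to fill the gaps -- fast but blocky."""
    return np.repeat(np.repeat(img, factor, axis=0), factor, axis=1)

def upscale_bilinear(img: np.ndarray, factor: int) -> np.ndarray:
    """Fill each new pixel with a weighted average of its 4 neighbors."""
    h, w = img.shape
    out = np.zeros((h * factor, w * factor), dtype=float)
    for y in range(out.shape[0]):
        for x in range(out.shape[1]):
            # Map the output pixel back onto the original image grid.
            src_y, src_x = y / factor, x / factor
            y0, x0 = int(src_y), int(src_x)
            y1, x1 = min(y0 + 1, h - 1), min(x0 + 1, w - 1)
            dy, dx = src_y - y0, src_x - x0
            # Blend the four surrounding original pixels.
            out[y, x] = (img[y0, x0] * (1 - dy) * (1 - dx) +
                         img[y0, x1] * (1 - dy) * dx +
                         img[y1, x0] * dy * (1 - dx) +
                         img[y1, x1] * dy * dx)
    return out

# A 2x2 grayscale "image" with a hard edge: dark (0) next to bright (255).
tiny = np.array([[0, 255],
                 [0, 255]], dtype=float)

print(upscale_nearest(tiny, 4))   # blocky: abrupt 0 -> 255 jump
print(upscale_bilinear(tiny, 4))  # smooth gradient across the edge
```

Neither method adds real detail; both only guess at it, which is why a 480-line source can never look truly sharp at 4K no matter how clever the guessing gets.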
