How come in 2005 television picture quality looked “so good,” but in 2023, watching shows recorded back in 2005, the quality looks horrible?


On the “best” TV you could buy in 2005, television shows appeared “high quality” or “clear.” Now, if you watch a show uploaded directly by the producers (circumventing repeated compression from file transfers), you can tell it’s outdated just from the picture quality (aside from cultural cues such as fashion, vehicles, etc.).

tl;dr: I watched T.C.A.P. w/ Chris Hansen on TV when it aired; watching it now, I can tell how old it is just from the quality of the recording.


37 Answers

Anonymous 0 Comments

When a screen is designed for a certain maximum pixel density, that density sets the smallest spacing at which it can barely separate two pixels. At that spacing it’s a bit blurry, leaving a pleasant ‘fuzz’ around the edges of each pixel.

That invokes an optical illusion. Your brain is used to dealing with images whose resolution isn’t great, and when things look a bit unclear it fills in the missing information with a good guess. So when it sees fuzz around the edges of pixels, it applies its own smoothing to fill in the missing detail, and the image appears clearer than it really is.

But take that same low-res image and display it on a screen capable of higher resolutions, and the screen no longer shows those pixels with fuzzy edges. It shows them as precisely defined, obviously rectangular chunks, rendering each original pixel as a 2×2 or 4×4 grid of its own smaller pixels. Your brain is no longer looking at an image with fuzz around the edges of things; it’s seeing the actual jaggy, stair-stepped chunks that diagonal lines turn into.

Fuzzy pixels hide the stair-step effect of diagonal edges. Precise, squared-off pixels don’t.
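
You can see both halves of that in a few lines of Python. This is just a toy sketch I made up to illustrate the idea (NumPy only, nothing here comes from any real TV pipeline): nearest-neighbour upscaling turns each source pixel into a hard-edged block, while a crude box blur stands in for the CRT’s fuzz.

    import numpy as np

    # A tiny 4x4 "low-res" image with a bright diagonal line.
    low_res = np.zeros((4, 4), dtype=np.uint8)
    np.fill_diagonal(low_res, 255)

    # Nearest-neighbour upscaling: each source pixel becomes a hard-edged
    # 4x4 block on the sharper panel, exposing the stair-step pattern.
    blocky = np.kron(low_res, np.ones((4, 4), dtype=np.uint8))

    # Crude stand-in for CRT fuzz: a 3x3 box blur, so block edges bleed
    # into each other instead of cutting off sharply.
    padded = np.pad(blocky.astype(float), 1, mode="edge")
    fuzzy = sum(
        padded[1 + dy:17 + dy, 1 + dx:17 + dx]
        for dy in (-1, 0, 1)
        for dx in (-1, 0, 1)
    ) / 9

    print(blocky)          # hard 0/255 blocks: a sharp staircase
    print(fuzzy.round())   # softened edges: the staircase is less obvious

Print both arrays and the difference is plain: the blocky version is all hard 0/255 transitions along the diagonal, while the blurred one ramps through intermediate values the way a CRT’s beam spread did.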

Anonymous 0 Comments

I think part of it might also be perception. When the PlayStation first came out, I thought we had reached the pinnacle of game graphics, and that if they got any better they’d be making movies. Looking back, having been exposed to better graphics since, I can see how bad they really were. That’s probably not the answer you were looking for, but you got that one already, so cheers.

Anonymous 0 Comments

How come the pizza at Chuck E. Cheese’s tasted great when I was eight, but now it tastes like ketchup and cardboard?

Anonymous 0 Comments

An additional quirk of media that was meant to be played on CRTs is interlacing.

Interlacing combines two fields of video, captured at slightly different moments, into one frame by “lacing” them together as alternating lines.

The right way to display these frames is to render the two fields sequentially, which is exactly what a CRT did.

But when you digitize the footage without deinterlacing it, each frame is just displayed as two fields frankensteined together, which is what produces the comb-like artifacts you see on moving edges.
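
Here’s a toy sketch of what a player has to undo (Python/NumPy, all the names and numbers are mine, just for illustration): split the woven frame back into its two fields, then “bob” each field up to full height by repeating lines, which is one of the simplest deinterlacing tricks.

    import numpy as np

    def split_fields(frame: np.ndarray) -> tuple[np.ndarray, np.ndarray]:
        """Pull the two interlaced fields out of a woven frame.

        Even rows belong to one field, odd rows to the other; each field
        was originally captured at a slightly different moment in time.
        """
        return frame[0::2], frame[1::2]

    def bob_deinterlace(frame: np.ndarray) -> tuple[np.ndarray, np.ndarray]:
        """Naive 'bob' deinterlace: show each field as its own full-height
        frame by repeating every line, instead of weaving both together."""
        top, bottom = split_fields(frame)
        return np.repeat(top, 2, axis=0), np.repeat(bottom, 2, axis=0)

    # Toy 480-line frame where the two fields disagree (simulating motion):
    frame = np.zeros((480, 640), dtype=np.uint8)
    frame[0::2, 100:200] = 255   # object position when field A was captured
    frame[1::2, 110:210] = 255   # object has moved by field B's capture time

    field_a, field_b = bob_deinterlace(frame)
    # Displayed woven, `frame` shows comb "teeth" at the object's edges;
    # shown as field_a then field_b in sequence, the motion looks clean.

Bobbing halves the vertical detail, which is why real players use fancier motion-adaptive methods, but it gets the core idea across: the two fields were never meant to be on screen at the same instant.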

Anonymous 0 Comments

So much has been lost, but noticing this kind of thing makes me appreciate all the more that so many people throughout history valued quality and preservation.

Anonymous 0 Comments

So, basically, old TV shows were like watching magic on a potato, but now we’ve upgraded to crystal-clear wizardry!

Anonymous 0 Comments

Similar issue with old games. They look terrible when they’re rendered crystal clear, but on the old TVs they were always meant to be played on, they look amazing.

It wasn’t just nostalgia and rose-tinted glasses; they actually did look better back then.