How come in 2005 the television picture quality looked “so good,” but in 2023, watching shows recorded back in 2005, the quality looks horrible?

On the “best” TV you could buy in 2005, television shows appeared “high quality” or “clear.” Now, even if you watch a show uploaded directly by the producers (circumventing repeated compression from file transfers), you can tell it’s outdated just from the picture quality (aside from cultural cues such as fashion, vehicles, etc.).

tl;dr: I watched T.C.A.P. w/ Chris Hansen on TV when it aired; watching it now, I can tell how old it is based on the quality of the recording.

37 Answers

Anonymous 0 Comments

It’s the magic of analog transmission. No discrete packets of data. Images were captured on film and displayed as a continuous electrical signal on a CRT TV, which has a scanning raster in lieu of a pixel grid. Very often, it’s digitizing those pieces of media that makes them look horrible.

If you want to make an old show look good on modern screens, the worst thing you can do is simply convert the show on tape to a digital format. The authentic way is to rescan the original film with a proper digital scanner at extremely high resolution (8K+). That’s what is done for movie remasters for the sake of HD releases, and it’s what allows you to watch Star Wars Ep 4 without wanting to punch your fancy OLED.

Anonymous 0 Comments

One aspect of this is that the CRT technology of the time blended lines and graphics in a way that meant even though you didn’t have a high pixel count, you got a better image than the equivalent pixelated resolution would suggest.

[This also shows in video games from the era, perhaps even more starkly, as the graphic designers of these games knew the technology they were working with and designed their sprites to utilize this effect](https://wackoid.com/game/10-pictures-that-show-why-crt-tvs-are-better-for-gaming/)

Anonymous 0 Comments

Tl;dr SD looks great in SD and like crap in HD

2005 was still the SD era. The HD transition wasn’t until 2009. Even then, many things made for television weren’t in full HD until later.

Several episodic TV shows from about 1990-2005 were shot on videotape. Those will never look “good” in HD formats because the data just isn’t there. On the flip side, shows that were shot on film, the prime example being Star Trek: The Next Generation, can have the film rescanned into an HD format.
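
To put rough numbers on “the data just isn’t there,” here’s a quick back-of-the-envelope pixel-count comparison (a minimal sketch; exact frame dimensions vary by standard and region):

```python
# Approximate frame sizes; real broadcast standards differ slightly.
resolutions = {
    "NTSC SD (~480i)": (720, 480),
    "HD (1080p)": (1920, 1080),
    "UHD (4K)": (3840, 2160),
}

sd_pixels = 720 * 480  # baseline for comparison
for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name}: {pixels:,} pixels ({pixels / sd_pixels:.0f}x SD)")
```

An HD frame holds roughly six times the pixels of an SD frame, and a 4K frame roughly twenty-four times; a videotape master simply never recorded that extra detail.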

Someone else may be able to find it, but there was a picture a while back of a sprite from an SD game that looks great when viewed at SD resolution, but looks like chonky garbage when upscaled to HD.

Edit: because apparently people are getting hung up on the date. June 2009 was when US over-the-air analog broadcasts were switched off. Yes, live sporting events were in HD before then, but a lot of shows, especially cheaper ones, were shot on videotape. There is no HD version of those without upscaling.

One other point: videotape and film each have a “look,” which many modern digital systems try to emulate (for film, it really boils down to chemistry). But if the show you’re watching was shot (or mastered) in SD on a physical medium, it’ll have a look to it (unless treated in post). It’s the same reason “Polaroid” has a look.

Anonymous 0 Comments

A big aspect of this is that you had no experience of HD or 4K at that point. You had only ever watched SD, and HD was about to really kick off. So you were used to SD being the cutting edge and better than the ’90s and ’80s or whatever.

Then you saw 1080p HD, and suddenly you look back on SD content and notice it actually sucks. Now that you know HD exists and have experienced the upgrade, SD is bad by comparison.

Then go to 1440p or 4K or even beyond… and suddenly HD sucks and you notice HD wasn’t that good either. And so on.

Go drive a Ferrari and then go back to your Camry or whatever… your Camry was fine, you loved it… but then you had a really amazing driving experience that set a new standard for you.

There are also differences between CRT TVs and modern LCD, plasma, or OLED displays… that can be a factor too.

Anonymous 0 Comments

This depends. 2005 had HD broadcasting, but it was 1080i or 720p. Blu-ray came out in 2006… unless you were watching something that was recorded on old NTSC analog video, it doesn’t make sense. Now, if you were talking about old TV shows from the ’80s and ’90s, then I’d agree.

How are you “watching it now”? What is the source? Did you record it? Is it a YouTube recording?

Anonymous 0 Comments

I’ll try an actual ELI5 answer:

Modern recordings are made with more pixels because we have better cameras and storage now.

Older recordings were made on film, and film has lots and lots of effective pixels because it doesn’t exactly use pixels in the first place.

In the late ’90s through the aughts, TV producers started moving from film to digital, but the technology didn’t capture as many pixels then as it does now.

If the master recording has few pixels, it’s not easy to increase the number of pixels because the detail information is gone.

So there is an awkward period between high-resolution film and high-resolution digital where the recordings were low-resolution digital (by today’s standards), and because there is no film to re-scan with newer digital technology, there is no practical way to create a higher-resolution version.
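
As an illustration of why that detail can’t be recovered, here’s a minimal sketch using the Pillow imaging library (the file names are placeholders, and the SD frame size is approximate); it simulates mastering a high-resolution frame at SD and then upscaling it back:

```python
from PIL import Image  # Pillow imaging library

# Pretend this is a high-resolution film scan (placeholder file name).
original = Image.open("frame.png")

# Mastering at SD throws the fine detail away
# (aspect ratio ignored here for simplicity)...
sd_master = original.resize((720, 480), Image.Resampling.LANCZOS)

# ...and upscaling afterwards can only interpolate, not restore it.
upscaled = sd_master.resize(original.size, Image.Resampling.LANCZOS)
upscaled.save("upscaled_from_sd.png")  # noticeably softer than the original
```

Put the two images side by side and the upscaled version is visibly softer: the interpolation fills in pixels, but not information.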

Anonymous 0 Comments

In addition to the other excellent points, the upscaling algorithms in TVs can also affect how good older movies/TV shows look. If an image was designed to be displayed on the first NTSC TVs with 480 scan lines at 30 Hz, different upscaling methods for displaying it at 4K will end up with different results. For instance, some algorithms might do a great job with sporting events (lots of green, fast-moving ball) but a crappy job with an old Discovery Channel underwater documentary.

Upscaling actually “creates” new pixels by interpolating what colors should fill in the blank spots created between the original pixels when you stretch the original image to cover a much larger canvas. The simplest way to do that would be to just repeat the color from the original pixel to fill the gaps, but that would look blotchy, so the solution is to create smooth transitions between colors on the fly and also to use intelligent processing that can recognize, “hey, that’s probably grass, here’s a gradient pattern that looks better at high resolution.” Sometimes that image processing makes bad choices and you end up with a pixelated or blotchy result instead of a believably enhanced image.
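
Here’s a minimal sketch of those two basic strategies using the Pillow imaging library (file names are placeholders; real TVs use far more sophisticated, often content-aware, scalers):

```python
from PIL import Image  # Pillow imaging library

sd = Image.open("sd_frame.png")  # placeholder: e.g. a 720x480 frame
target = (3840, 2160)            # 4K canvas

# Strategy 1: repeat each source pixel to fill the gaps -> blocky/blotchy.
blocky = sd.resize(target, Image.Resampling.NEAREST)

# Strategy 2: interpolate smooth transitions between neighbours -> softer.
smooth = sd.resize(target, Image.Resampling.BICUBIC)

blocky.save("upscaled_nearest.png")
smooth.save("upscaled_bicubic.png")
```

The “intelligent” content-aware processing described above goes a step further than either of these, which is where the occasional bad guess comes from.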

Anonymous 0 Comments

One possible answer is the TV you are watching it on in 2023 is a lot bigger than the TV you were watching it on in 2005. A bigger screen makes quality more obvious. Or if you’re watching it on a computer screen, you’re sitting a lot closer.

Anonymous 0 Comments

It’s mainly due to the magic of CRT.
Highly simplified, the pixels on a CRT bleed into each other, giving a nice, smooth effect and color, while a modern LCD can only give crisp, sharp images.
And while that sharpness looks nice, the CRT bleed is easier on the eyes.
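
To get a rough feel for that effect, here’s a very crude sketch using the Pillow imaging library (placeholder file name; a plain Gaussian blur stands in for phosphor/scanline bleed, whereas real CRT emulation shaders are much more involved):

```python
from PIL import Image, ImageFilter  # Pillow imaging library

# Hard-edged upscale of an SD frame (placeholder file name).
sharp = Image.open("sd_frame.png").resize((3840, 2160),
                                          Image.Resampling.NEAREST)

# A slight blur as a crude stand-in for CRT bleed: edges soften and
# neighbouring colors blend instead of sitting as hard pixel blocks.
crt_like = sharp.filter(ImageFilter.GaussianBlur(radius=2))
crt_like.save("crt_approximation.png")
```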

Anonymous 0 Comments

Because they weren’t actually all that good. You can still get gear from that era; feel free to build a setup and try.

There’s also the image scaling. Make sure to use integer scaling when playing content back on a modern display. That avoids blurring it as it’s being stretched to fit.
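
For what “integer scaling” means in practice, here’s a minimal sketch using the Pillow imaging library (file names are placeholders): the source is enlarged only by a whole-number factor, so each original pixel maps cleanly onto a block of display pixels instead of being blended across them.

```python
from PIL import Image  # Pillow imaging library

src = Image.open("sd_frame.png")   # placeholder: e.g. 640x480 content
display_w, display_h = 3840, 2160  # 4K panel

# Largest whole-number factor that still fits on the display.
factor = min(display_w // src.width, display_h // src.height)

# NEAREST keeps pixels as crisp blocks; no interpolation, no blur.
scaled = src.resize((src.width * factor, src.height * factor),
                    Image.Resampling.NEAREST)
scaled.save("integer_scaled.png")  # bars fill the leftover display space
```

A 640x480 source on a 4K panel scales 4x to 2560x1920, with black bars filling the rest of the screen.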