On the “best” TV you could buy in 2005, television shows appeared “high quality” or “clear.” Now, even if you watch a show uploaded directly by the producers (avoiding repeated compression from file transfers), you can tell it’s outdated just from the picture quality (aside from cultural cues such as fashion, vehicles, etc.).
tl;dr: I watched T.C.A.P. w/ Chris Hansen when it originally aired; watching it now, I can tell how old it is just from the quality of the recording.
When a screen is designed for a certain maximum pixel density, that density is the finest detail it can just barely separate. Adjacent pixels bleed into each other slightly, producing a pleasant “fuzz” around their edges.
That triggers an optical illusion. Your brain is used to dealing with images whose resolution isn’t great, and when things look a bit unclear it fills in the missing information with a good guess. So when it sees pixels with slightly fuzzy edges, it applies its own smoothing, and the image appears clearer than it really is.
But when you take that same low-res image and display it on a screen capable of much higher resolution, the screen doesn’t show those pixels with fuzzy edges. It shows them as precisely defined, obviously rectangular chunks: each original pixel is rendered as a 2×2 or 4×4 grid of the display’s much smaller pixels. Your brain is no longer shown an image with fuzz around the edges of things; it sees the actual jaggy, stair-stepped chunks that diagonal lines are made of.
Fuzzy pixels hide the stair-step effect of diagonal edges. Precise, squared-off pixels don’t.
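To make that concrete, here’s a small sketch (Python with NumPy are my assumptions; the helper names box_blur and hard_edges are made up for the example). It upscales a low-res diagonal edge with hard nearest-neighbour blocks, then applies a crude blur standing in for CRT fuzz, and counts how many abrupt stair-step transitions each version has:

    import numpy as np

    # A tiny 8x8 "low-res" black/white image with a diagonal edge,
    # standing in for one detail of an SD recording.
    low_res = np.zeros((8, 8))
    for y in range(8):
        low_res[y, : y + 1] = 1.0  # everything left of the diagonal is white

    # Nearest-neighbour upscaling: each source pixel becomes a hard-edged
    # 4x4 block, which is what exposes the stair-step "jaggies".
    scale = 4
    blocky = np.repeat(np.repeat(low_res, scale, axis=0), scale, axis=1)

    # A crude stand-in for CRT-style fuzz: a small box blur softens the
    # block edges so the staircase is much less visible.
    def box_blur(img, radius=2):
        padded = np.pad(img, radius, mode="edge")
        out = np.zeros_like(img)
        for dy in range(-radius, radius + 1):
            for dx in range(-radius, radius + 1):
                out += padded[radius + dy : radius + dy + img.shape[0],
                              radius + dx : radius + dx + img.shape[1]]
        return out / (2 * radius + 1) ** 2

    fuzzy = box_blur(blocky)

    # Count how many pixel pairs sit at an abrupt 0-to-1 jump in each version.
    def hard_edges(img):
        return int(np.sum(np.abs(np.diff(img, axis=1)) > 0.9))

    print("hard stair-step transitions, blocky:", hard_edges(blocky))  # dozens
    print("hard stair-step transitions, fuzzy: ", hard_edges(fuzzy))   # zero

The blocky version is what a sharp modern panel shows you; the blurred version is closer to what a CRT (and your brain’s own smoothing) used to hide.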
I think part of it might also be perception. When the PlayStation first came out, I thought we had reached the pinnacle of game graphics and that if they got any better, they’d be making movies. Looking back, having been exposed to better graphics since, I can see how bad they were. That’s probably not the answer you were looking for, but you got that one already, so cheers.
An additional quirk of media that was meant to be played on CRTs is interlacing.
It combines two fields of video into one frame, “lacing” them together as alternating lines.
The right way to display these frames is to render the two fields one after the other, which is what CRTs did.
But when you naively convert these to digital, each frame is just displayed as the two fields frankensteined together.
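As a rough illustration of that frankensteining, here’s a sketch (Python/NumPy assumed; the field and frame names are invented for the example). It weaves two fields of a moving bar into one interlaced frame, the way it would be stored, and then pulls them apart again with a simple “bob” deinterlace that restores the original timing:

    import numpy as np

    # Two fields captured a split second apart: a bright vertical bar
    # that moves a few pixels to the right between captures.
    h, w = 8, 16
    field_a = np.zeros((h, w)); field_a[:, 4:6] = 1.0   # earlier field
    field_b = np.zeros((h, w)); field_b[:, 8:10] = 1.0  # later field

    # Interlacing: weave the two fields into one stored frame,
    # even rows from field A, odd rows from field B.
    frame = np.zeros((h * 2, w))
    frame[0::2] = field_a
    frame[1::2] = field_b

    # A CRT scans the even rows, then the odd rows, so each field is shown
    # on its own. A naive digital display shows `frame` as-is, so the moving
    # bar appears in two places at once with a comb pattern.
    print("rows showing the bar at its early position:", int(np.sum(frame[:, 4] > 0)))
    print("rows showing the bar at its later position:", int(np.sum(frame[:, 8] > 0)))

    # Simple "bob" deinterlacing: treat each field as its own frame and
    # stretch it back to full height, so each moment in time is shown separately.
    bob_a = np.repeat(field_a, 2, axis=0)
    bob_b = np.repeat(field_b, 2, axis=0)
    print("bob frame A shows the bar only in its early position:",
          bool(bob_a[:, 4].all() and not bob_a[:, 8].any()))

That comb pattern on anything moving is the telltale sign of interlaced footage being shown on a progressive display without proper deinterlacing.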