Why does 480p video on a CRT look fine but a 480p video on a same-size modern monitor look blurry?


I get that the modern display has more physical pixels, but since the pixels on the old CRT are just bigger, shouldn’t it look blurry too?


4 Answers

Anonymous 0 Comments

I’m not sure it’s exactly the proper term for the issue at hand, but this [site](https://www.geeksforgeeks.org/antialiasing/) gives good visualizations of the difference.

In essence, CRTs don’t have pixels at all; they have “smudges of color” versus the “sharp edges” of pixel-based screens.

The difference is sort of like painting a picture vs. building one out of lego bricks. The painting has soft edges that aren’t very jarring, but you are stuck with really wide, broad brush strokes. The lego bricks are going to have jarring, blocky corners that are ugly. The solution with the lego bricks is to make tinier and tinier bricks so the blocky corners become smaller and less apparent, and you can ’round the edges’ with tinier, smaller blocks.

You might have limits with the painting in terms of making insanely tiny details which you just can’t do with a big paint brush, but overall the picture looks very smooth and pretty. With the lego bricks you *can* make insanely fine and small details assuming you have microscopic lego bricks.

So in short, if all you have are Duplo blocks, lego bricks aren’t a great choice for nuanced detail, go with the painting. But if you need the super fine detail AND you have microscopic bricks, then use the bricks.
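
If it helps to see the “smaller bricks” idea in actual numbers, here’s a rough Python sketch (my own toy example, not from the linked site): it draws a hard diagonal edge out of big blocks, then draws the same edge with 8× finer blocks and averages them back down, which is one common way anti-aliasing is done.

```python
# Toy illustration of anti-aliasing by supersampling (my own example).
import numpy as np

def render_edge(size):
    """Render a hard diagonal edge: 1 above the diagonal, 0 below."""
    y, x = np.mgrid[0:size, 0:size]
    return (x > y).astype(float)

def downsample(img, factor):
    """Average factor x factor blocks: each coarse 'brick' becomes the mean
    of the fine detail inside it, which rounds off the jagged corners."""
    h, w = img.shape
    return img.reshape(h // factor, factor, w // factor, factor).mean(axis=(1, 3))

hard = render_edge(8)                   # blocky "lego" edge, 8x8 blocks
soft = downsample(render_edge(64), 8)   # same edge drawn with 8x finer blocks, averaged down

print(np.round(hard, 2))  # only 0s and 1s: hard stair-steps
print(np.round(soft, 2))  # fractional values along the edge: softer transition
```

The second printout has in-between brightness values along the diagonal, which is the “rounded edges from tinier bricks” effect in number form.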

Anonymous 0 Comments

So old CRT displays drew the picture in horizontal lines (scanlines), and frames were typically painted using a technique called “interlacing”: the image frame would have all the odd lines painted first, and then all the even ones painted next. This improved the perceived frame rate and motion perception.

Think of painting a long wooden fence. Painting all the odd slats first means that, from a distance, you have an idea of what the final product looks like long before you would if you painted every slat in order.

Well, the interlacing and scanline structure left certain characteristics on the screen that act a lot like modern techniques that smooth the pixels. This allowed lower resolutions to “look better” as a result, because the transition from one colored pixel to the next came out smoother.
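
To make the odd-lines/even-lines idea concrete, here’s a small Python sketch (my own illustration, the function names are made up): it splits a frame into the two half-height fields an interlaced signal would send, then weaves them back together.

```python
# Rough sketch of interlacing (my own illustration): each field carries half
# the lines, and two consecutive fields weave together into one full frame,
# like painting the odd fence slats first.
import numpy as np

def split_into_fields(frame):
    """Split a full frame into its even-row and odd-row fields."""
    return frame[0::2], frame[1::2]

def weave(field_a, field_b):
    """Interleave two fields back into a full-height frame."""
    frame = np.empty((field_a.shape[0] + field_b.shape[0],) + field_a.shape[1:],
                     dtype=field_a.dtype)
    frame[0::2] = field_a
    frame[1::2] = field_b
    return frame

frame = np.arange(8 * 4).reshape(8, 4)   # pretend 8-line, 4-pixel-wide frame
top, bottom = split_into_fields(frame)
assert np.array_equal(weave(top, bottom), frame)
print(top)     # lines 0, 2, 4, 6: what one field delivers
print(bottom)  # lines 1, 3, 5, 7: the next field fills in the gaps
```

Each field is only half the lines, but like the half-painted fence it already gives you a recognizable picture, and the next field fills in the gaps.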

Here’s a video showing side-by-side comparisons of pixel art from old games on a modern monitor with sharp pixels vs. the same image with a CRT filter applied to it. In every case, the second image looks smoother, with softer edges as a result.

It’s pretty staggering how much extra detail artists were able to add by using the interlacing and scanlines to their advantage.
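
For anyone curious what a “CRT filter” roughly does, here’s a deliberately crude Python approximation (my own sketch, not the filter from the video): blur each row a little so pixel edges bleed into their neighbors, and darken every other row to fake scanlines.

```python
# Very crude "CRT-ish" filter (my own approximation, not a real shader).
import numpy as np

def crt_ish(img, scanline_strength=0.3):
    """img: 2D float array of brightness values in [0, 1]."""
    # Horizontal blur: each pixel bleeds slightly into its neighbours,
    # roughly mimicking the soft beam spot instead of hard pixel edges.
    blurred = 0.25 * np.roll(img, 1, axis=1) + 0.5 * img + 0.25 * np.roll(img, -1, axis=1)
    # Darken alternate rows to suggest visible scanline structure.
    blurred[1::2] *= (1.0 - scanline_strength)
    return blurred

pixel_art = np.zeros((6, 8))
pixel_art[2:4, 3:5] = 1.0               # a hard-edged 2x2 "sprite" block
print(np.round(crt_ish(pixel_art), 2))  # the edges now fall off gradually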

Anonymous 0 Comments

The dots that shine on a CRT don’t correspond 1:1 to the pixels, or even the scan lines, of the image sent to it. The shadow mask has tiny holes that keep the three electron beams from hitting the wrong colors, but a single scan line is usually a few dots thick, with faded edges. This makes them look a lot softer and less pixelated than the hard-edged pixels of an LCD raster.
Edit: here’s an image, where you can see how very soft the shapes become: https://live.staticflickr.com/3358/3563978211_21b9e5e1fd_b.jpg
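
If it helps, here’s a toy Python model of that (my own numbers; the beam width is made up): treat the beam’s vertical brightness as a Gaussian a few dot-rows wide and look at how much light lands on the neighboring rows.

```python
# Toy model (my own illustration): one scan line lit by a beam whose vertical
# profile is a Gaussian a few phosphor-dot rows wide, so neighbouring rows
# also glow, just more faintly -- the "faded edges" mentioned above.
import numpy as np

rows = np.arange(-4, 5)      # phosphor-dot rows around the scan line's center
beam_width = 1.5             # beam spot radius in dot-rows (made-up value)
brightness = np.exp(-(rows / beam_width) ** 2)

for r, b in zip(rows.tolist(), brightness):
    print(f"row {r:+d}: {'#' * int(round(b * 20)):<20} {b:.2f}")
```

The rows next to the scan line still glow, just more faintly, which is where the soft look comes from.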

Anonymous 0 Comments

A CRT doesn’t inherently have pixels; it scans the electron beam across the screen in continuous lines that are only broken up into pixels by the video card. What this means in practice is that a CRT is effectively running at its native resolution at all times: it never has to blend neighboring values together the way an LCD does when the pixel count of the current resolution doesn’t divide evenly into the native resolution. The scanlines themselves are also narrower than the pixels on an LCD, and this significantly impacts how things look.
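
Here’s a small Python sketch of that blending (my own illustration): stretching 4 “source lines” onto 9 “panel lines” is a non-integer ratio, so a simple linear-interpolation scaler has to output in-between values that exist in neither source line. Real monitor scalers are fancier, but the basic problem is the same.

```python
# Sketch of the scaling problem (my own illustration): a non-integer stretch
# forces many output lines to be a blend of two neighbouring source lines
# (simple linear interpolation shown here).
import numpy as np

def scale_lines(src, out_count):
    """Linearly interpolate src values onto out_count output lines."""
    positions = np.linspace(0, len(src) - 1, out_count)  # where each output line samples
    lower = np.floor(positions).astype(int)
    upper = np.minimum(lower + 1, len(src) - 1)
    frac = positions - lower
    return (1 - frac) * src[lower] + frac * src[upper]

src = np.array([0.0, 1.0, 0.0, 1.0])   # 4 source lines alternating dark/bright
out = scale_lines(src, 9)              # stretch onto 9 panel lines (non-integer ratio)
print(np.round(out, 2))                # in-between values appear: the panel blends adjacent lines
```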