Why do the vast majority of device screens run at 60 Hz, and not a rounder number like 100?


I have always seen device screens clocked at 60 Hz (or 59.94 Hz), and I do not understand why the standard is 60 and not a rounder, more manageable number for comparisons, like 100 Hz. I understand why movies run at 24 frames per second, but I do not understand why it is almost always 60 Hz when you find a screen in the wild. Is it because of cost? I mean, in gaming the standard has been going up in recent years, but the general standard is still 60.


7 Answers

Anonymous 0 Comments

NTSC TV used 60 Hz, or more exactly 30 fps interlaced (60 fields per second).

PAL used 50 Hz, 25 fps interlaced.
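
To make the interlacing arithmetic concrete, here is a minimal sketch (the function name is just for illustration): interlaced video splits each frame into two fields, so the displayed rate is twice the frame rate.

```python
# Interlaced video splits each frame into two fields (the odd lines,
# then the even lines), so the field rate is twice the frame rate.

def field_rate(frames_per_second: float, interlaced: bool = True) -> float:
    """Return the displayed (field) rate for a given frame rate."""
    return frames_per_second * 2 if interlaced else frames_per_second

print(field_rate(30))  # NTSC: 30 fps interlaced -> 60 Hz
print(field_rate(25))  # PAL:  25 fps interlaced -> 50 Hz
```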

It is not a coincidence that the US power grid frequency is 60 Hz compared to 50 Hz in Europe. You want the frame rate in sync with the mains frequency because of electrical interference and flicker from lights that run on AC. Early cameras used the frequency of the power grid for synchronization.

When color was added to NTSC, the color signal was encoded in a way that let old black-and-white TVs still work. There was a problem with a dot pattern that could emerge from interference between the color subcarrier and the sound carrier, so a small change to 59.94 Hz made the dots move around and become much harder to see.
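
That small change was not arbitrary: the 60 Hz field rate was scaled down by a factor of 1000/1001. A quick check of the arithmetic, using the standard NTSC numbers:

```python
# NTSC color shifted the 60 Hz field rate down by a factor of 1000/1001,
# moving the beat pattern between the color subcarrier and the sound
# carrier to where it is far less visible on screen.

ntsc_color_field_rate = 60 * 1000 / 1001   # Hz

print(f"field rate: {ntsc_color_field_rate:.4f} Hz")        # 59.9401 Hz
print(f"frame rate: {ntsc_color_field_rate / 2:.4f} fps")   # 29.9700 fps
```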

PAL encodes color in another way; it was developed later and avoided that problem NTSC had, so it could keep exactly 50 Hz.

50/60 Hz is around the lowest refresh rate at which humans do not see flicker. Movies avoid this by showing each frame multiple times: the shutter in the projector blanks the light two or three times for each frame, and the higher flicker rate, even though the displayed image is identical, results in humans not noticing it.
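
As a rough sketch of that trick, assuming standard 24 fps film: a two-blade shutter doubles the flicker rate and a three-blade shutter triples it, without adding any new frames.

```python
# A projector shows 24 distinct frames per second, but its shutter
# interrupts the light 2 or 3 times per frame, raising the flicker
# rate above the ~50-60 Hz threshold where flicker becomes visible.

film_fps = 24

for blades in (2, 3):
    print(f"{blades}-blade shutter: {film_fps * blades} Hz flicker")
# 2-blade shutter: 48 Hz flicker (still borderline)
# 3-blade shutter: 72 Hz flicker (comfortably above the threshold)
```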

So 60 Hz lives on because of its history. It is often good enough. A 100 Hz display will cost more, and you need more GPU power to render at a higher frequency. So 60 Hz lives on as a lower limit.
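
The GPU-cost point is just frame-time budget arithmetic; a quick illustration (144 Hz added here as a common gaming example):

```python
# Each extra hertz shrinks the time budget the GPU has to render a frame.
for hz in (60, 100, 144):
    print(f"{hz:>3} Hz -> {1000 / hz:.1f} ms per frame")
#  60 Hz -> 16.7 ms per frame
# 100 Hz -> 10.0 ms per frame
# 144 Hz -> 6.9 ms per frame
```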
