Why is the refresh rate of the vast majority of device screens 60 Hz, and not a rounder number like 100?


I have always seen device screens locked at 60 or 59.94 Hz, and I don't understand why the standard is 60 rather than a rounder, more manageable number for comparisons, like 100 Hz. I understand why all movies are at 24 frames per second, but I don't understand why it's almost always 60 Hz when you find a screen in the wild. Is it because of cost? I mean, in gaming the standard has been going up in recent years, but the standard is still 60.


7 Answers

Anonymous 0 Comments

In America our electricity cycles at approximately 60 Hz, so I would assume those two numbers are directly related, since most frame rates in the US are some multiple of 60. (On analog TV in other parts of the world it is often 50 Hz, which also corresponds to their AC electrical frequency.)

I think this originally stems from the way CRT TVs work. It may be less relevant in the digital age and more a byproduct of the past, which is why we now get TVs with higher refresh rates.

Anonymous 0 Comments

I think it comes from the mains frequency. In former times, when precision timing equipment was expensive or very cumbersome, it was an obvious choice to sync video display to the mains frequency. Somehow, this persisted until today and was never changed, even though it does not make much sense anymore.

Anonymous 0 Comments

It’s because in the US the AC electricity is 60 Hz, and that is easy to build electronics to sync up with. This is the reason most TVs are 60 Hz, or 30 fps if interlaced.
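To spell out the interlacing arithmetic (a quick sketch, not from the answer above): each 60 Hz refresh draws only half of the picture's lines, called a field, so two fields make one complete frame.

```python
# Interlaced video: each 60 Hz refresh draws only half the scanlines
# (one "field"), alternating odd and even lines, so two fields = one frame.
field_rate_hz = 60
fields_per_frame = 2

frame_rate_fps = field_rate_hz / fields_per_frame
print(frame_rate_fps)  # 30.0 -> "60 Hz, or 30 fps if interlaced"
```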

Fun fact: the initial reason Nikola Tesla chose 60 Hz was that AC motors become inefficient at higher frequencies and would require too much iron at lower frequencies; additionally, lights would visibly flicker at a lower Hz.
Source: https://teslauniverse.com/nikola-tesla/articles/life-and-times-nikola-tesla#:~:text=The%20now%20established%20frequency%20of,also%20flicker%20at%20low%20frequencies.

Anonymous 0 Comments

Beat frequencies. It’s less of an issue with LED lights and LED TVs. In the old days, before digital electronics, light bulbs flickered at 60 Hz and so did the TV, because it all ran on 60 Hz wall power. This is why movie theaters cut the lights: projectors run at 24 frames per second, and if the lights were on you would see a distracting flicker. Sometimes you can see this flicker when someone records a video of a TV screen. It’s also why, when a director on a movie set says “lights, camera, action!”, they have to align the flicker of their lights to the flicker of the camera, the same way the sound has to be synced to the video of the mouths moving.
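A minimal sketch of that beat effect (my own illustration, assuming a simple uniform-sampling camera): the camera samples the flickering light at its own frame rate, and any mismatch between the two rates shows up as a slow aliased flicker or drifting band in the recording.

```python
# Sketch of the aliased "beat" seen when a camera films a flickering light.
# Assumes simple uniform sampling; real cameras also have rolling shutters,
# which turn this beat into moving bands instead of whole-frame pulsing.

def apparent_flicker_hz(light_hz: float, camera_fps: float) -> float:
    """Frequency the camera actually records after aliasing."""
    cycles_per_frame = light_hz / camera_fps
    nearest_whole = round(cycles_per_frame)
    return abs(cycles_per_frame - nearest_whole) * camera_fps

print(apparent_flicker_hz(60.0, 60.0))   # 0.0   -> rates match, no beat
print(apparent_flicker_hz(50.0, 60.0))   # 10.0  -> obvious flicker on camera
print(apparent_flicker_hz(60.0, 59.94))  # ~0.06 -> slow drifting band
```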

Anonymous 0 Comments

The history of FPS and screen refresh rates can be found on the Wikipedia page for frame rate ([link](https://en.wikipedia.org/wiki/Frame_rate)), but the 60 Hz comes from the most common AC power frequency in certain regions. In the USA and some other regions it is 60 Hz, which conveniently works with 60 Hz screens, while other regions (such as most of Europe) used the ‘rounder’ number of 50 Hz. For quite a while, different versions of consoles and games were released with the different frame rates; this isn’t done as often nowadays because the hardware of the screens is more robust or has dedicated circuitry that adjusts to the proper Hz anyway.

Matching the screen refresh rate to the power frequency prevents a small amount of flicker by having the screen refresh at a consistent point in the power cycle.

Anonymous 0 Comments

NTSC TV uses 60 Hz, more exactly 30 fps interlaced.

PAL used 50 Hz, 25 fps interlaced.

It is not a coincidence that the US power grid frequency is 60 Hz compared to 50 Hz in Europe. You want the frame rate in sync with the mains frequency because of electrical interference and flicker from lights that run on AC. Early cameras used the frequency of the power grid for synchronisation.

When color was added to NTSC, the color signal was encoded in a way that old black-and-white TVs would still work. There was a problem with a dot pattern that could emerge, so a small change to 59.94 Hz made the pattern move around and become much less noticeable.
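For reference, that exact figure comes from scaling the original 60 Hz field rate by 1000/1001; the arithmetic is easy to check:

```python
# NTSC color slowed the original 60 Hz field rate by a factor of 1000/1001.
field_rate = 60 * 1000 / 1001   # fields per second
frame_rate = field_rate / 2     # two interlaced fields per full frame

print(round(field_rate, 4))  # 59.9401 -> the "59.94 Hz" seen on screens
print(round(frame_rate, 4))  # 29.97   -> the familiar 29.97 fps
```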

PAL encodes color in another way; it was developed later and avoids the problem NTSC had, so it could keep 50 Hz.

50/60 Hz is around the lowest refresh rate at which humans do not see flicker. Movies avoid this by showing each frame multiple times: the shutter in the projector interrupts the light two or three times per frame, and the higher flicker rate, even though the displayed image is identical, means humans do not notice it.
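The arithmetic behind that trick, as a quick illustration (the 24 fps figure is the standard film rate mentioned in the question):

```python
# Film holds 24 distinct frames per second, but the projector shutter
# flashes each frame more than once to push the flicker above ~50/60 Hz.
film_fps = 24

for flashes_per_frame in (2, 3):  # two- and three-blade shutters
    flicker_hz = film_fps * flashes_per_frame
    print(f"{flashes_per_frame} flashes/frame -> {flicker_hz} Hz flicker")
# 2 flashes/frame -> 48 Hz flicker
# 3 flashes/frame -> 72 Hz flicker
```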

So 60 Hz lives on because of its history. It is often good enough. A 100 Hz display costs more, and you need more GPU power to render at a higher frequency. So 60 lives on as a lower limit.
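One rough way to see the GPU cost: the time budget per frame shrinks as the refresh rate goes up.

```python
# Per-frame time budget: the GPU has to finish rendering each frame
# inside this window or it misses the refresh.
for refresh_hz in (60, 100, 144):
    budget_ms = 1000 / refresh_hz
    print(f"{refresh_hz} Hz -> {budget_ms:.1f} ms per frame")
# 60 Hz  -> 16.7 ms per frame
# 100 Hz -> 10.0 ms per frame
# 144 Hz -> 6.9 ms per frame
```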

Anonymous 0 Comments

In America, electricity is provided at 60 Hz. Older displays derived their refresh rate directly from the wall socket, hence 60 Hz.

You’re right that modern displays can probably choose whatever they want. It’s just convention.

As for why the mains electricity is at 60 Hz, I don’t actually know. Elsewhere in the world it’s 50 Hz.