Why do the refresh rates on computer monitors seem to always be 144hz or a multiple of 60?

In: Technology

4 Answers

Anonymous 0 Comments

TV broadcasts in the US are 60Hz, so monitor refresh rates are usually multiples of that. The 144Hz thing is because people took displays that were 120Hz and overclocked them to get the highest possible refresh rate out of that hardware. About 20% over was as high as the panels could reliably go before becoming unstable. Seeing the market, some display manufacturers offered “factory overclocked” 144Hz displays, and it became a standard. (Very few companies make the actual display panels – most simply buy a panel from one of the few manufacturers and pair it with a display controller.)

The reason for 60Hz TV broadcasts is that AC power in the US is 60Hz. Incandescent light bulbs vary their brightness in time with the mains, so if you film at a frame rate that isn’t locked to it, the brightness drifts from frame to frame and the footage flickers.

Anonymous 0 Comments

I don’t know too many details, but I think it’s mostly a matter of scalability and standards.

Say you have a 30 Hz monitor and want to watch a 31 Hz video. This would be “ugly” to scale because the refreshes are misaligned, so you’d have to mess around with duplicating or dropping frames.

Much better if your monitor is 30 Hz and you want to watch 60 Hz content, because you can just pick every other frame. The same goes for the reverse: low frame rate content on a high frame rate monitor just gets each frame held for a whole number of refreshes.

For this reason it is just more convenient to work with multiples of previously existing frame rates, or at least with numbers that divide into each other evenly (the sketch below shows the idea).
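To make the “convenient division” point concrete, here is a minimal Python sketch (my own illustration, not from the answer; repeat_pattern is a made-up name). It counts how many display refreshes each source frame ends up occupying: a constant count means smooth playback, a mixed count means judder or dropped frames.

```python
from fractions import Fraction

def repeat_pattern(display_hz: int, content_fps: int, frames: int = 12) -> list:
    """How many display refreshes each of the first `frames` source frames gets."""
    ratio = Fraction(display_hz, content_fps)    # refreshes per source frame
    counts, shown = [], Fraction(0)
    for i in range(1, frames + 1):
        target = i * ratio                       # refresh index where frame i should end
        counts.append(int(target) - int(shown))  # whole refreshes assigned to this frame
        shown = target
    return counts

print(repeat_pattern(60, 30))  # [2, 2, 2, ...]    even cadence, looks smooth
print(repeat_pattern(60, 24))  # [2, 3, 2, 3, ...] classic 3:2 pulldown judder
print(repeat_pattern(30, 31))  # [0, 1, 1, 1, ...] misaligned: some frames are dropped outright
```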

Where the numbers themselves come from is more a matter of history. For example, 30 Hz comes from old CRT TVs: in North America the household mains is 60 Hz, and TVs were built to draw half the image (one interlaced field) for every cycle of the mains voltage, so the full-frame rate is half of that. I don’t know too many details on this matter though, and I have no idea where 144 Hz comes from – it might be connected to 24 Hz movies, but I don’t know.

Anonymous 0 Comments

What I remember is that display refresh rates are usually 60Hz because somewhere around 60Hz is where most people stop noticing flicker. This means that if a light bulb were to flicker at you 60 times a second, you would see it as if it wasn’t flickering, while something flickering at well below 60Hz would be noticeable.

But you’d quickly run into a problem. Something like a flickering bulb isn’t equally bright at every moment, i.e. the brightness you see depends on exactly when you look at it. One easy way to get around this is to double the frequency of the flickering (120Hz): if two people look at the display at slightly different moments, they’re now much more likely to see the same thing. The same goes for 180Hz, 240Hz, and so on. I don’t know about the 144Hz though.

Anonymous 0 Comments

It’s due to historical content.

60Hz was the US broadcast standard because in the US, electricity is nominally 110VAC at 60Hz, so the broadcast rate was tied to 60Hz to prevent flicker. There’s a huge amount of “made for TV” content that was recorded at 60Hz (actually 59.94 Hz in the NTSC color system). In the US, 60Hz is still standard for broadcast, and YouTube also has a lot of 60Hz (60 fps) content.

24Hz was the “de facto” standard for movies. This number isn’t good or bad — it was a compromise rate due to the technology available in the early days of motion pictures. Even today the vast majority of movies are still shot at 24Hz.

120Hz is a sweet spot for TVs and monitors because 120 is a whole multiple of both 60 and 24. That way 60Hz and 24Hz content can be shown on a 120Hz display without worrying about weird artifacts.
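A quick way to see why (my own illustration, not part of the answer): check which content rates divide evenly into each display rate, and note that 120 is exactly the least common multiple of 24 and 60.

```python
from math import lcm

# Check which common content rates fit evenly into a 120 Hz vs. a 144 Hz display.
for display_hz in (120, 144):
    for content_fps in (24, 30, 60):
        per_frame = display_hz / content_fps  # refreshes spent on each source frame
        cadence = "even" if display_hz % content_fps == 0 else "uneven"
        print(f"{content_fps:>3} fps on {display_hz} Hz: {per_frame:.2f} refreshes/frame ({cadence})")

# The smallest rate that handles both 24 fps film and 60 Hz video evenly:
print("lcm(24, 60) =", lcm(24, 60))  # 120
```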

144Hz monitors were introduced to get a little extra performance for competitive gaming. 144 is still a multiple of 24, so most movies can also be easily rendered without artifacts. But without some fancy processing or rate matching, 60Hz content can look bad on a 144Hz monitor (choppy / stuttering, “tearing”, etc.).
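To see that stutter in numbers (again my own sketch, not from the answer): each 60 fps frame should occupy 144 / 60 = 2.4 refreshes, which a fixed 144Hz display can’t do, so it ends up holding some frames for 2 refreshes and some for 3.

```python
# Cadence of 60 fps content on a 144 Hz panel: frames alternate between
# 2-refresh and 3-refresh holds, and that alternation is the visible stutter.
# (Rate matching, e.g. a variable refresh rate display, avoids it by refreshing
# exactly when a new frame arrives.)
shown = 0
pattern = []
for i in range(1, 11):              # first ten 60 fps frames
    target = i * 144 // 60          # refresh index where frame i should end
    pattern.append(target - shown)  # refreshes spent showing this frame
    shown = target
print(pattern)                      # [2, 2, 3, 2, 3, 2, 2, 3, 2, 3]
```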

So 144Hz is a compromise of sorts, favoring gaming over some broadcast TV / YouTube content.