Why do the refresh rates on computer monitors seem to always be 144 Hz or a multiple of 60?

In: Technology

4 Answers

Anonymous 0 Comments

TV broadcasts in the US are 60 Hz, so monitor refresh rates are usually a multiple of that (60, 120, 240). The 144 Hz figure exists because people took 120 Hz displays and overclocked them to the highest refresh rate that hardware could manage; roughly 20% over spec was about as far as the panels could go before crapping out. Seeing the market, some display manufacturers started selling "factory overclocked" 144 Hz displays, and it became a de facto standard. (Very few companies make the actual display panels; most monitor brands simply buy a panel from one of the few panel manufacturers and pair it with their own display controller.)
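A quick back-of-the-envelope sketch of that arithmetic in Python (the numbers are just the ones from the explanation above):

    # Common refresh rates are multiples of the 60 Hz US broadcast rate.
    base = 60  # Hz
    multiples = [base * n for n in (1, 2, 4)]  # 60, 120, 240 Hz

    # 144 Hz began life as a ~20% overclock of a 120 Hz panel:
    overclock = 120 * 1.20  # = 144.0 Hz

    print(multiples, overclock)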

The reason TV broadcasts are 60 Hz is that AC power in the US is 60 Hz. Traditional incandescent bulbs vary their brightness in step with that power (they actually flicker at 120 Hz, twice per AC cycle, because they brighten on both halves of the waveform), so if you film at a frame rate that doesn't divide evenly into the flicker, the footage shows its brightness pulsing over time.
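As a rough model of that last point (my own simplification, not something from the answer above): treat the camera as sampling the lamp's 120 Hz flicker. Frame rates that divide evenly into 120 come out steady; anything else produces a visible slow beat in brightness.

    # Simplified sampling model of lamp flicker vs. camera frame rate.
    # 120 Hz = incandescent flicker on a 60 Hz grid (illustrative numbers).
    flicker = 120.0  # Hz

    def apparent_flicker(fps: float) -> float:
        """Frequency the flicker aliases to when sampled at fps frames/s."""
        r = flicker % fps
        return min(r, fps - r)  # fold into the visible 0..fps/2 band

    for fps in (60, 50, 30, 24):
        print(f"{fps:>3} fps -> beat at {apparent_flicker(fps):.0f} Hz")
    # 60, 30 and 24 fps land on 0 Hz (steady); 50 fps beats at 20 Hz.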
