Why do the refresh rates on computer monitors seem to always be 144hz or a multiple of 60?

784 views

In: Technology

4 Answers

Anonymous 0 Comments

I don’t know too many details, but I think it’s mostly a matter of scalability and standards.

Say you have a 30 Hz monitor and want to watch a 31 Hz video. This would be “ugly” to scale because the refreshes are misaligned, so you’d have to repeat some frames and not others, which produces uneven motion (judder).

It’s much better if your monitor is 30 Hz and you want to watch 60 Hz content, because you can simply show every other frame. The same goes for the reverse, watching low-frame-rate content on a high-refresh-rate monitor: each frame is just held for the same number of refreshes.

For this reason it is simply more convenient to work with multiples of previously existing frame rates, or at least with numbers that divide into each other evenly.
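As a rough sketch (a toy calculation, not anything from a standard), you can see the mismatch by mapping each display refresh to the most recent source frame and counting how many refreshes each frame stays on screen; equal counts mean smooth motion, alternating counts mean judder. The function name `frame_hold_counts` is just made up for illustration:

```python
def frame_hold_counts(content_fps: float, display_hz: float, seconds: float = 1.0):
    """Return how many consecutive refreshes each source frame is held for."""
    holds = []
    shown = None
    refreshes = int(display_hz * seconds)
    for r in range(refreshes):
        t = r / display_hz               # time of this display refresh
        frame = int(t * content_fps)     # latest source frame available at that time
        if frame == shown:
            holds[-1] += 1               # same frame held for another refresh
        else:
            holds.append(1)              # a new frame starts being shown
            shown = frame
    return holds

# 60 Hz content on a 120 Hz display: every frame held exactly 2 refreshes -> smooth.
print(frame_hold_counts(60, 120))   # [2, 2, 2, 2, ...]

# 24 fps film on a 60 Hz display: frames alternate 3 and 2 refreshes -> judder.
print(frame_hold_counts(24, 60))    # [3, 2, 3, 2, ...]

# 24 fps film on a 144 Hz display: every frame held exactly 6 refreshes -> smooth again.
print(frame_hold_counts(24, 144))   # [6, 6, 6, 6, ...]
```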

Where the numbers themselves come from is more historical. For example, 30 Hz comes from old CRT TVs: in North America household power is 60 Hz, and TVs were built to draw half of the image (one interlaced field) on each cycle of the mains voltage, so a complete frame took two cycles and the frame rate came out to 30 Hz. I don’t know all the details on this, and I’m not sure where 144 Hz comes from; it might be connected to 24 Hz movies, since 144 is exactly 6 × 24, but I don’t know.
