Why do the refresh rates on computer monitors seem to always be 144hz or a multiple of 60?

4 Answers

Anonymous 0 Comments

It’s due to historical context.

60Hz became the US broadcast standard because US mains power is nominally 110VAC at 60Hz, and locking the picture rate to the line frequency prevented visible flicker and beat artifacts in early TV sets. There’s a huge amount of “made for TV” content that was recorded at 60Hz (actually 59.94Hz in the NTSC color system). In the US, 60Hz is still the standard for broadcast, and YouTube also hosts a lot of 60Hz (60 fps) content.

24Hz (24 frames per second) was the de facto standard for movies. The number isn’t inherently good or bad; it was a compromise from the early days of motion pictures, roughly the slowest rate that still gave acceptable synchronized sound while keeping film stock costs down. Even today the vast majority of movies are still shot at 24 fps.

120Hz is a sweet spot for TVs and monitors because 120 is a whole multiple of both 60 and 24. That way 60Hz and 24Hz content can be shown on a 120Hz display without worrying about weird artifacts.
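
As a quick illustration of that arithmetic (a small Python sketch of my own, not part of the original answer):

    # 120 is evenly divisible by both common source rates, so every source
    # frame is held on screen for a whole number of refreshes.
    for source_fps in (24, 60):
        refreshes_per_frame = 120 // source_fps
        print(f"{source_fps} fps on a 120Hz display: "
              f"each frame held for {refreshes_per_frame} refreshes")
    # 24 fps on a 120Hz display: each frame held for 5 refreshes
    # 60 fps on a 120Hz display: each frame held for 2 refreshes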

144Hz monitors were introduced to get a little extra performance for competitive gaming. 144 is still a whole multiple of 24, so most movies can also be rendered without artifacts. But 144 is not a multiple of 60 (144 / 60 = 2.4), so without some fancy processing or rate matching, 60Hz content can look bad on a 144Hz monitor: some frames are held for 2 refreshes and some for 3, which reads as choppy / stuttering (“judder”), and you can also see “tearing” if frame presentation isn’t synced to the refresh.
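
To make that concrete, here is a rough Python sketch (my own illustration; the hold_pattern helper is hypothetical and assumes plain frame repetition, with no adaptive sync or interpolation) that works out how many refreshes each source frame stays on screen:

    def hold_pattern(source_fps, refresh_hz, frames=8):
        """Count how many display refreshes each source frame stays on
        screen when frames are simply repeated to fill the refresh rate."""
        return [(i + 1) * refresh_hz // source_fps - i * refresh_hz // source_fps
                for i in range(frames)]

    print(hold_pattern(24, 144))  # [6, 6, 6, 6, 6, 6, 6, 6] -> even pacing, smooth
    print(hold_pattern(60, 144))  # [2, 2, 3, 2, 3, 2, 2, 3] -> uneven pacing, judder
    print(hold_pattern(60, 120))  # [2, 2, 2, 2, 2, 2, 2, 2] -> even pacing, smooth

That uneven mix of 2s and 3s is the stutter people notice; exact rate matching or a variable-refresh display (G-Sync / FreeSync) avoids it.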

So 144Hz is a compromise of sorts, favoring gaming over some broadcast TV / YouTube content.
