Eli5: Why screen refresh rates are 60Hz, 90Hz, 120Hz, and 144Hz? Why not other numbers?


6 Answers

Anonymous 0 Comments

They are multiples of the historically adopted frame rates of 30 and 24 fps. The 30 comes from the 60Hz American electrical grid frequency. If you display ordinary video on a monitor whose refresh rate is an even multiple of the source frame rate, it will look marginally smoother because each input frame gets displayed for the same number of output frames.
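As a quick sketch of why even multiples look smoother (Python, using the 24 fps and 60/120Hz numbers from above; the frame-selection rule is a simplified model of what a display pipeline does):

```python
# Distribute 24 source frames across one second of display refreshes.
# An even multiple (120 Hz) repeats every frame the same number of times;
# 60 Hz forces an uneven 3-2 pattern, which is the slight unevenness
# described above.

def repeat_pattern(source_fps, refresh_hz):
    """How many refreshes each source frame is shown for, over one second."""
    shown = [0] * source_fps
    for r in range(refresh_hz):
        # Each refresh displays whichever source frame is "current" then.
        shown[r * source_fps // refresh_hz] += 1
    return shown

print(repeat_pattern(24, 120))  # every frame shown exactly 5 times
print(repeat_pattern(24, 60))   # alternating 3, 2, 3, 2, ... ("3:2 pulldown")
```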

Anonymous 0 Comments

The screen refresh rate of modern computer displays comes from the vertical scan frequency of old analog color CRT TVs.

The NTSC standard used in places like the US had a frequency of 60Hz (technically 59.94Hz) and PAL/SECAM used in places like Europe had 50Hz.

They inherited that from the frequency of the electrical grid, which is 50Hz in much of the world and 60Hz in parts of Asia and the Americas.

Meanwhile, old analog movie reels ended up standardizing on 24 frames per second.

Since much of the old analog content was in 24fps, 25fps or 30fps, and the old TVs and computer displays worked at those frequencies, it made sense to have higher display refresh rates be multiples of those existing standards.

If you have a source that is in 30Hz, you just show each frame twice or three times rather than doing anything more complicated. (144 is 6 times 24, for example.)
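The pairings work out like this (a quick Python check over the source and refresh rates mentioned in these answers):

```python
# Which common refresh rates divide evenly by the old content frame rates?
# Sources: 24/25/30 fps film and TV; displays: the 60-144 Hz rates above.
for hz in (60, 90, 120, 144):
    for fps in (24, 25, 30):
        if hz % fps == 0:
            print(f"{fps} fps on a {hz} Hz screen: show each frame {hz // fps} times")
```

So 60Hz covers 30fps content cleanly, 120Hz covers both 24 and 30, and 144Hz is the clean multiple for 24fps film.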

Anonymous 0 Comments

It comes down to the origins of the technology itself.

Way back when people were just starting to make movies and talkies (a movie (moving picture) that also included synchronized sound, instead of just a small band playing in the background), the technology was pretty primitive. Scenes were filmed by someone literally cranking the handle on a camera, so the frame rate was just whatever speed the person spun the handle. It was then played back at a similarly imprecise rate. You will probably notice some old film has people moving at high speed, which is because the person filming was cranking slower than the film is played back today (film was expensive!).

Eventually though the movie industry figured out that 24 frames per second is about the minimum required to get the perception of smooth movement instead of a slide show of jerky jumps. Some people have a modern misconception that this is somehow the limit of human perception (it certainly is not) or that it is superior in some way (again, wrong). It is just the bare minimum, below which motion becomes jarring. It has of course become the de facto standard to this day. 144 Hz happens to be exactly 6 times that 24 fps rate, which means you can display movies at this old, slow frame rate without any uneven repetition of frames, which would introduce jumps or jerks.

Again going back into the past people were just figuring out this whole electricity thing, and they needed to decide what frequency the electrical grid and all the devices that would work with it should operate at. The US decided on 60 Hz because it was most efficient, and it probably helped that it fit nicely into our timekeeping system (60 cycles per second, 60 seconds per minute, 60 minutes per hour…). Europe though of course wanted a monopoly on equipment so they picked 50 Hz.

Eventually people started building vacuum tube televisions to display video, and this required sweeping an electron beam across the tube using electromagnets. All of this was much easier if it happened at a multiple of the grid frequency, so US television synchronized at 30 fps (later nudged to 29.97 fps so the color information could be squeezed in without interfering with the existing audio signal). Europe standardized on 25 fps since their grid frequency was 50 Hz, meaning movies filmed at 24 fps needed to be sped up about 4% for display on their equipment.
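The 4% figure is just the ratio of the two frame rates; as a worked example (Python, with a hypothetical two-hour film):

```python
# Playing 24 fps film back at 25 fps for PAL broadcast.
speedup = 25 / 24                      # ~1.042, i.e. about 4.2% faster
film_minutes = 120                     # a hypothetical two-hour film
pal_minutes = film_minutes / speedup   # ~115.2 minutes when broadcast
print(f"{(speedup - 1) * 100:.1f}% faster, "
      f"{film_minutes - pal_minutes:.1f} minutes shorter")
```

(The same speedup also raises the audio pitch slightly unless it is corrected, which is why PAL releases of films sometimes sound subtly different.)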

Modern screen refresh rates of 60, 90, and 120 Hz then are just multiples of the base 30 fps which was motivated by using AC from the grid to drive a vacuum tube display, which of course is now completely irrelevant to a liquid crystal display with an LED backlight. But as they say, “TRADITION!!”

Anonymous 0 Comments

60Hz is the lowest commonly agreed upon refresh rate at which rapid motion looks smooth – for example, the motion in first person shooters.

Yes, there are many circumstances in which video will appear smooth despite a lower refresh rate.

Yes, it is still possible to distinguish the difference between a 60Hz monitor and higher refresh rate monitors when there is high-speed motion being rendered.

Despite that, 60Hz is the generally agreed upon refresh rate at which motion will look *smooth* – not perfect – under all circumstances.

The first video cable to economically cater to people displaying high-motion video on a monitor was the DVI cable. DVI Single Link was designed to provide enough bandwidth to support 60Hz at what is basically 1080p. DVI Dual Link doubled that bandwidth, enough to support 60Hz at 2560×1600 (roughly 1440p, not quite 4k). This is why 60Hz monitors exist.

If you stick a DVI Dual Link cable on a 1080p monitor, the extra bandwidth that the cable provides allows for a maximum refresh rate of 144Hz. This is why 144Hz monitors exist.

120Hz monitors exist because early HDMI cables could support 720p monitors at 120Hz. 120Hz monitors are cheaper to produce than 144Hz monitors for a negligible difference in quality and so continue to be produced for that reason.

90 Hz monitors exist because 90 is halfway between 60 and 120. Humans like to buy products that are halfway between the “low end” and “high end” models for any given product, and thus 90Hz monitors were born.

Modern HDMI cables can support refresh rates far in excess of 144Hz at 1080p. Nonetheless, it hasn't been economical to produce monitors with refresh rates higher than 144Hz, plus the difference in perceived performance is already negligible above 120Hz. As a result, monitors that go above 144Hz have been rare and expensive.

Anonymous 0 Comments

Captain Disillusion has a pretty good [youtube video](https://youtu.be/DyqjTZHRdRs?si=Z6NWmnkEw53aocjy) on frame rates. It's 4 minutes long, worth the watch.

Anonymous 0 Comments

My old monitor (that I still have and use as a secondary screen) AND my old laptop both have 75Hz screens. I've also seen 75Hz a number of times back when I was researching cheaper monitors. I'm not sure why any particular number is chosen, but I know there is quite a large variety.