Eli5: Why screen refresh rates are 60Hz, 90Hz, 120Hz, and 144Hz? Why not other numbers?

Anonymous

It comes down to the origins of the technology itself.

Way back when people were just starting to make movies and talkies (a moving picture that also included synchronized sound, instead of just a small band playing along in the theater), the technology was pretty primitive. Scenes were filmed by someone literally cranking a handle on the camera, so the frame rate was just whatever speed that person spun the handle. It was then played back at a similarly imprecise rate. You have probably noticed that some old film shows people moving at high speed; that is because the person filming was cranking slower than the film is played back today (film was expensive!).

Eventually, though, the movie industry figured out that 24 frames per second is about the minimum required to get the perception of smooth movement instead of a jerky slide show. Some people have the modern misconception that this is somehow the limit of human perception (it certainly is not) or that it is superior in some way (again, wrong). It is simply the bare minimum below which motion becomes unpleasant to watch, and it has remained the de facto standard to this day. 144 Hz happens to be exactly 6 times that 24 fps rate, which means a screen can display movies at this old, slow frame rate without holding some frames longer than others, which would introduce visible judder.
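If you want to see that arithmetic laid out, here's a quick Python sketch (nothing display-specific, just dividing refresh rates by the film rate):

```python
# Which of these refresh rates can show 24 fps film without judder?
# A refresh rate is judder-free for a frame rate when it is an exact
# integer multiple, so every film frame is held on screen for the
# same number of refreshes.
for refresh_hz in (60, 90, 120, 144):
    repeats = refresh_hz / 24
    even = refresh_hz % 24 == 0
    verdict = "every frame held equally long" if even else "needs uneven frame timing"
    print(f"{refresh_hz:>3} Hz / 24 fps = {repeats:5.2f} -> {verdict}")
```

Running it shows 60 Hz and 90 Hz land on fractional multiples (2.50 and 3.75), which is why 24 fps film on those screens needs tricks like 3:2 pulldown, while 120 Hz and 144 Hz divide evenly.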

Going back into the past again, people were just figuring out this whole electricity thing and needed to decide what frequency the electrical grid and all the devices on it should operate at. The US settled on 60 Hz as a good engineering compromise, and it probably helped that it fit nicely into our timekeeping system (60 cycles per second, 60 seconds per minute, 60 minutes per hour…). Europe, of course, wanted its own equipment market and picked 50 Hz.

Eventually people started building vacuum tube televisions, which display a picture by sweeping an electron beam across the tube with electromagnets. All of this was much easier if it happened at a multiple of the grid frequency, so US television synchronized at 30 fps (later slowed slightly to 29.97 fps so the color subcarrier added for color television would not interfere with the sound carrier). Europe standardized on 25 fps since its grid frequency was 50 Hz, meaning movies filmed at 24 fps had to be sped up about 4% for display on European equipment.
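Both of those odd numbers fall straight out of simple math. A quick sketch (the 1000/1001 slowdown is the standard NTSC color adjustment, and the exact film speedup is closer to 4.17%):

```python
# NTSC color: the original 30 fps was slowed by a factor of 1000/1001
# so the new color subcarrier would not beat against the sound carrier.
ntsc_color_fps = 30 * 1000 / 1001
print(f"NTSC color frame rate: {ntsc_color_fps:.5f} fps")  # ~29.97003

# PAL speedup: 24 fps film is simply played at Europe's 25 fps.
speedup_percent = (25 / 24 - 1) * 100
print(f"Film-to-PAL speedup: {speedup_percent:.2f}%")  # ~4.17%
```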

Modern screen refresh rates of 60, 90, and 120 Hz, then, are just multiples of that base 30 fps, which was itself motivated by using AC from the grid to drive a vacuum tube display, something that is now completely irrelevant to a liquid crystal display with an LED backlight. But as they say, "TRADITION!!"
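One last sketch to tie it all together, showing which legacy rate each modern refresh rate lines up with (same numbers as above, purely illustrative):

```python
# Which legacy frame rates divide evenly into each modern refresh rate?
bases = {"film (24 fps)": 24, "PAL (25 fps)": 25, "NTSC (30 fps)": 30}
for hz in (60, 90, 120, 144):
    fits = [name for name, base in bases.items() if hz % base == 0]
    print(f"{hz:>3} Hz: exact multiple of {', '.join(fits) if fits else 'none'}")
```

So 60 and 90 Hz come from the NTSC lineage, 144 Hz from the film lineage, and 120 Hz happens to satisfy both.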
