Why do the majority of games aim to run at either 30 or 60 frames per second? Is there any reason those two numbers are most commonly chosen? Why not aim for 40 or 50?

4 Answers

Anonymous 0 Comments

Most consumer monitors have a refresh rate of 60 Hz. As such, a frame rate of 40 or 50 wouldn’t line up with that, and frames would either be held on screen for uneven lengths of time or the monitor would start displaying the next frame before it finishes drawing the current one (screen tearing).
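A quick way to see the mismatch is to work out where each frame lands relative to the monitor’s refreshes. The sketch below is illustrative only (real drivers and displays buffer frames in more complicated ways, and the numbers are the point): 40 fps frames don’t divide evenly into 60 Hz refresh intervals.

```python
# Illustrative sketch: which 40 fps frame is finished at each 60 Hz refresh?
FRAME_MS = 1000 / 40    # a new frame is ready every 25 ms
REFRESH_MS = 1000 / 60  # the monitor refreshes every ~16.7 ms

for refresh in range(6):
    t = refresh * REFRESH_MS
    frame = int(t // FRAME_MS) + 1  # latest frame finished by time t
    print(f"refresh at {t:5.1f} ms -> showing frame {frame}")
```

Running it shows some frames repeated for two refreshes and others shown for only one, so motion judders unevenly even before tearing enters the picture.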

Anonymous 0 Comments

Computer monitors “refresh” at a certain rate. That describes how fast they update the screen. Running the game at a frame rate faster than a monitor’s refresh rate means you have rendered frames that can’t even be displayed on the monitor, so from that perspective it’s a waste.

60 Hz (“hertz” is the unit for “times per second”) was a very common monitor refresh rate because in the US, our electrical supply alternates at 60 Hz, completing 60 full cycles per second. So monitors used that characteristic of the electrical supply to control their timing. Refreshing at a different rate would mean adding hardware to keep precise timing at that other rate.

30 FPS is also common for two reasons. One is because TVs in the US settled on a frame rate of 29.97 Hz. (The reason for that very precise number is long and technical.) The other is because `60 / 2 = 30`, so it’s still easy to use a 60 Hz electrical system to keep track of time if you’re updating at 30 Hz. (This is also why 120 Hz is a standard faster refresh rate: it’s just `60 * 2`. If 60 is the “standard” then it’s always easy to refresh at some multiple of it.)
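To make the divides-evenly point concrete, here’s a minimal sketch (hypothetical numbers, not any real display API) of how many refreshes each rendered frame would occupy on a 60 Hz screen at various target frame rates:

```python
# Minimal sketch: how long does each frame sit on a 60 Hz screen?
REFRESH_HZ = 60

def refreshes_per_frame(target_fps):
    """Number of 60 Hz refreshes each rendered frame occupies."""
    return REFRESH_HZ / target_fps

for fps in (60, 30, 20, 40):
    print(f"{fps} fps -> {refreshes_per_frame(fps)} refreshes per frame")
```

60, 30, and 20 divide 60 evenly, so every frame stays on screen for the same length of time. 40 gives 1.5: frames would have to alternate between one and two refreshes, which looks like stutter.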

This actually caused problems for games in other regions of the world. Lots of places use a 50 Hz electrical supply, and they chose their TV and monitor refresh rates accordingly, but not all video games were adjusted to match. Games that tick their logic once per displayed frame run “too slow” in those countries: they still have to count to 60 for each second of game time, but the display only hands them 50 frames per second to count!
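Here’s a minimal sketch of that frame-counting bug, assuming a game that ticks its logic once per displayed frame and hard-codes 60 ticks to the second (the loop style and numbers are illustrative, not any particular game):

```python
# Assumed game-loop style: one logic tick per displayed frame,
# with 60 ticks hard-coded to equal one second of game time.
ASSUMED_FPS = 60

def in_game_seconds(real_seconds, display_hz):
    ticks = real_seconds * display_hz  # one tick per screen refresh
    return ticks / ASSUMED_FPS         # the game's idea of elapsed time

print(in_game_seconds(1.0, 60))  # 1.0   -> correct speed on a 60 Hz display
print(in_game_seconds(1.0, 50))  # ~0.83 -> roughly 17% too slow at 50 Hz
```

That roughly 17% slowdown is exactly what unadjusted 60 Hz games exhibited on 50 Hz displays.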

So long story short: the main reason has to do with TV/monitor refresh rates, and the earliest TVs/monitors chose their refresh rates based on the electrical power systems they were connected to. I’m not exactly sure why the US chose 60 Hz, why other people chose 50 Hz, or why we didn’t just all agree to use the same standard.

Anonymous 0 Comments

Most older TVs in North America and Japan displayed 29.97 full frames per second. That’s basically 30 Hz. You wanted your game console to produce a new frame every time the screen refreshed so the game could keep up with it.

As TVs got more advanced and digital, a lot of them went up to a 60 Hz refresh rate. Again, that became a target for game consoles.

As for why these numbers: the North American and Japanese power grids operate at 60 Hz. Older devices could use the frequency of the electrical grid as their timing clock, rather than having to implement their own. The 29.97 Hz rate was originally 30 Hz but had to be adjusted slightly for colour TV.

Anonymous 0 Comments

It is worth noting that the traditional 30/60 Hz refresh rates were typically a product of CRT and early flat-panel TV sets; most modern screens are no longer tied to the power supply frequency and can run at variable refresh rates to suit the input.