Typically, the higher the frame rate, the lower the visual quality, and the same is true in reverse. Making a scene look really good takes a lot of processing power, and that lowers the framerate. Conversely, to hit the advertised 120Hz, lowering the picture quality frees up processing power and raises the fps. Consoles only have so much processing power, so they can't consistently do 4K at 120Hz. Some less demanding games can probably do both, but newer titles can't.
To render a single frame of a scene in a video game, a load of calculations have to be done. The nicer the graphics, and the higher the resolution, the more calculations have to be done to render the frame.
At 120Hz the console can only dedicate half the computation time per frame that it can at 60Hz, and a quarter of what it can at 30Hz.
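As a rough back-of-the-envelope illustration (the numbers below aren't tied to any particular console), here's a sketch of the per-frame time budget at each refresh rate, and how many pixels the GPU has to shade per second at different resolution and refresh-rate combinations:

```python
# Rough illustrative numbers, not tied to any real console hardware.

RESOLUTIONS = {"1080p": 1920 * 1080, "1440p": 2560 * 1440, "4K": 3840 * 2160}
REFRESH_RATES_HZ = [30, 60, 120]

# Time available to render a single frame at each refresh rate.
for hz in REFRESH_RATES_HZ:
    budget_ms = 1000 / hz
    print(f"{hz:>3} Hz -> {budget_ms:5.2f} ms per frame")

# Pixels the GPU must shade every second at each resolution/refresh combo.
for name, pixels in RESOLUTIONS.items():
    for hz in REFRESH_RATES_HZ:
        print(f"{name} @ {hz} Hz: {pixels * hz / 1e6:,.0f} million pixels/s")
```

Running that shows 4K at 120Hz means shading roughly eight times as many pixels per second as 1080p at 60Hz, on the same hardware, in a frame window that's only 8.3 ms long instead of 16.7 ms.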
So for developers to make games with the best possible graphics, they dial down the frame rate to allow for more computational time per frame.
It’s also not just graphics. Imagine a game with a lot of characters on screen. The game has to compute what each of those characters is doing, their movement, etc., every frame. The game might support 50 characters on screen at 120Hz and 100 at 60Hz. The 100-character version may be better gameplay-wise, so that’s what’s chosen.
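A minimal sketch of that trade-off, assuming a completely made-up fixed cost per character update and a fixed cost for the rest of the game logic:

```python
# Hypothetical numbers purely for illustration: assume each character's
# AI + animation update costs ~0.08 ms of CPU time per frame, and the
# rest of the game logic needs roughly 4 ms per frame regardless.
PER_CHARACTER_MS = 0.08
OTHER_WORK_MS = 4.0

for hz in (30, 60, 120):
    frame_budget_ms = 1000 / hz
    spare_ms = frame_budget_ms - OTHER_WORK_MS
    max_characters = int(spare_ms / PER_CHARACTER_MS)
    print(f"{hz:>3} Hz: ~{max_characters} characters fit in the frame budget")
```

With those made-up costs, the 120Hz frame only has room for around 50 characters, while the 60Hz frame fits roughly three times as many.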
Unless you’re playing competitively, or on the absolute hardest difficulties, anything above 60Hz honestly doesn’t affect your experience that much (especially with a controller).
It’s basically zero developer effort to add an option that reduces graphics for higher FPS, and that option only needs to sell a handful more copies of the game to have been worth doing.