Why can you feel the difference between 60 fps and anything higher on a 60 Hz monitor?

I thought a 60 Hz screen just can’t display a higher framerate, but apparently it can. If it can, then why buy a monitor with a higher refresh rate?

In: Technology

It can’t. Depending on how the game works, you might get some other benefits from a higher framerate (for example, in some games the fire rate of guns depends on framerate), but you won’t *see* any of those extra frames.

Think of it like looking through a window into another world. The window opens and closes 60 times per second, but the world on the other side is simulated faster, say 120 times per second. The view each time the window opens will be more accurate. If you could open and close the window faster, you would get to see all the extra detail and accuracy that is happening.

A 60 Hz monitor can’t display more than 60 frames per second unless you overclock it. You buy a monitor with a higher refresh rate if you want to see more frames per second.

You will, however, feel a difference between, for example, 60 fps and 120 fps output from your GPU even on just a 60 Hz monitor, because each refresh gets more recent data, which still feels smoother (though you may experience screen tearing).
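A rough way to see the “more recent data” point is a toy simulation: at each 60 Hz scanout, the monitor shows the newest finished frame, and that frame depicts the world as it was when the frame *started* rendering. All the numbers and the model here are a simplified sketch (no vsync or tearing details):

```python
def avg_content_age_ms(render_fps, refresh_hz=60, n_scanouts=600):
    """Average age of the on-screen image at each scanout, in ms (toy model)."""
    frame_time = 1000.0 / render_fps     # ms to simulate + render one frame
    refresh_time = 1000.0 / refresh_hz   # ms between monitor scanouts
    ages = []
    for k in range(1, n_scanouts + 1):
        t = k * refresh_time             # time of this scanout
        frames_finished = int(t / frame_time + 1e-9)
        if frames_finished == 0:
            continue                     # nothing rendered yet
        # the newest finished frame shows the world as of its render start
        content_timestamp = (frames_finished - 1) * frame_time
        ages.append(t - content_timestamp)
    return sum(ages) / len(ages)

print(round(avg_content_age_ms(60), 1))   # → 16.7 (ms of staleness at 60 fps)
print(round(avg_content_age_ms(120), 1))  # → 8.3  (fresher data at 120 fps)
```

Even though both cases show exactly 60 images per second, the 120 fps case hands the monitor an image that is on average half as stale.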

Try a 144 Hz monitor, though, and you will instantly feel a much bigger difference.

60 fps isn’t 60 fps constantly. It might be 70 most of the time but dip to 50 sometimes. It’s a random distribution, because the load on the computer changes depending on the game situation.

Having a higher average FPS means fewer moments where the *instantaneous* FPS drops below 60, which you can definitely feel. Benchmarks usually include a 90%, 99% or 99.9% minimum FPS along with the average to reflect those random changes: a 99% minimum of 60 fps means that over 100 seconds of play, we expect about 1 second below 60 fps.
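Those benchmark-style numbers come from the list of per-frame render times. A quick sketch of the idea (the frame times below are made-up sample data):

```python
# Compute average FPS and the "1% low" FPS from per-frame render times
# in milliseconds. The sample data is invented: mostly fast frames,
# plus a few slow stutter spikes.
frame_times_ms = [14.0] * 97 + [25.0, 30.0, 40.0]

avg_fps = 1000.0 / (sum(frame_times_ms) / len(frame_times_ms))

# 1% low: the FPS implied by the slowest 1% of frames
n_worst = max(1, len(frame_times_ms) // 100)
worst = sorted(frame_times_ms)[-n_worst:]
one_percent_low_fps = 1000.0 / (sum(worst) / len(worst))

print(round(avg_fps))               # → 69 (looks fine on paper...)
print(round(one_percent_low_fps))   # → 25 (...but the stutters show up here)
```

The average hides the spikes; the percentile low is what captures the dips you actually feel.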

*Latency*.

Ignore “fps” for a moment and look at its inverse – the time needed to render a frame. A game running at 60 fps needs 1000/60 ≈ 16.7 ms to render a frame from start to finish. A game running at 120 fps needs only half that time.

You can feel the difference between what you see on-screen being 16 ms out of date and 8 ms out of date, even if the display is only updated at 60 Hz.

(Disclaimer: This explanation relies on a number of assumptions related to how the driver does vsync. If you have vsync disabled or set to triple buffered, my explanation holds. It also does so if you have some sort of “smart vsync” that tries rendering as late as possible. But in most games, if you just enable vsync and nothing else, the above explanation breaks down.)