*Latency*.
Ignore “fps” for a moment and look at its inverse – the time needed to render a frame. A game running at 60 fps has 1000/60 ≈ 16.7 ms to render each frame from start to finish. A game running at 120 fps has only half that time, about 8.3 ms.
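The fps-to-frame-time conversion is just the reciprocal scaled to milliseconds. A minimal sketch (the function name `frame_time_ms` is my own, not from the original):

```python
def frame_time_ms(fps: float) -> float:
    # Frame budget in milliseconds: 1000 ms per second divided by frames per second.
    return 1000.0 / fps

for fps in (30, 60, 120, 144):
    print(f"{fps:>3} fps -> {frame_time_ms(fps):.1f} ms per frame")
# 60 fps gives roughly 16.7 ms; 120 fps gives roughly 8.3 ms.
```

Note how doubling the frame rate halves the frame time – that halved age of the on-screen image is what the rest of this answer is about.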
You can feel the difference between what you see on-screen being 16 ms out of date and 8 ms out of date, even if the display is only updated at 60 Hz.
(Disclaimer: This explanation relies on a number of assumptions about how the driver handles vsync. If you have vsync disabled or set to triple buffering, my explanation holds. It also holds if you have some sort of “smart vsync” that tries to render as late as possible. But in most games, if you just enable plain vsync and nothing else, the above explanation breaks down.)