– Can anyone explain the difference between computer display “hertz”, “refresh rate” and how it corresponds to “FPS”?


I really should know this, but I never really "grokked" the concepts as they interrelate.

Thanks for taking the time.


7 Answers

Anonymous

Ok. So when a graphics card/chip renders or creates the scene that you see on the monitor – whether it's Call of Duty or your desktop – it's writing all the pixels of that scene into a frame buffer, a dedicated portion of RAM or VRAM reserved specifically for this purpose. As soon as the scene is completely drawn, the computer flips a switch for the monitor saying "ok, draw THIS on your screen now". Until then, the monitor is just showing the OLD, previous frame. How many times per second the computer can draw the scene – your game, your desktop, your homework in Microsoft Word – is your frames per second (FPS). How fast it can do this depends on the CPU, GPU, display drivers, etc.
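Here's a minimal sketch of that idea in Python, just to make the mechanics concrete. All the names (`render_scene`, the buffer sizes) are hypothetical stand-ins – a real GPU does this in hardware, not a Python loop – but the shape is the same: draw every pixel into a back buffer, then swap it with the front buffer the monitor shows. The number of completed swaps per second is the FPS.

```python
import time

# Hypothetical tiny "screen"; real frame buffers are far larger.
WIDTH, HEIGHT = 64, 48

# Two frame buffers: the monitor shows the front buffer while the
# GPU writes the next frame into the back buffer.
front_buffer = [[0] * WIDTH for _ in range(HEIGHT)]
back_buffer = [[0] * WIDTH for _ in range(HEIGHT)]

def render_scene(buffer, frame_number):
    """Stand-in for the GPU drawing every pixel of the scene."""
    for y in range(HEIGHT):
        for x in range(WIDTH):
            buffer[y][x] = (x + y + frame_number) % 256

frame_number = 0
start = time.time()
while time.time() - start < 1.0:                 # run for roughly one second
    render_scene(back_buffer, frame_number)      # draw the new frame
    # "Ok, here, draw THIS on your screen now" – swap the buffers.
    front_buffer, back_buffer = back_buffer, front_buffer
    frame_number += 1

print(f"frames completed in ~1 second (the FPS): {frame_number}")
```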

Meanwhile, how fast the monitor can physically update is its refresh rate. If a monitor's refresh rate in hertz (which is just a fancy way of saying "times per second") is 60 Hz, that means every pixel on the monitor is getting redrawn, refreshed or changed 60 times a second.
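Put another way, the refresh rate fixes how long each refresh lasts. A quick sketch of the arithmetic (just 1 second divided by the rate):

```python
# How long one refresh lasts at a given rate: a 60 Hz panel redraws
# every pixel roughly every 16.7 ms, a 144 Hz panel every ~6.9 ms.
for hz in (60, 120, 144, 240):
    print(f"{hz:>3} Hz -> {1000 / hz:.2f} ms per refresh")
```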

Soooo. If your computer is spitting out Call of Duty at 120 fps but your monitor is only updating 60 times per second, well, it's not going to look as smooth as the computer is capable of drawing it. In fact, because the graphics card is saying "Here, show THIS; now THIS; c'mon, hurry up, no, show THIS" faster than the monitor can refresh ("hey, c'mon, slow down, too fast!"), what you see is tearing – a single refresh of the monitor receives multiple frame buffers over the course of that refresh, so one pass of the screen ends up showing parts of more than one frame. Conversely, if your monitor is 144 Hz but your computer is only spitting out 30 fps because your video card is 10 years old, that's not going to be awesome either. Games (or Windows or whatever) look best when the fps the computer generates matches the refresh rate of the monitor.
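Here's a toy illustration of that tearing, with everything simplified down to rows of letters (the row count, the frames, and the timing are all made up for the example): the "monitor" scans the frame buffer out one row at a time, and partway through the scan the "GPU" swaps in the next frame, so the picture on screen ends up containing rows from two different frames.

```python
ROWS = 8

frame_a = ["A" * 10 for _ in range(ROWS)]   # old frame already in the buffer
frame_b = ["B" * 10 for _ in range(ROWS)]   # new frame the GPU is pushing

frame_buffer = list(frame_a)
displayed = []

for row in range(ROWS):
    if row == 5:                         # GPU finishes the next frame mid-scan
        frame_buffer = list(frame_b)
    displayed.append(frame_buffer[row])  # monitor scans this row out right now

for line in displayed:
    print(line)   # top rows show frame A, bottom rows show frame B: a tear
```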

So here’s where “Vsync” comes in. It tells the game engine to restrict how fast it generates frame buffers to how fast the monitor can go. The GPU draws the scene, then waits for the monitor: “You ready for the next one? Here you go.” The downside is that you’re artificially introducing input lag – what gets sent to the monitor may no longer match what is actually going on in the game _right now_. And if the GPU finishes generating the NEXT frame before the monitor is ready to show the previous one, it might skip the previous one entirely and just show the new one, which can lead to jumpy or jerky motion.
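As a rough sketch of what a vsync-style cap does (the helper names here are hypothetical, and a real driver waits on the monitor's actual vertical blank signal rather than a timer): render the frame, then wait until the next refresh boundary before handing it over, so frames are presented at most once per refresh.

```python
import time

REFRESH_HZ = 60
FRAME_TIME = 1.0 / REFRESH_HZ   # ~16.7 ms between refreshes at 60 Hz

def render_scene(frame_number):
    """Stand-in for drawing a frame; returns the finished 'buffer'."""
    return f"frame {frame_number}"

next_vblank = time.perf_counter() + FRAME_TIME
for frame_number in range(10):
    buffer = render_scene(frame_number)           # GPU draws the scene
    sleep_for = next_vblank - time.perf_counter()
    if sleep_for > 0:
        time.sleep(sleep_for)                     # wait for the monitor's refresh
    print(f"presenting {buffer}")                 # "You ready? Here you go."
    next_vblank += FRAME_TIME
```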
