– Can anyone explain the difference between computer display “hertz”, “refresh rate” and how it corresponds to “FPS”?


I really should know this, but I never really “grokked” the concepts as they interrelate.

Thanks for taking the time.


Hertz is a unit of frequency – basically the number of cycles per second.

A 60Hz display refreshes the screen 60 times per second; a 120Hz display refreshes it 120 times per second. This is the refresh rate, and it’s the display’s functional cap on FPS. An application’s actual FPS also depends on the computer being able to output that many frames per second – if you have a potato computer hooked up to a 144Hz display, chances are it’s going to run at less than 144 FPS for most games.
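The cap described above can be sketched in a couple of lines (a toy illustration, not a real graphics API):

```python
# What you actually see is capped by whichever side is slower:
# the monitor's refresh rate or the GPU's rendered frame rate.
def perceived_fps(refresh_hz: int, rendered_fps: int) -> int:
    """Rough upper bound on distinct frames visible per second."""
    return min(refresh_hz, rendered_fps)

print(perceived_fps(144, 60))   # potato PC on a 144 Hz display -> 60
print(perceived_fps(60, 120))   # fast GPU on a 60 Hz display  -> 60
```

Either side can be the bottleneck; the slower one wins.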

The RESPONSE TIME is the time it takes a given pixel to change color and is expressed in ms.

The unit hertz takes a capital H because it is named after a person (Heinrich Hertz). It was originally used to measure radio waves, and it means cycles per second.

When used to refer to a display, it is how frequently the physical screen has its image updated. This is true even when the screen is showing motion.

Frames per second is how frequently the video card is able to rewrite the display. The fps does not change the Hertz of the screen, since the refresh rate is a fixed property of the hardware, while the video card’s frame rate is a function of both software and hardware.

The fps is based on the video card, computer and complexity of the images.

The Hertz rating belongs to the monitor alone, and it applies even when the computer is off and the monitor is showing its own boot display.

The refresh rate is how often the monitor/screen is capable of changing what it displays. That’s measured in Hertz, which is “number of times per second.” So if your monitor has a 60 Hz refresh rate, it is capable of changing 60 times per second.

The frame rate is how often your graphics card or processor is able to create a new image and send it to the screen. If your computer slows down because other processes are running or it’s overheating, the frame rate will decrease.

The key issue is how these interact. If the refresh rate is much higher than the frame rate, the monitor sits displaying a static image, waiting for a new frame to show. If the frame rate is much higher than the refresh rate, the graphics card is sending video data faster than the monitor can display it, so frames get dropped, giving a potentially jerky appearance and wasted effort on the computing side.
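The interaction above can be modeled with a toy timeline (an illustrative sketch; real display pipelines are more involved): step through one second of refreshes and count how many rendered frames ever make it to the screen.

```python
# Toy model: at each refresh tick, the monitor grabs the newest
# completed frame. Frames rendered between ticks are never shown.
def frames_shown(refresh_hz: int, frame_rate: int) -> int:
    """Count distinct frames displayed during one second."""
    shown = set()
    for i in range(refresh_hz):
        t = i / refresh_hz                  # time of this refresh
        newest_frame = int(t * frame_rate)  # newest frame done by time t
        shown.add(newest_frame)
    return len(shown)

# 60 Hz monitor, 120 fps game: half the rendered frames are wasted.
print(frames_shown(60, 120))   # 60 distinct frames displayed
# 144 Hz monitor, 30 fps game: most refreshes just repeat a frame.
print(frames_shown(144, 30))   # 30 distinct frames displayed
```

In the first case the GPU rendered 120 frames but only 60 were ever visible; in the second, 114 of the 144 refreshes showed a repeat.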

Hertz is a generic unit of something’s frequency, or how often it goes back and forth. If you stomp your foot 1 time/second, you will have a foot stomping frequency of 1 Hz.

The refresh rate of a monitor is how many times it draws a new image in a second. 60 Hz monitors draw 60 images a second, 120 Hz monitors draw 120 images a second, and so on.

Fps is how many times per second your computer updates what it sends to the monitor.

This means that both your monitor and your computer can be a bottleneck on the smoothness of your gameplay. If your computer is putting out 144 fps but you have a 60 Hz monitor, the monitor will receive different images, but you’ll still only see 60 of them a second. The same is true the other way: if you have a 144 Hz monitor but are playing a game at 30 fps, it doesn’t matter how many times the screen redraws if the picture hasn’t changed. It’d be like shooting slow-motion video of grass growing.

Ok. So when a graphics card/chip renders or creates the scene that you see on the monitor – whether it’s Call of Duty or your desktop – it’s writing all the pixels of that scene into a frame buffer – a dedicated portion of RAM or VRAM reserved specifically for this purpose. As soon as the scene is completely drawn, the computer toggles a switch for the monitor saying “ok, here, draw THIS on your screen now”. Until then, the monitor is just showing the OLD or previous frame. This drawing of the scene or desktop or your homework in Microsoft Word is your frames per second. How fast the computer can do this depends on its CPU and GPU, display drivers etc. etc.
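The “draw into a buffer, then toggle the switch” step is classic double buffering. A minimal sketch (class and names are made up for illustration):

```python
# Double-buffering sketch: the GPU draws into a back buffer while the
# monitor scans out the front one; they swap when the frame is complete.
class FrameBuffers:
    def __init__(self):
        self.front = "frame 0"   # what the monitor is currently showing
        self.back = None         # where the GPU renders the next frame

    def render(self, scene: str):
        self.back = scene        # GPU finishes drawing the new frame

    def swap(self):
        # the "ok, here, draw THIS on your screen now" switch
        self.front, self.back = self.back, self.front

bufs = FrameBuffers()
bufs.render("frame 1")           # monitor still shows "frame 0" here
bufs.swap()
print(bufs.front)                # frame 1 -- monitor now shows the new frame
```

Until `swap()` is called, the monitor keeps showing the old frame, exactly as the paragraph above describes.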

Meanwhile, how fast the monitor can physically update is its refresh rate. If a monitor’s refresh rate in hertz (which is just fancy for “times per second”) is 60 Hz, that means every pixel on the monitor gets redrawn, refreshed or changed 60 times a second.

Soooo. If your computer is spitting out Call of Duty at 120 fps, but your monitor is only updating 60 times per second, it’s not going to look as smooth as the computer is capable of drawing it. In fact, because the graphics card is saying “Here, show THIS; now THIS; c’mon hurry up, now show THIS” faster than the monitor can refresh (“hey, c’mon, slow down, too fast!”), what you see is tearing – where a single refresh of the monitor has multiple frame buffers sent to it over the course of the refresh; essentially, a single refresh of the screen shows parts of multiple frame buffers. Conversely, if your monitor is 144Hz but your computer is only spitting out 30 fps because your video card is 10 years old, that’s not going to be awesome either. Games (or Windows or whatever) look the best when the fps the computer generates matches the refresh rate of the monitor.
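The tearing case can be pictured with a toy scan-out model (purely illustrative): the monitor draws rows top to bottom, and the buffer it reads from is swapped partway through the scan.

```python
# Toy model of tearing: one refresh reads rows from the OLD frame until
# the buffer swap, then rows from the NEW frame below the tear line.
def scan_out(rows: int, swap_at_row: int, old: str, new: str):
    """Return which frame each screen row came from in one refresh."""
    return [old if r < swap_at_row else new for r in range(rows)]

refresh = scan_out(rows=6, swap_at_row=3, old="A", new="B")
print(refresh)  # ['A', 'A', 'A', 'B', 'B', 'B'] -- the tear is at row 3
```

One refresh, parts of two frames: that visible seam between them is the “tear.”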

So here’s where “Vsync” comes in. This tells the game engine to restrict its frame buffer generation to how fast the monitor can go. The GPU draws the scene, then waits for the monitor: “You ready for the next one? Here you go.” The downside is that you’re artificially introducing input lag, where what gets sent to the monitor may not match what is actually going on in the game _right now_. If the GPU finishes generating the NEXT frame before the monitor is ready to show the previous one, it might just skip that one entirely and show the new one, which can lead to jumpy or jerky motion.
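That skipping behavior can be sketched as a simple schedule (a hedged toy model, not how any real driver is implemented): frames are only handed off at refresh boundaries, and if two frames finish within one refresh interval, only the newer one is shown.

```python
# Vsync sketch: at each refresh tick, show the newest finished frame.
# A frame that is superseded before its tick is skipped entirely.
def vsync_schedule(frame_done_times, refresh_hz):
    """Map refresh ticks to frame indices actually displayed."""
    interval = 1.0 / refresh_hz
    shown, tick, last = [], interval, -1
    while tick <= max(frame_done_times) + interval:
        ready = [i for i, t in enumerate(frame_done_times) if t <= tick]
        if ready and ready[-1] != last:
            last = ready[-1]
            shown.append(last)
        tick += interval
    return shown

# Frames 0 and 1 both finish before the first 60 Hz refresh (~16.7 ms),
# so frame 0 is never shown -- that's the skipped-frame jerkiness.
print(vsync_schedule([0.005, 0.015, 0.030], 60))  # [1, 2]
```

Frame 0 was fully rendered but nobody ever saw it: wasted work on the GPU side, and a small jump in on-screen motion.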