– Can anyone explain the difference between computer display “hertz”, “refresh rate” and how it corresponds to “FPS”?


I really should know this, but I never really “grokked” the concepts as they interrelate.

Thanks for taking the time.


7 Answers

Anonymous 0 Comments

Hertz is a unit of frequency – basically the number of cycles per second.

A 60Hz display refreshes the screen 60 times per second; a 120Hz display refreshes it 120 times per second. This is the refresh rate, and it is the display’s functional cap on FPS. An application’s actual FPS also relies on the computer being able to output that many frames per second – if you have a potato computer hooked up to a 144Hz display, chances are it’s going to be running at less than 144 FPS for most games.
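If it helps to see the arithmetic, the refresh rate converts directly into a per-frame time budget – here’s a quick Python sketch:

```python
# Each refresh rate in Hz gives the monitor 1000/Hz milliseconds per frame.
for hz in (60, 120, 144):
    print(f"{hz} Hz -> {1000 / hz:.2f} ms per refresh")
```

So a 144Hz panel redraws roughly every 6.94 ms, which is the deadline the GPU has to hit to keep up.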

The RESPONSE TIME is the time it takes a given pixel to change color and is expressed in ms.

Anonymous 0 Comments

The unit hertz gets a capital H in its symbol (Hz) because it is named after a person. It was originally used as a measurement for radio waves, and it means cycles per second.

When used to refer to a display, it is how frequently the physical screen has its image updated. This is true even when the screen shows movement – it is really just redrawing a still image that many times per second.

Frames per second is how frequently the video card is able to redraw the image it sends to the display. The fps does not change the hertz of the screen: the refresh rate is a fixed property of the monitor hardware, while the frame rate is a function of software and hardware together.

The fps is based on the video card, computer and complexity of the images.

The hertz rating is just the monitor’s, and it applies even when the computer is off and the monitor is showing its own built-in display.

Anonymous 0 Comments

The refresh rate is how often the monitor/screen is capable of changing what it displays. That’s measured in Hertz, which is “number of times per second.” So if your monitor has a 60 Hz refresh rate, it is capable of changing 60 times per second.

The frame rate is how often your graphics card or processor is able to create a new image and send it to the screen. If your computer slows down because other processes are running or it’s overheating, the frame rate will decrease.

The key issue is how these interact. If the refresh rate is much higher than the frame rate, then the monitor is sitting displaying a static image and waiting for a new frame to display. If the frame rate is much higher than the refresh rate, the graphics card is sending video data to be displayed but the monitor is too slow to display it and is dropping frames, giving a potentially jerky appearance and wasted effort on the computing side.
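That interaction can be captured in a toy model (not a real display pipeline): the monitor can show at most one new frame per refresh, and it can never show more frames than the GPU actually produced.

```python
# Toy model: the distinct new frames a viewer can see each second is capped
# by BOTH the GPU's frame rate and the monitor's refresh rate.
def visible_frames_per_second(fps, refresh_hz):
    return min(fps, refresh_hz)

print(visible_frames_per_second(144, 60))   # 144 fps on a 60 Hz panel -> 60
print(visible_frames_per_second(30, 144))   # 30 fps on a 144 Hz panel -> 30
```

Whichever side is slower sets the ceiling; everything the faster side does beyond that is wasted.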

Anonymous 0 Comments

Ok. So when a graphics card/chip renders or creates the scene that you see on the monitor – whether it’s Call of Duty or your desktop – it’s writing all the pixels of that scene into a frame buffer – a dedicated portion of RAM or VRAM reserved specifically for this purpose. As soon as the scene is completely drawn, the computer toggles a switch for the monitor saying “ok, here, draw THIS on your screen now”. Until then the monitor is just showing the OLD or previous frame. How many times per second the computer can draw the scene – the game, or your desktop, or your homework in Microsoft Word – is your frames per second. How fast the computer can do this depends on its CPU, GPU, display drivers, etc.

Meanwhile, how fast the monitor can physically update is its refresh rate. If a monitor’s refresh rate in hertz (which is just fancy for “times per second”) is 60 Hz, that means every pixel on the monitor is getting redrawn, refreshed or changed 60 times a second.

Soooo. If your computer is spitting out Call of Duty at 120 fps, but your monitor is only updating 60 times per second, it’s not going to look as smooth as the computer is capable of drawing it. In fact, because the graphics card is saying “Here, show THIS; now THIS; cmon hurry up, now show THIS” faster than the monitor can refresh (“hey cmon, slow down, too fast!”), what you see is tearing – a single refresh of the monitor has multiple frame buffers sent to it over the course of the refresh, so a single refresh of the screen shows parts of multiple frame buffers. Conversely, if your monitor is 144Hz but your computer is only spitting out 30 fps because your video card is 10 years old, well, that’s not going to be awesome either. Games (or Windows or whatever) look the best when the fps the computer generates matches the refresh rate of the monitor.

So here’s where “Vsync” comes in. This tells the game engine to restrict its frame buffer generation to how fast the monitor can go. The GPU draws the scene, then waits for the monitor. “You ready for the next one? Here you go.” The downside to this is you’re artificially introducing input lag where what gets sent to the monitor may not actually match what is actually going on in the game _right now_. If the GPU finishes generating the NEXT frame before the monitor is ready to show the previous one, it might just skip that one entirely and just show the new one, which could lead to jumpy or jerky motion.
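To make the duplicate/skip behavior concrete, here’s a hypothetical little simulation (assuming perfectly regular clocks on both sides, which real hardware only approximates): each refresh shows the newest frame the GPU has finished by that instant.

```python
def frames_shown(refresh_hz, fps, refreshes=6):
    """Which GPU frame index each of the first `refreshes` monitor
    refreshes displays, assuming perfectly regular clocks on both sides."""
    shown = []
    for i in range(1, refreshes + 1):
        # Newest frame finished by refresh i; integer math avoids float drift.
        latest = (i * fps) // refresh_hz
        shown.append(latest)
    return shown

# 60 Hz monitor, GPU at 30 fps: each frame is held for two refreshes.
print(frames_shown(60, 30))    # -> [0, 1, 1, 2, 2, 3]
# 60 Hz monitor, GPU at 120 fps: every other frame is never shown at all.
print(frames_shown(60, 120))   # -> [2, 4, 6, 8, 10, 12]
```

The first case is the duplicated-frame judder, and the second is the “skip that one entirely” behavior described above.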

Anonymous 0 Comments

Hertz is a generic unit of something’s frequency, or how often it goes back and forth. If you stomp your foot 1 time/second, you will have a foot stomping frequency of 1 Hz.

A refresh rate of a monitor is how many times your monitor draws a new image in a second. 60 Hz monitors draw 60 images a second, 120 Hz makes 120 images a second, and so on.

Fps is how many times your computer refreshes what it sends to the monitor.

This means that both your monitor and your computer can be a bottleneck on the smoothness of your gameplay. If you have 144 fps coming from your computer but a 60 Hz monitor, the monitor will receive a new image far more often than it can draw one, but you’ll still only see 60 images a second. The same is true the other way: if you have a 144 Hz monitor but are playing a game at 30 fps, it doesn’t matter how many times the screen draws a new image if that image is still the same picture. It’d be like shooting slow-motion video of grass growing.

Anonymous 0 Comments

Your graphics card is a webcomic artist and your monitor is the website publisher.

Because the artist is poor, the website refreshes once a day. If the artist draws one frame per day, the timing is matched so that the new comic gets published the next day.

However, sometimes the artist has issues and takes two or more days to draw a frame. When the website doesn’t get any new uploads, it just displays the same comic as before. This can lead to timing issues as the daily comic erratically updates every 2 days, then 1, then 3, etc.

Sometimes, the artist draws rapidly, making more than one frame a day! But only one frame can be uploaded a day and only the latest frame gets uploaded. You can get into a situation where you skip displaying a frame, making the story feel like it’s jumping ahead!

Your monitor refreshes at a rate in Hertz which is a measure of “per second” much like frames per second. If your FPS goes below the refresh rate, your monitor will display the same frame more than once, sometimes leading to detectable judder. If your FPS goes above the refresh rate, the monitor can’t refresh fast enough and you lose displayed frames.

New adaptive sync methods (Freesync, Gsync) tell the monitor to refresh exactly when a new frame is ready, eliminating these timing issues.
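Here’s a simplified sketch of that timing difference (real panels also have a minimum and maximum refresh rate that adaptive sync must stay within, which this ignores):

```python
import math

def fixed_vsync_display_time(frame_ready_s, refresh_hz):
    # Classic vsync: the finished frame waits for the next fixed refresh tick.
    tick = 1.0 / refresh_hz
    return math.ceil(frame_ready_s / tick) * tick

def adaptive_sync_display_time(frame_ready_s):
    # Adaptive sync: the panel refreshes as soon as the frame is ready.
    return frame_ready_s

ready = 0.021  # frame finished 21 ms in
print(fixed_vsync_display_time(ready, 60))  # waits until the ~33.3 ms tick
print(adaptive_sync_display_time(ready))    # shown immediately at 21 ms
```

With the fixed schedule, a frame that just misses a tick sits idle for most of a refresh; with adaptive sync, that waiting (and the resulting judder) goes away.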

Anonymous 0 Comments

A device writes moving-picture content as fast as it can, with 30 frames per second (pictures per second) being a good baseline for the human eye.

The screen on which that moving-picture content is displayed updates the light that you see as quickly as it can (hertz), allowing the screen to refresh what it is being told to display. The device and the screen are separate technologies, so they operate at different rates. When the device tells the screen to update, the screen may already be mid-update, which is when you see tearing of objects on the screen (jagged edges).

Just like blinking really fast with your eyes, you miss less if you blink faster. Screens are forced to blink fast, and the faster they blink (refresh), the less you miss. The speed of the blinking is measured in Hertz (like miles per hour), and is considered the refresh rate. The device trying to TELL THE SCREEN to write a new picture, requires resources on the device. There is a limit to how many pictures that the device can create each second.

The speed at which a device can create pictures is measured in frames (pictures) per second. Those frames are shared with a screen that is blinking really fast, whose blinking is measured in refreshes per second (hertz). When you have a synchronized setup, the frames change in tandem with the screen being redrawn, which creates video without tearing. The faster the device can create frames, and the faster the display can refresh those frames, the smoother and more lifelike the video appears.