eli5 Why do some graphics cards run better at different resolutions?


So, take for example the RTX 3070. The graphics card will run a game (we'll use Ghostrunner as an example) better at 1440p and worse at 1080p.

What is the difference between 1440p and 1080p visually and mechanically?

Also, another question: why would I need a 240Hz monitor if I'm playing a game I get over 240fps in? What is the performance difference there? Why wouldn't my 144Hz monitor do the job?

In: Technology

3 Answers

Anonymous 0 Comments

First question: if you tweak settings, it will run better at 1080p than at 1440p.

Second question: your confusion comes from the belief that your screen shows every frame you get. Your screen can only show a certain number of images per second (its refresh rate); a higher fps on a lower-refresh-rate monitor just means each refresh shows slightly more current information.
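
To put rough numbers on that, here's a minimal Python sketch with made-up frame rates (not anything from the thread), comparing 150 fps and 300 fps on the same 144Hz monitor:

    # Made-up numbers: a 144 Hz monitor refreshes every ~6.9 ms.
    # Compare rendering at 150 fps vs 300 fps on that same monitor.
    for fps in (150, 300):
        frame_interval_ms = 1000 / fps
        # At each refresh, the newest finished frame is on average
        # half a frame interval old: higher fps = fresher image.
        avg_age_ms = frame_interval_ms / 2
        print(f"{fps} fps on 144 Hz: image is ~{avg_age_ms:.1f} ms old on average")

At 150 fps the image on screen is about 3.3 ms stale on average; at 300 fps it's about 1.7 ms. You never see more images per second than the refresh rate, but each one is fresher.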

Anonymous 0 Comments

To the first question, I'm assuming that card was optimized for running at 1440p, so when loaded like that it can distribute the work between its cores better (more efficiently and more evenly) than when loaded more or less than that.

To the second part, you would need a higher refresh rate to see more frames per second. If your fps is higher than your refresh rate, it doesn't matter how fast the GPU and CPU produce frames and send them to the monitor, because the monitor will only change the image it's showing so many times per second. In the end you only "see" as many fps as fit in that refresh rate.
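
In other words (a tiny illustrative Python sketch; visible_fps is a made-up helper, not a real API), what you can actually see is capped at whichever of the two numbers is smaller:

    # Illustrative only: distinct images shown per second is capped
    # by the monitor's refresh rate, no matter how fast the GPU renders.
    def visible_fps(render_fps, refresh_hz):
        return min(render_fps, refresh_hz)

    print(visible_fps(300, 144))  # -> 144: the extra frames never reach your eyes
    print(visible_fps(100, 144))  # -> 100: now the GPU is the limit, not the monitor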

Anonymous 0 Comments

Well, there's a bit going on differently at different resolutions, so let me know if I don't quite cover it. Let's take an unbalanced system as an example. Say this system has the best graphics card known to man, like ten 3090s strapped together with duct tape from the gods. On the other hand, that same computer has a CPU that hasn't seen a good day since it was made in 1995. In this case, you have a very strong GPU but a very weak CPU.

Now let's look at how computers process game information to make the frames you see on screen. Different parts of the computer do different shares of the work depending on a number of factors, but specifically let's talk resolution. The higher the resolution, the more pixels the GPU has to draw each frame, while the CPU's work per frame (game logic, physics, feeding the GPU) stays roughly the same. So if you're running at a super high resolution, your GPU is going to bear the brunt of everything, and at a low resolution the GPU finishes quickly and your frame rate becomes CPU-bound.
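
For a sense of scale, here's the raw pixel arithmetic in Python (nothing game-specific, just resolution math):

    # 1440p has ~78% more pixels than 1080p, so the GPU's share of the
    # work grows that much per frame while the CPU's share barely moves.
    pixels_1080p = 1920 * 1080   # 2,073,600 pixels per frame
    pixels_1440p = 2560 * 1440   # 3,686,400 pixels per frame
    print(pixels_1440p / pixels_1080p)  # ~1.78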

To wrap it all together, let's bring the hypothetical computer from earlier back into the equation. Since the CPU on that machine is extremely weak, it caps your frame rate no matter what you do. Since the GPU is an SCP capable of more power than Anakin Skywalker himself, you can crank the resolution as high as you want and put all that stress on the part that can handle it; running at 12K costs you almost nothing extra. But dropping to 1080p gains you almost nothing either, because the GPU finishes each frame nearly instantly and then just sits there waiting on the CPU, which keeps you stuck at the same low fps. This problem gets exacerbated in games like open world RPGs that have to push a lot of geometry and simulate a big world, as that work is taxing on your CPU as well.
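
If you want to see that bottleneck in toy form, here's a Python sketch with completely invented timings (50 ms of CPU work per frame for the ancient CPU, 1 ns per pixel for the god-tier GPU); each frame is only as fast as the slower of the two parts:

    # Toy model: frame time = max(CPU time, GPU time). All numbers invented.
    cpu_ms = 50.0               # the 1995 CPU needs 50 ms of game logic per frame
    gpu_ns_per_pixel = 1.0      # the duct-taped 3090s draw a pixel in 1 ns

    for name, w, h in (("1080p", 1920, 1080), ("1440p", 2560, 1440), ("12K", 11520, 6480)):
        gpu_ms = w * h * gpu_ns_per_pixel / 1e6
        fps = 1000 / max(cpu_ms, gpu_ms)
        print(f"{name}: GPU {gpu_ms:.1f} ms, CPU {cpu_ms:.0f} ms -> {fps:.0f} fps")

Note how 1080p and 1440p come out identical: the weak CPU caps both, which is exactly why lowering the resolution on a CPU-bound system buys you nothing. Only at 12K does the GPU finally become the slower part.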