Why do games lag when not using 100% of system resources?


You’d think that games would max out your CPU, GPU, or RAM and THEN start to have performance problems, but many games can have FPS drops while using only about 5-10% of the resources available to them.


3 Answers

Anonymous 0 Comments

Is the game internet-based or local? If it’s internet-based, it could easily be packet loss or lag between your system and wherever the servers are.

Some programs require one step to happen before another. If an early step doesn’t finish fast enough, like loading a map from your drive, everything after it has to wait; the sketch below shows the effect.
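To make that concrete, here’s a minimal Python sketch of a game loop where one blocking “map load” step stalls everything behind it. The sleeps are stand-ins for real rendering and disk work, and all the timings are invented:

```python
import time

def render_frame():
    time.sleep(0.010)   # stand-in for ~10 ms of render work (invented figure)

def load_map_chunk():
    time.sleep(0.200)   # stand-in for a ~200 ms blocking disk read (invented figure)

# Sketch: a loop that blocks on a pretend disk load at frame 60.
# The frame time spikes even though almost nothing is computing.
for frame in range(120):
    start = time.perf_counter()
    if frame == 60:
        load_map_chunk()        # one slow step the next step must wait on
    render_frame()
    frame_ms = (time.perf_counter() - start) * 1000
    if frame_ms > 20:
        print(f"frame {frame}: {frame_ms:.0f} ms -- visible stutter")
```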

A lot of other things could also be the cause.

Anonymous 0 Comments

“System resources” is a very abstract label for a lot of heterogeneous things that can affect performance in many different ways, so we’ll have to get specific.

So what causes latency (lag)? It depends on the task. Tasks can have many different performance characteristics, one of which is their “boundedness” profile.

A task that is CPU bound spends most of its time waiting on the CPU to execute instructions. On the other hand, a task that is IO (e.g., network, storage) bound spends most of its time waiting on IO.

So for example, if a task is IO bound, the fastest CPU in the world won’t make it run faster, because its nature is such that most of the time, the CPU can’t do anything but sit idle waiting on IO to finish before it can get back to executing instructions.
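A toy Python comparison of the two profiles. The busy loop and the sleep are stand-ins for real instruction work and a real network/disk wait:

```python
import time

def cpu_bound():
    # Spends its time executing instructions; a faster CPU finishes sooner.
    total = 0
    for i in range(10_000_000):
        total += i
    return total

def io_bound():
    # Spends its time waiting; the CPU idles, so a faster CPU changes nothing.
    time.sleep(1.0)  # stand-in for a network or disk wait

for task in (cpu_bound, io_bound):
    start = time.perf_counter()
    task()
    print(f"{task.__name__}: {time.perf_counter() - start:.2f} s wall time")
```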

Likewise, a task that is GPU bound is dominated by waiting on the GPU to compute stuff, and the fastest CPU in the world and more memory won’t speed it up, because those aren’t the bottlenecks.

You have to identify where the bottlenecks are. If the CPU usage is low, it’s probably because most of the time the CPU is sitting idle waiting for something else (could be network, could be disk, could be memory) to complete so it can get back to executing instructions.
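One common way to identify the bottleneck is to time each phase of a frame separately. This is just a sketch; the phase names and durations are made up for illustration:

```python
import time

# Per-phase frame timing, the usual first step in finding a bottleneck.
phases = {
    "input":   lambda: time.sleep(0.001),
    "disk IO": lambda: time.sleep(0.030),  # the hidden bottleneck
    "update":  lambda: time.sleep(0.002),
    "render":  lambda: time.sleep(0.008),
}

timings = {}
for name, work in phases.items():
    start = time.perf_counter()
    work()
    timings[name] = (time.perf_counter() - start) * 1000

# Sort phases by cost; the dominant one (here, disk IO) is obvious.
for name, ms in sorted(timings.items(), key=lambda kv: -kv[1]):
    print(f"{name:8s} {ms:6.1f} ms")
```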

Moreover, other factors at play include how modern OSes handle preemptive scheduling and allocation of resources.

For context, CPU % is just a model for how much CPU time a task gets before it gets preempted and booted to make room for another task. The OS decides how to prioritize processes when scheduling. So 100% utilization means the CPU has something scheduled to run every millisecond it’s available, but that’s up to the OS. Just because there are spare CPU cycles available doesn’t mean the OS is going to give a random task all of those cycles.
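Here’s a sketch of why the total CPU % can look low while the one thread that matters is maxed out. It assumes the third-party psutil package is installed (pip install psutil), and the exact numbers will vary by machine:

```python
import os
import threading
import psutil  # third-party; assumes `pip install psutil`

def busy():
    while True:
        pass  # saturate roughly one core

threading.Thread(target=busy, daemon=True).start()

# One saturated core on an 8-core machine reads as ~12.5% "total CPU",
# even though the thread doing the work has no headroom left at all.
per_core = psutil.cpu_percent(interval=1.0, percpu=True)
total = sum(per_core) / len(per_core)
print(f"{os.cpu_count()} cores, per-core: {per_core}, total: {total:.0f}%")
```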

Additionally, if your CPU is just slow (it can’t churn through instructions very quickly), then giving it 100% CPU time just means it gets the whole CPU to itself in terms of timeshare; that won’t help if the CPU inherently can’t process very quickly.
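A quick back-of-envelope calculation, with invented round numbers, showing that 100% timeshare can’t push FPS past what the CPU’s raw throughput allows:

```python
# Invented round figures, not measurements.
ops_per_frame = 50_000_000               # work one frame needs
cpus = {
    "slow": 1_000_000_000,               # 1 billion simple ops/s
    "fast": 5_000_000_000,               # 5 billion simple ops/s
}

for name, ops_per_sec in cpus.items():
    max_fps = ops_per_sec / ops_per_frame
    print(f"{name} CPU at 100% timeshare: at most {max_fps:.0f} FPS")
# slow CPU at 100% timeshare: at most 20 FPS
# fast CPU at 100% timeshare: at most 100 FPS
```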

Anonymous 0 Comments

Imagine you’re in a store. It’s not too busy; they have 6 staff members and 4 registers, but only 2 are in use because it’s pretty quiet. The rest are doing other jobs: stocking, cleaning, helping old people find the canned vegetables, etc.

Then you get a rush of people and the 2 open registers can’t keep up, so the other registers have to open, but now there are fewer staff to clean up the spilled artichokes on aisle 2, and Mrs Franklin can’t find anyone to point her toward the mushroom soup.

Power-saving and efficiency heuristics mean it’s very rare for the various resources to be maximally utilized; instead, there are local bottlenecks that cause delays. Disk to memory, and memory/VRAM to CPU/GPU, are pipelines that aren’t always filled to capacity, and just because there are “free cycles” doesn’t mean the work you need gets done instantly, because the inputs haven’t arrived yet.
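Here’s a toy producer/consumer pipeline along those lines: the consumer has plenty of free cycles but still waits, because the “disk” stage hasn’t delivered its inputs yet. The stage durations are invented:

```python
import queue
import threading
import time

# A two-stage pipeline: a slow producer (disk -> memory) feeding a fast consumer.
work = queue.Queue(maxsize=4)

def producer():
    for item in range(5):
        time.sleep(0.5)      # slow stage fills the pipeline late
        work.put(item)
    work.put(None)           # sentinel: no more work

threading.Thread(target=producer, daemon=True).start()

while True:
    start = time.perf_counter()
    item = work.get()        # "free cycles" are spent waiting right here
    if item is None:
        break
    waited = time.perf_counter() - start
    print(f"got item {item} after waiting {waited:.2f} s")
```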