Why do loading times and loading percentages on computers jump around rather than increasing smoothly?


I’m playing a game and the loading screen sits at 0% for 10 seconds, then jumps to 95% and stays there for another 8 seconds, and then the percentage ticks up every second until it reaches 100%.

Computers have been like this forever. Isn’t there a better way to do loading times/completion % so they’re more accurate?


9 Answers

Anonymous

Blame it on modern hardware and software!

If you go back to really, really old hardware, things were predictable. The computer only ever did one thing at a time, and it did exactly what it was asked. Both the hardware and the software were simple and predictable.

Today none of that holds true. Your game doesn’t run on the bare hardware, but on top of Windows/Mac/Linux, which have their own ideas. For performance reasons they lie to the programs running on them, for instance saying “I wrote your saved game” immediately while actually doing it later, and only if there’s enough free memory available. There are also other things going on in the background, like auto-updates, antivirus scanning, and file indexing, all of which consume resources in different amounts at different times. So applications can’t reliably predict performance anymore; it’s out of their hands.
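You can actually watch the operating system tell that particular lie. In this minimal Python sketch (the file name and size are arbitrary, and the timings depend entirely on your machine), write() returns almost immediately because the OS has only copied the data into its in-memory cache, while fsync() blocks until the data has genuinely reached the disk:

```python
import os
import time

# Write 64 MB to a file and compare "the OS said it's written"
# with "the data is actually on the disk".
data = b"\0" * (64 * 1024 * 1024)

fd = os.open("savegame.bin", os.O_WRONLY | os.O_CREAT | os.O_TRUNC)

t0 = time.perf_counter()
os.write(fd, data)   # returns as soon as the OS has cached the data in RAM
t1 = time.perf_counter()
os.fsync(fd)         # blocks until the data is really on the disk
t2 = time.perf_counter()
os.close(fd)

print(f"write() returned after {t1 - t0:.3f}s (OS says 'done')")
print(f"fsync() took another  {t2 - t1:.3f}s (actually done)")
```

A loading bar that counts the write() as finished will race ahead of reality; one that waits for fsync() will stall for an amount of time the game can’t predict.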

Even hardware is unpredictable. We have things like SSDs and shingled (SMR) hard disks, both of which do internal housekeeping operations that are invisible to the computer. An old hard disk might write at 1 MB/s, completely consistently. A modern SMR drive might write at 500 MB/s while its RAM cache has room, drop to 100 MB/s once it’s writing into its CMR cache region, then suddenly slow to a crawl if that runs out too. That’s completely out of a game’s hands, and basically unpredictable.
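Put all that together and you get exactly the pattern from the question. A typical loader gives each step a fixed share of the bar based on how long the developer expected it to take, then advances the bar as steps finish. When the real timings differ, the bar stalls and jumps. Here’s a toy sketch in Python; every name, weight, and duration is invented purely to mirror the 0%-then-95% behavior described above:

```python
import time

# Each hypothetical step: (name, share of the progress bar the
# developer assigned to it, how long it actually takes today).
steps = [
    ("check for updates",  0, 10.0),  # "instant" step that hangs -> stuck at 0%
    ("read save file",    95,  0.2),  # expected slow, actually fast -> jump to 95%
    ("load textures",      1,  1.0),
    ("build world",        1,  1.0),
    ("spawn entities",     1,  1.0),
    ("warm shader cache",  2,  1.0),
]

total = sum(weight for _, weight, _ in steps)
done = 0
for name, weight, seconds in steps:
    time.sleep(seconds)   # stand-in for the real work
    done += weight
    print(f"{100 * done // total:3d}%  {name}")
```

Run it and the bar sits at 0% for 10 seconds, leaps to 95%, then ticks up once a second to 100%. The honest fix would be to predict each step’s real duration, but as explained above, modern hardware and software make that prediction impossible, so developers settle for guesses.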
