I’m playing a game and the loading screen sits at 0% for 10 seconds, then jumps to 95% for another 8 seconds, and then the remaining percentage ticks up one point per second until it reaches 100%.
Computers have been like this forever. Isn’t there a better way to do loading times/completion percentages so they’re more accurate?
Because loading isn’t a single operation, and different operations take different amounts of time to complete.
Even for something as simple as transferring files, how do you divide up the bar? If you transfer 1,000 files ranging from 1 MB to 10 GB, do you count each file as 0.1%, or does each byte get its own share of the percentage?
If you did it by file, the bar would be obviously wrong, because the 10 GB file does not take the same amount of time as a 1 MB file. But if you do it by bytes, then the work that costs roughly the same for every file (such as looking up where it is on the disk) isn’t properly represented.
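A minimal sketch of why the weighting choice matters. The per-file overhead constant is purely illustrative (a stand-in for constant-cost work like locating the file on disk), not a value any real copier exposes:

```python
# Three ways to report progress for the same batch copy.
OVERHEAD_BYTES = 256 * 1024  # hypothetical fixed "cost" of per-file bookkeeping

def progress_by_file(files_done: int, files_total: int) -> float:
    """Each file counts equally: 1,000 files means 0.1% per file."""
    return files_done / files_total

def progress_by_bytes(bytes_done: int, bytes_total: int) -> float:
    """Each byte counts equally: one 10 GB file dwarfs a thousand 1 MB files."""
    return bytes_done / bytes_total

def progress_weighted(done_sizes: list[int], all_sizes: list[int]) -> float:
    """Weight each file by its size plus a fixed overhead, so both transfer
    time and the constant per-file work show up in the estimate."""
    cost = lambda size: size + OVERHEAD_BYTES
    return sum(cost(s) for s in done_sizes) / sum(cost(s) for s in all_sizes)

if __name__ == "__main__":
    sizes = [1_000_000] * 999 + [10_000_000_000]  # 999 small files + one huge one
    done = sizes[:999]                            # all the small files finish first
    print(f"by file:  {progress_by_file(len(done), len(sizes)):.1%}")   # ~99.9%
    print(f"by bytes: {progress_by_bytes(sum(done), sum(sizes)):.1%}")  # ~9.1%
    print(f"weighted: {progress_weighted(done, sizes):.1%}")            # in between
```

With the small files done first, a per-file bar claims the job is nearly finished while a per-byte bar says it has barely started, which is exactly the kind of mismatch that makes progress bars stall at one number and then jump.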