Simple answer is that they don’t! They guess. A computer isn’t able to know how long an action will take before it actually performs that action; it has no foresight. The best it can do is make an educated guess.
There are a bunch of different metrics we could use to do this. In general we’re going to time how long it takes to do some percentage of the operation, and then extrapolate: if it took 5 minutes to do 10%, we can guess it’ll take 50 minutes for 100%, so 50 - 5 = 45 minutes left. The problem is in the definition of “percentage of the operation”. What are we measuring, exactly?
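As a rough sketch of that extrapolation (the function and variable names here are just made up for illustration, not from any real installer):

```python
# Naive time-remaining estimate: assume progress keeps the same average rate it has had so far.
def estimate_seconds_remaining(elapsed_seconds: float, fraction_done: float) -> float:
    if fraction_done <= 0:
        return float("inf")  # no progress yet, so no basis for a guess
    estimated_total = elapsed_seconds / fraction_done
    return estimated_total - elapsed_seconds

# Example from above: 5 minutes (300 s) to do 10% -> about 45 minutes (2700 s) left.
print(estimate_seconds_remaining(300, 0.10))  # 2700.0
```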
In older systems, the easy answer was that a program contains more than one file. So if we are installing a program made up of 1,500 files, we can just say that 500 files is one third of the program installed, and use that for timing. The problem, of course, is that files aren’t all the same size. This is why loading bars on older software will often get stuck at certain percentages: if a program is 1000 MB across a thousand files, but one of those files is 500 MB, file count is a very poor metric. It is, however, easy to implement, and doesn’t require the computer to work very hard to give you an estimate.
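A minimal sketch of the file-count approach, assuming a hypothetical list of files to copy and a destination folder (not any real installer’s API):

```python
import shutil
import time

def install_by_file_count(files: list[str], destination: str) -> None:
    """Copy files one by one, treating each file as an equal slice of progress."""
    start = time.monotonic()
    for i, path in enumerate(files, start=1):
        shutil.copy(path, destination)
        fraction_done = i / len(files)          # e.g. 500 of 1500 files -> "1/3 done"
        elapsed = time.monotonic() - start
        remaining = elapsed / fraction_done - elapsed
        print(f"{fraction_done:.0%} done, roughly {remaining:.0f}s left")
```

With one 500 MB file hiding among a thousand tiny ones, this estimate will sail along quickly and then hang on that single file, which is exactly the stuck-progress-bar effect described above.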
More complex solutions measure file size directly: we know that our install size is 1000 MB, and if we’ve written 500 MB we know we’re halfway done. The problem is that this is harder to measure, requiring the computer to continuously check how much has been written to disk. This can make installation slower, though on more modern machines that isn’t much of a concern. It is still not a perfect metric, either: some parts of the program may take longer to write because of where on the disk they’re being saved, slowdowns from other programs, or just random chance.
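A sketch of the byte-count version, where `source_chunks`, `total_bytes`, and `write_chunk` are placeholders I’ve invented for illustration:

```python
import time
from typing import Callable, Iterable

def install_by_bytes(source_chunks: Iterable[bytes], total_bytes: int,
                     write_chunk: Callable[[bytes], None]) -> None:
    """Track progress by bytes written rather than files finished."""
    written = 0
    start = time.monotonic()
    for chunk in source_chunks:
        write_chunk(chunk)
        written += len(chunk)
        fraction_done = written / total_bytes   # 500 MB of 1000 MB -> 50% done
        elapsed = time.monotonic() - start
        remaining = elapsed / fraction_done - elapsed
        print(f"{fraction_done:.0%} done, roughly {remaining:.0f}s left")
```

This tracks the actual work more closely than counting files, but the estimate still wobbles whenever the write speed changes.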
A mixture of these methods, and others, can be used. In the end, as long as we can give a “good enough” guess, it doesn’t need to be 100% accurate.
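One common trick for making the combined guess feel less jumpy (my own addition here, not something claimed above) is to smooth the measured rate over time, for example with an exponential moving average:

```python
def smoothed_estimate(samples: list[tuple[float, float]], alpha: float = 0.3) -> float:
    """samples: (elapsed_seconds, fraction_done) readings taken as the install runs."""
    smoothed_rate = None
    for elapsed, fraction in samples:
        if elapsed <= 0 or fraction <= 0:
            continue
        rate = fraction / elapsed                  # average progress per second so far
        smoothed_rate = rate if smoothed_rate is None else alpha * rate + (1 - alpha) * smoothed_rate
    if not smoothed_rate:
        return float("inf")
    last_elapsed, last_fraction = samples[-1]
    return (1 - last_fraction) / smoothed_rate     # rough seconds remaining

# Three readings over three minutes -> one blended guess instead of three jumpy ones.
print(smoothed_estimate([(60, 0.05), (120, 0.12), (180, 0.17)]))
```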