Say you’re making a program (app, game, whatever) and you want to show the user that behind the scenes *something* is happening so they don’t think the program is frozen or locked up. You could go with a spinning circle or hourglass; that at least tells them something is happening, but it doesn’t give the impression that the process has an *end* or *finish* that it’s working toward.
So you put in a progress bar. Great. Except… How do you know exactly how much progress to show? How far along is it? You’ve got to measure something.
Maybe you’re copying files. The easiest thing to do is count how many files you’re copying, divide up the progress bar into that many equal sections, then every time a file finishes copying you move the bar along a chunk.
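If you want to picture it in code, here’s a minimal sketch of that count-the-files version (Python, with the function name and the printed percentage standing in for a real progress bar, just for illustration):

```python
import shutil
from pathlib import Path

def copy_with_file_count_progress(files, dest_dir):
    """Naive approach: every file advances the bar by the same amount,
    no matter how big that file actually is."""
    total = len(files)
    for i, src in enumerate(files, start=1):
        shutil.copy(src, Path(dest_dir) / Path(src).name)
        print(f"Progress: {i / total * 100:.0f}%")
```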
But there’s a problem with that. What if you have 9 teeny tiny files, like text files that have config info or key bindings or something, and one HUGE file that has all the graphical textures or music or something? You’ve got 10 files, but the 9 small ones will jump the progress bar ahead quickly, and it’ll sit nearly still for a long time on the big one. If the big one is near the start, the bar crawls at first, then jumps from 10% to 100% super fast. If it’s at the end, the first 90% happens instantly but the last 10% takes **forever** to finish.
So you need something better. With files, maybe you could add up the total size of everything in MB, then divide the bar up by that. But then you need to spend extra time gathering that info and you need a way to know how much of a file has been copied. That’s more complex. It’ll make the bar behave better, but if you tell the boss that you can have the first way done this morning and the second way done by the end of the week, he’ll be like “It’s a progress bar. Who cares? Just get it done quick, the project is already behind”…
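A rough sketch of what that byte-counting version might look like (still Python, still a made-up function name; copying in chunks yourself is what lets you report progress partway through a huge file):

```python
import os
from pathlib import Path

CHUNK_SIZE = 1024 * 1024  # copy 1 MB at a time so the bar can move mid-file

def copy_with_byte_progress(files, dest_dir):
    """Weight the bar by bytes instead of by file count,
    so one huge file no longer stalls it."""
    total_bytes = max(sum(os.path.getsize(f) for f in files), 1)  # the extra up-front pass
    copied = 0
    for src in files:
        dst = Path(dest_dir) / Path(src).name
        with open(src, "rb") as fin, open(dst, "wb") as fout:
            while chunk := fin.read(CHUNK_SIZE):
                fout.write(chunk)
                copied += len(chunk)
                print(f"Progress: {copied / total_bytes * 100:.1f}%")
```

More code, more places for it to break, and now you’re doing the copy yourself instead of handing the whole thing to the OS — that’s the “more complex” part.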
And that’s if you’re counting file copying. Other operations and processes have their own challenges finding things to count to indicate how far along they are.
So you think “wait, I could count TIME and use that!” You run the process once, time how long it takes in seconds, and tell the bar that it takes X seconds to go from 0 to 100.
But then what happens if the client/user’s computer is slower or faster at the process than yours? You could have a progress bar that gets “stuck” at 100% for two minutes because it counted out the three minutes it took on your machine, but it takes five on theirs. Or the process could finish when the bar is at 50% because their computer is super fast. What do you do then? Do you code a way to see if the process is done and just skip to 100% if it finishes early? Or do you just let the bar count out its three minutes no matter how long the work actually takes?
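One common compromise looks roughly like this (a sketch only; the function name and the expected-seconds guess are made up): let elapsed time drive the bar, cap it at 99% so it can’t claim to be done early, and snap to 100% the moment the real work actually finishes.

```python
import threading
import time

def run_with_time_based_progress(work, expected_seconds):
    """Drive the bar purely off elapsed time: cap it at 99% so a slow
    machine never shows a 'finished' bar that's still working, and jump
    to 100% as soon as the real work is done, even if the guess was off."""
    worker = threading.Thread(target=work)
    start = time.monotonic()
    worker.start()
    while worker.is_alive():
        elapsed = time.monotonic() - start
        print(f"Progress: {min(elapsed / expected_seconds * 100, 99):.0f}%")
        time.sleep(1)
    worker.join()
    print("Progress: 100%")  # the work is actually done, whatever the estimate said

# A "fast computer": the work takes 5 seconds but we guessed 10,
# so the bar crawls to about 50% and then snaps straight to 100%.
if __name__ == "__main__":
    run_with_time_based_progress(lambda: time.sleep(5), expected_seconds=10)
```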
Hopefully this look behind the scenes of building a program with a progress bar sheds some light on why it’s so hard to make a smooth and accurate one, and why so many of them suck.
And of course, relevant XKCD from the days when the Windows File Copy dialog counted the old way (# of files) instead of the new way (% of bytes total):
https://xkcd.com/612/