I don’t know if I’m explaining this right…
A computer can run logic at some speed based on how powerful its components are. So if it can perform the logic of something, for example movement in a game, how does it know how much should be done based on its power, instead of essentially running in “fast-forward” or, conversely, in slow motion?
First, let’s reframe the question: what you’re really asking about is the “clock speed” of a computer. Simply put, a “thing” in a computer happens with each clock “tick.” When you see a “5 gigahertz” computer, that means it can do 5 billion “things” in one second. (These are very small things, not ‘move a character from point A to B,’ more like ‘add two numbers.’)
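To see why that matters for your question, here’s a tiny sketch in Python (not from the answer above, just an illustration): it counts how many very small “things” one machine can do in a second. Run it on a faster computer and the count is higher, so anything that said “move one step per iteration” would run in fast-forward there.

```python
import time

# Count how many tiny "things" (loop iterations) this machine does in one second.
# A faster CPU finishes more iterations, so a game that moved its character
# "one step per iteration" would run faster on that machine.
count = 0
end = time.perf_counter() + 1.0
while time.perf_counter() < end:
    count += 1  # one very small "thing"

print(f"this machine did about {count:,} loop iterations in one second")
```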
So your real question is: “How do computers with different clock speeds play games at the same rate?” The answer is that they use actual time to figure out how quickly to move things in games, animations, or videos, rather than relying on clock speed.
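Here’s what “use actual time” looks like in practice, as a minimal Python sketch (the speed of 100 units per second, the 10-frame loop, and the ~16 ms sleep are made-up example values): the game measures how much real time passed since the last frame and scales the movement by that, so the distance covered per second comes out the same on a slow or fast machine.

```python
import time

speed = 100.0   # units per second (an assumed example value)
position = 0.0

last_time = time.perf_counter()
for _ in range(10):               # stand-in for a game's main loop
    now = time.perf_counter()
    dt = now - last_time          # real seconds elapsed since the previous frame
    last_time = now

    position += speed * dt        # same distance per real second on any machine

    # ... drawing the frame would go here ...
    time.sleep(0.016)             # pretend this frame took about 16 ms (~60 FPS)

print(f"position after the loop: {position:.1f}")
```

On a machine that runs the loop twice as often, dt is half as large each frame, so the character still ends up moving 100 units per second either way.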
It’s actually a very good question, because games used to tie movement to the computer’s clock speed; when those games became unplayable on faster computers, developers switched to using actual time.