I don’t know if I’m explaining this right…
A computer can run logic at some speed based on how powerful its components are. So if it can perform the logic of something, for example movement in a game, how does it know how much should be done for its level of power, instead of essentially running in “fast-forward” or, conversely, in slow motion?
Imagine starting math class on the first day of school and the teacher hands everyone a thick pile of printouts containing every day’s homework assignments for the whole year. If you’re super good and super fast at math you might get to work and finish early – maybe even weeks or months early.
Now imagine starting a math class and on the first day of school the teacher hands everyone a single homework assignment. On the second day they hand out a second assignment, on the third day the third, and so on. There is no way for you to finish early (no matter how good or fast at math you are) if this is how the teacher hands out the homework. You might finish each assignment in 5 seconds instead of 5 minutes… but there’s no way you’ll get out of sync with the rest of the class.
————————————————————————–
Videogames and audio/video playback don’t have “fast-forward” problems because they are programmed to “hand out assignments” at predetermined, well-scheduled intervals, like in the second example.
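In code, that “one assignment per day” idea usually looks like a frame-timed loop. Here’s a rough sketch in Python (the `update`/`render` callbacks and the 60 fps target are just placeholders for illustration, not any particular engine’s API): the loop measures how much real time actually passed, scales movement by that amount, and sleeps away whatever time a fast machine has left over.

```python
import time

def run_game_loop(update, render, target_fps=60):
    """Run update/render on a fixed schedule so game speed does not
    depend on how fast the hardware is (hypothetical helper names)."""
    frame_time = 1.0 / target_fps           # how long one "assignment" should take
    previous = time.perf_counter()
    while True:
        now = time.perf_counter()
        dt = now - previous                 # real time elapsed since the last frame
        previous = now

        update(dt)   # movement scales by dt, e.g. position += speed * dt
        render()

        # Sleep off whatever is left of the frame budget, so a fast machine
        # waits instead of racing ahead -- no homework is handed out early.
        leftover = frame_time - (time.perf_counter() - now)
        if leftover > 0:
            time.sleep(leftover)
```

A faster computer just spends more of each frame sleeping; the amount of game movement per second stays the same.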
—————————————————————————————————————-
However, if you emulate some very old videogames on modern hardware you can sometimes run into the exact kind of “fast-forward” problem you describe.
Usually it happens because the game was only ever meant to run on one very specific piece of console hardware, so no assignment-schedule programming was done: “handing out the whole pile of homework” at once was slightly easier to program and could be assumed to run the same way on that hardware every single time.
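A very rough sketch of the difference (the 60-frames-per-second figure and `step_one_frame` are made-up stand-ins, not any real emulator’s interface): the old game’s own loop just churns through frames as fast as the hardware allows, so the emulator has to add the waiting itself.

```python
import time

CONSOLE_FRAME_SECONDS = 1.0 / 60.0   # assume the original console drew 60 frames per second

def emulate_naively(step_one_frame):
    """The old game's own loop: do the next frame's work immediately.
    On the original hardware each pass took exactly one frame; on a modern
    CPU it finishes far sooner, so the game runs in fast-forward."""
    while True:
        step_one_frame()

def emulate_throttled(step_one_frame):
    """What emulators add: after each frame's work, wait out the rest of
    the time the original hardware would have spent on it."""
    while True:
        started = time.perf_counter()
        step_one_frame()
        remaining = CONSOLE_FRAME_SECONDS - (time.perf_counter() - started)
        if remaining > 0:
            time.sleep(remaining)
```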