I don’t know if I’m explaining this right…
A computer runs logic at a speed determined by how powerful its components are. So if it can perform the logic for something, for example movement in a game, how does it know how much work should be done in a given moment, instead of essentially running in “fast-forward” on a powerful machine or in slow motion on a weak one?
Games run in ticks: small slices of time. In the old days, a tick was simply one frame, so the simulation advanced once per rendered frame. That’s why old Bethesda games and some bad console ports run sped up if you uncap the frame rate.
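Here’s a rough sketch (Python, with made-up names, not anything from a real engine) of the difference: per-frame movement covers more distance the faster the machine renders, while scaling movement by the real time elapsed since the last frame (“delta time”) gives the same result at any frame rate.

```python
import time

# Rough sketch: why per-frame logic speeds up with frame rate, and how
# scaling by the real time elapsed ("delta time") fixes it.

speed_per_frame = 0.1        # old style: move 0.1 units every frame
speed_per_second = 6.0       # new style: move 6 units every real second

pos_frame_based = 0.0
pos_time_based = 0.0

previous = time.perf_counter()
for _ in range(60):                          # stand-in for one second's worth of frames
    now = time.perf_counter()
    dt = now - previous                      # real seconds since the last frame
    previous = now

    pos_frame_based += speed_per_frame       # covers twice the distance if FPS doubles
    pos_time_based += speed_per_second * dt  # same distance at any FPS

    time.sleep(1 / 60)                       # pretend to render a frame at ~60 FPS

print(pos_frame_based, pos_time_based)
```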
Nowadays most engines define their ticks in real time, e.g. 40 ticks a second, 100 ticks a second, and so on. In the really old days ticks were tied to CPU clock cycles, but that’s no longer the case.
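And here’s a rough sketch of that modern fixed-tick approach, assuming a made-up placeholder function update_game(): the loop banks the real time that has passed and only advances the simulation in fixed-size slices, so the game logic runs at the same speed no matter how fast the hardware is.

```python
import time

# Rough sketch of a fixed-timestep ("tick") loop. update_game() is a
# made-up placeholder for whatever advances the simulation by one tick.

TICK_RATE = 40               # ticks per second, like the "40 ticks" above
TICK_DT = 1.0 / TICK_RATE    # length of one tick in seconds

def update_game(dt):
    pass                     # placeholder: physics, movement, AI, etc.

accumulator = 0.0
previous = time.perf_counter()

for _ in range(200):                    # stand-in for the real game loop
    now = time.perf_counter()
    accumulator += now - previous       # bank the real time that has passed
    previous = now

    # Run as many fixed-size ticks as the banked time allows. A fast PC
    # renders many frames between ticks, a slow PC renders few, but the
    # simulation itself always advances at exactly 40 ticks per second.
    while accumulator >= TICK_DT:
        update_game(TICK_DT)
        accumulator -= TICK_DT

    time.sleep(0.005)                   # pretend to render a frame
```

That’s also why uncapping the frame rate doesn’t break anything with this setup: extra frames just redraw the world between ticks instead of speeding the simulation up.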