I don’t know if I’m explaining this right…
A computer runs logic at a speed determined by how powerful its components are. So if it can perform the logic of something, for example movement in a game, how does it know how much work to do per unit of real time based on its power, instead of essentially running in “fast-forward” on fast hardware, or conversely in slow motion on slow hardware?
IIRC, games have an update function: a part of the game’s code that executes as fast as possible, once per frame. There’s also fixed update, which runs every X milliseconds on a timer regardless of frame rate. And there’s deltaTime, which is how long the last frame actually took; by multiplying movement by deltaTime, the game moves things further on slow frames and less on fast ones, so gameplay speed stays constant without needing a fixed update rate.
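A rough sketch of the deltaTime idea in Python (not any particular engine’s API; the names and numbers here are made up for illustration): speed is defined in units per second, and each frame scales the movement by however long the previous frame took, so the result comes out the same on fast and slow machines.

```python
import time

# Toy example, not a real engine API: names and numbers are illustrative.
position = 0.0
speed = 100.0  # pixels per SECOND, not per frame

previous = time.perf_counter()
for frame in range(10):          # a few frames of a pretend game loop
    now = time.perf_counter()
    delta_time = now - previous  # seconds the last frame actually took
    previous = now

    # Scaling by delta_time makes movement frame-rate independent:
    # a fast machine takes many small steps, a slow one takes a few big
    # steps, but the distance covered per real second is the same.
    position += speed * delta_time

    time.sleep(0.016)  # stand-in for rendering work (~60 fps)

print(f"position after ~0.16s: {position:.1f}")
```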
This used to be a real problem back in the day: some games were written for specific systems, and when they were later run on faster processors, the gameplay sped up with them.
If it weren’t for these techniques, something like Atari’s Breakout would run at 9000x speed on modern hardware.
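To make the fixed-update part concrete, here’s a minimal fixed-timestep sketch (again a toy, assuming a made-up 60-ticks-per-second rate, not any real game’s code): physics advances in fixed 1/60-second steps no matter how fast the surrounding loop spins, which is exactly what stops the 9000x-speed problem.

```python
import time

FIXED_STEP = 1.0 / 60.0  # hypothetical physics tick: 60 updates per second

ball_x = 0.0
ball_speed = 50.0  # units per second

accumulator = 0.0
previous = time.perf_counter()
for _ in range(30):  # a few iterations of a toy loop
    now = time.perf_counter()
    accumulator += now - previous
    previous = now

    # Run as many fixed-size steps as the elapsed time allows.
    # Whether the outer loop runs at 30 fps or 3000 fps, physics
    # still advances exactly 60 times per simulated second.
    while accumulator >= FIXED_STEP:
        ball_x += ball_speed * FIXED_STEP
        accumulator -= FIXED_STEP

    time.sleep(0.01)  # stand-in for rendering / other frame work

print(f"ball_x after ~0.3s: {ball_x:.1f}")
```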
As for videos, you’d have to ask someone who codes in that domain. Most apps (I think) use ffmpeg, a library that handles most of that; the ones that don’t, I have no idea about.