I’m guessing modern frameworks have this built-in so developers don’t really have to worry so much, but how is it achieved? For example, if I want a character to move at a certain pace no matter what hardware is running, or how much is happening on the screen at the time, how does the code control that?
You base the speed on some kind of timer instead of tying it to CPU clock speed or framerate (looking at you, Skyrim!).
The functions are pretty much built-in these days.
Some older games definitely depended on CPU clock speed though.
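The basic idea above (scale movement by elapsed real time, not by frames) is usually called delta timing. Here's a minimal sketch in Python; the `simulate` function and the 100-units-per-second speed are made up for illustration, but the pattern is the same one game loops use:

```python
import time

def simulate(duration_s):
    """Move a character at a fixed speed in units/second,
    regardless of how many loop iterations (frames) occur."""
    speed = 100.0            # units per second (arbitrary example value)
    position = 0.0
    start = time.perf_counter()
    previous = start
    while previous - start < duration_s:
        now = time.perf_counter()
        dt = now - previous  # real seconds since the last "frame"
        previous = now
        position += speed * dt  # distance covered scales with real time
    return position
```

However fast or slow the loop spins, `position` ends up near `speed * duration_s`, because each step multiplies by the actual elapsed time rather than assuming a fixed frame length.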
EDIT: Here’s an example if you’re interested: https://gafferongames.com/post/fix_your_timestep/
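The article's core pattern is a fixed timestep: accumulate real elapsed time each frame, then advance the simulation in fixed-size ticks so physics stays deterministic no matter the frame rate. A rough sketch (function name and values are mine, not from the article):

```python
def fixed_step_update(frame_dt, accumulator, position, velocity,
                      step=1.0 / 60.0):
    """Consume real elapsed time in fixed-size physics ticks.

    frame_dt:    real seconds since the last rendered frame (variable)
    accumulator: leftover time not yet simulated
    step:        fixed physics tick length (here 1/60 s)
    """
    accumulator += frame_dt
    # Run as many whole physics ticks as the accumulated time allows;
    # any remainder carries over to the next frame.
    while accumulator >= step:
        position += velocity * step  # one deterministic physics tick
        accumulator -= step
    return accumulator, position
```

Because every tick uses the same `step`, the simulation produces identical results for the same total time, whether the renderer delivered four long frames or forty short ones.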
EDIT 2: Absolute accuracy also isn’t needed the way it would be on, say, a solar probe, where every clock cycle counts: https://arstechnica.com/features/2020/10/the-space-operating-systems-booting-up-where-no-one-has-gone-before/. Small timing errors that are imperceptible to humans are fine.