How do game developers make games run at a certain speed when they have to run on different hardware?


I’m guessing modern frameworks have this built in so developers don’t really have to worry so much, but how is it achieved? For example, if I want a character to move at a certain pace no matter what hardware the game is running on, or how much is happening on screen at the time, how does the code control that?


4 Answers

Anonymous 0 Comments

You base the speed on some kind of timer instead of tying it to the CPU clock speed or the framerate (looking at you, Skyrim!).

The timing functions are pretty much built in these days.

Some older games definitely depended on CPU clock speed though.

EDIT: Here’s an example if you’re interested: https://gafferongames.com/post/fix_your_timestep/

EDIT 2: Absolute accuracy also isn’t needed the way it would be on, say, a solar probe where every clock cycle counts: https://arstechnica.com/features/2020/10/the-space-operating-systems-booting-up-where-no-one-has-gone-before/. Small errors that are imperceptible to humans are fine.
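Here’s a minimal Python sketch of the fixed-timestep idea from that first link. Everything in it (update, render, FIXED_DT) is an illustrative placeholder, not any particular engine’s API:

```python
import time

FIXED_DT = 1.0 / 60.0  # simulate in fixed 1/60-second steps

def update(dt):
    pass  # advance the game state by dt seconds

def render():
    pass  # draw the current state

previous = time.perf_counter()
accumulator = 0.0

while True:  # a real loop would also poll input and have an exit condition
    now = time.perf_counter()
    accumulator += now - previous
    previous = now

    # Run however many fixed steps the elapsed real time calls for,
    # so the simulation follows the clock, not the framerate.
    while accumulator >= FIXED_DT:
        update(FIXED_DT)
        accumulator -= FIXED_DT

    render()
```

On a fast machine the inner loop runs once (or not at all) per frame; on a slow one it catches up with several steps, but the game clock advances at the same real-time rate either way.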

Anonymous 0 Comments

Back in the day, game speed used to be tied to CPU speed, which was a major problem when playing on newer hardware. That’s why the turbo button existed on older computers: it let the user switch to a slower speed so old games would be playable. Modern software uses a real-time clock to keep time instead of the CPU.

Anonymous 0 Comments

We use a variable named delta, which is a measure of how long the last frame took to render. Most of the time it’s 1/60 of a second. We use it in everything movement based: if your character should move at 100 pixels per second but a frame takes 1/24 of a second, you multiply the movement by delta so each frame covers the right fraction of that distance. The formula normally looks like

velocity = speed * input_vector * delta

In this, speed is whatever you set your movement speed to. input_vector is what the player is pressing, a vector between (-1, -1) and (1, 1). And delta, again, is how long the frame took: normally a value of about 0.016, but it does vary; if you were running at 24 fps, your delta would be about 0.042.
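A rough Python sketch of that formula, with made-up names (nothing engine-specific); input_vector is the player’s directional input and delta is the last frame’s duration in seconds:

```python
SPEED = 100.0  # movement speed in pixels per second

def step(position, input_vector, delta):
    # Scale by delta so the character covers the same distance per
    # real-time second whether a frame takes 1/60 s or 1/24 s.
    vx = SPEED * input_vector[0] * delta
    vy = SPEED * input_vector[1] * delta
    return (position[0] + vx, position[1] + vy)

# At 60 fps (delta ≈ 0.016) each frame moves ~1.6 px;
# at 24 fps (delta ≈ 0.042) each frame moves ~4.2 px: same speed per second.
```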

Anonymous 0 Comments

Unity enthusiast here.

This particular engine has something called deltaTime, which is the time that elapsed since the last frame. Since a video game is just a loop with one iteration per frame, multiplying values by deltaTime makes them framerate-independent.

If I write code like “if the W button is pressed, move 1 cm forward”, that gives me a speed of 30 cm/second if the game is running at 30 FPS, or 60 cm/sec if it is 60 FPS. Number of FPS = number of cm per second.

But I can also do: if the W button is pressed, move (50 * deltaTime) cm forward. This results in constant movement at 50 cm/second, regardless of framerate.
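For illustration, here’s that contrast as a Python sketch (delta_time stands in for Unity’s Time.deltaTime; the function names are made up):

```python
def update_framerate_dependent(position, w_pressed):
    if w_pressed:
        position += 1.0  # 1 cm per frame: actual speed scales with FPS
    return position

def update_framerate_independent(position, w_pressed, delta_time):
    if w_pressed:
        position += 50.0 * delta_time  # 50 cm per second at any FPS
    return position
```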

Not every dev thinks about it, unfortunately. E.g. Fallout 76 was purposely locked by the devs at 30 FPS. Movement was framerate-dependent, so when players modded the game and unlocked higher FPS values, they were zooming around the map a few times faster.