I don’t know if I’m explaining this right…
A computer can run logic at some speed based on how powerful its components are. So if it can perform the logic of something, for example movement in a game, how does it know how much should be done based on its power, instead of essentially running in "fast-forward" or, conversely, in slow motion?
So a computer keeps exact time via an RTC (real-time clock), which is a fancy name for a vibrating quartz crystal separate from the CPU itself.
As a program runs, it has something called a "tick rate", which is basically the rate at which a particular loop runs. The exact amount of time each pass of that loop takes is measured using that RTC. For things like videos or games, or anything else that needs to happen at a speed suited to human interaction, that loop only triggers the next step of something when doing so lines up with the tick rate, as measured by the RTC.
Basically the program goes "OK, this tick took this long… and I know I need to wait only BLAH amount of time to show the next frame… OK, loop, go around 5 more times but don't do anything, and then come back to me".
It’s called a game loop.
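A minimal sketch of that wait-out-the-remainder idea, assuming a fixed 60-tick-per-second target (the names `updateGame` and `drawFrame` are placeholders, not any real engine's API):

```
// Run game logic at a fixed 60 ticks per second: measure how long the
// work took, then sleep away whatever is left of the tick.
const TICK_MS = 1000 / 60; // target tick length, about 16.7 ms

const sleep = (ms: number) => new Promise((resolve) => setTimeout(resolve, ms));

function updateGame() { /* advance the game state by one tick */ }
function drawFrame() { /* render the current state */ }

async function gameLoop() {
  while (true) {
    const start = Date.now();           // ask the clock for the time
    updateGame();
    drawFrame();
    const elapsed = Date.now() - start; // how long this tick actually took
    if (elapsed < TICK_MS) {
      await sleep(TICK_MS - elapsed);   // "wait BLAH amount of time"
    }
  }
}

gameLoop();
```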
Think of it as a chess game. There is a set time (for example, 1/60 of a second per turn) at which the chess players switch turns. They don't want to go any faster, so that the game speed stays consistent and the players can interact with each other in a predictable way (predictable lag).
One player (the computer) does several different things (rendering things on screen, preparing for possible interactions) in preparation for the other player (you). If it is quicker (more processing power), it can provide more information before your turn starts (more graphical frames per second, for example).
Then it's your turn, and an internal ticker moves to the next "tick" in the loop, starting the process again.
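That turn-taking can be written down as the classic fixed-timestep loop: logic always advances in equal 1/60-second "turns", and a faster machine simply draws more frames in between. A rough browser-flavored sketch (`updateGame` and `render` are stand-ins, not real APIs):

```
// The simulation always advances in fixed 1/60 s "turns"; rendering
// happens once per animation frame, however fast the machine is.
const STEP = 1 / 60; // seconds per logic turn

function updateGame(dt: number) { /* advance simulation by dt seconds */ }
function render() { /* draw the current state */ }

let previous = performance.now() / 1000;
let accumulator = 0;

function frame() {
  const now = performance.now() / 1000;
  accumulator += now - previous; // real time elapsed since last frame
  previous = now;

  // Take as many fixed turns as the elapsed real time has earned.
  while (accumulator >= STEP) {
    updateGame(STEP);
    accumulator -= STEP;
  }

  render();
  requestAnimationFrame(frame); // the "internal ticker" calls us again
}

requestAnimationFrame(frame);
```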
PS: Interesting anecdote: Space Invaders, a very famous (and one of the earliest) game, did speed up as you got nearer to the end (fewer enemies on screen, no fixed loop, so the game responded faster). The developers kept it as part of the gameplay, making the game harder in the end stage.
As an ELI5: the computer game knows what speed things need to happen each second (e.g. how fast the car should look like it's moving, or how fast the footballer should run). It also knows that we humans like a higher "FPS", which is the number of frames of video it can generate each second. So on a slower computer you want the "things done per second" to look the same as on a faster computer, but the game will reduce other things to keep it that way (e.g. the FPS may drop from 90 to 20), or ask you to reduce things so it can work more simply (e.g. lower-quality graphics).
To add to the many comments here:
A single pass of the game loop is generally called a "tick". Back in the day, a tick was unregulated: it ran as fast as the computer could execute the code.
But game dev nerds are very smart and realized pretty quickly that computers would get faster and were not consistent across machines, so the tick then became regulated by the display refresh rate.
To make drawing on the screen easier, 1 tick was made equal to 1 monitor refresh. Monitors were mostly 24 or 30 Hz until about a decade ago, when 60, 90, 120, and 144 Hz became normal.
So most game engines now use multiple "tick" rates, which are based on time and set to speeds that make their job easier: the rendering tick uses the refresh rate, the network tick uses the optimal network rate, and the core tick, which regulates them all and handles input, generally runs as fast as it can. These also tend to run in their own thread, which allows better utilization of your cores through multithreading. However, that's hard to program, as threads don't like to talk to one another, so many engines are still largely single-threaded.
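A loose sketch of that split, using plain timers to stand in for what engines really do with threads (the tick functions and rates here are made up for illustration):

```
// Each subsystem gets its own tick rate. Real engines put these on
// separate threads; timers are used here just to show the idea.
function networkTick() { /* sync state with the server */ }
function renderTick() { /* draw a frame */ }
function pollInput() { /* read keyboard/controller state */ }

setInterval(networkTick, 1000 / 20); // network: 20 ticks per second
setInterval(renderTick, 1000 / 60);  // render: matched to a 60 Hz display

// Core tick: handle input as often as possible without starving the timers.
function coreLoop() {
  pollInput();
  setTimeout(coreLoop, 0); // yield, then run again as soon as possible
}
coreLoop();
```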
The game's logic knows how much time has passed since the last "frame":
```
function move_people(seconds_passed, people) {
  // Move each person by their speed scaled by the elapsed time,
  // so they cover the same distance per real second on any machine.
  for (const person of people) {
    person.move(person.speed * seconds_passed);
  }
}
```
If the game's logic runs faster, this function will be called more frequently; however, seconds_passed will be smaller for each run, so on each call the people move less than they would on a slower computer. As a whole, each person moves the same distance.
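A quick way to convince yourself, with made-up numbers: simulate one real second at two different update rates and compare the distance covered.

```
// One real second of movement at 5 units/second, simulated on a
// "fast" machine (100 updates) and a "slow" one (20 updates).
const speed = 5;

function distanceOverOneSecond(updatesPerSecond: number): number {
  const seconds_passed = 1 / updatesPerSecond; // per-update time slice
  let distance = 0;
  for (let i = 0; i < updatesPerSecond; i++) {
    distance += speed * seconds_passed;
  }
  return distance;
}

console.log(distanceOverOneSecond(100)); // fast machine: ~5 units
console.log(distanceOverOneSecond(20));  // slow machine: ~5 units
```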
Funny story: I had a floppy disk game from the '90s, sort of like RollerCoaster Tycoon, made to run on MS-DOS. Well, one time I got it to work on a computer running Windows XP, and holy cow, the game ran in super-fast turbo mode. Like, hysterically fast.
Anyways, every computer has a built-in clock circuit. Modern games are made to sync to that clock, so no matter the performance specs of the computer, the game should run at the same speed. A good test to confirm is to install a classic game, like Warcraft III or StarCraft, and see.
In general, the programmer tells it how fast or slow to run. I don't know too much about this, but my guess is that for video it's somewhere in the codec description.
But essentially everything is done through code. In code, you can tell the computer “generate a frame (of video) and send it to the video card to pass to the monitor, then wait for 10ms, then generate and send the next frame”, and it will do that.
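That quoted instruction, written out as code (naive on purpose; real renderers wait on the display's vsync rather than a fixed 10 ms, and the helper names here are invented):

```
// Generate a frame, hand it off, wait 10 ms, repeat: roughly 100 fps.
function generateAndSendFrame() { /* build a frame, pass it to the video card */ }

const sleep = (ms: number) => new Promise((resolve) => setTimeout(resolve, ms));

async function videoLoop() {
  while (true) {
    generateAndSendFrame();
    await sleep(10); // fixed pause between frames
  }
}

videoLoop();
```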
As for games, the idea is similar but slightly more complex. In addition to display-based tuning, you also have to do gameplay-based tuning. Let's say you're playing a game like Dark Souls, and the computer calculated how far you move based purely on its own internal speed. Then, if you held the joystick forward for 1 second, you could move hundreds of digital kilometers in the game. That's obviously not ideal. So in addition to making the screen update at a speed that humans can process (or that the screen itself can handle; I'm not going to get into refresh rate), the game itself has to have additional controls to make sure it plays smoothly.
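The fix for that runaway movement is to scale it by real elapsed time instead of by loop iterations. A toy contrast (the 3 units/second walk speed is invented):

```
const WALK_SPEED = 3; // units per real second

// Bad: moves a fixed amount per loop pass, so a faster CPU walks faster.
function badStep(position: number): number {
  return position + WALK_SPEED;
}

// Good: moves by speed * elapsed real seconds, the same on any CPU.
function goodStep(position: number, dtSeconds: number): number {
  return position + WALK_SPEED * dtSeconds;
}

// Holding the stick forward for one real second:
console.log(goodStep(0, 1.0)); // 3 units, no matter how fast the machine is
```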
Games run in ticks: small slices of time. In days of old, it'd be one tick per frame; e.g. old Bethesda games and some bad console ports will run sped up if you uncap the frame rate.
Nowadays most engines define their ticks in terms of real time, e.g. 40 ticks a second, 100, etc.
In the very early days, ticks were based on CPU clock cycles, but no longer.
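A toy demonstration of why tying logic to frames breaks when the frame rate is uncapped (the numbers are invented):

```
// Physics advances a fixed amount per frame, so more frames per
// second means the whole game literally runs faster.
const FALL_PER_FRAME = 0.1; // units of fall applied every frame

function depthAfterOneSecond(framesPerSecond: number): number {
  let depth = 0;
  for (let i = 0; i < framesPerSecond; i++) {
    depth += FALL_PER_FRAME;
  }
  return depth;
}

console.log(depthAfterOneSecond(60));  // capped at 60 fps: ~6 units
console.log(depthAfterOneSecond(240)); // uncapped at 240 fps: ~24 units
```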
What you're thinking about was a real issue in the early days of personal computers.
Games were clocked to the main processor, basically running as fast as they could.
When processor speeds later increased, older games suddenly became unplayable because they were now too fast.
So a "Turbo" button was introduced: a physical button on the PC case that would reduce the clock speed, usually by half, making older games (and some other programs) usable again.
Since then, programmers have learned to clock the speed of their games to real-time clocks, so they run at the same speed on all kinds of computers (if those computers are powerful enough, of course).