How can it be that films shot at 1000 fps are slow motion, but when you play a game and the fps varies from, let’s say, 60 to 250, it doesn’t get “slower” the closer you get to 250?

Because you would then assume that higher fps = slow motion / a slower image.

You are confusing recording speed and playback speed.

Normally, you want the recording speed and the playback speed to be the same. If you record a video at 30 FPS, then you also want to play it back at 30 FPS. This is what happens when you play a video game: if the game runs at 60 FPS, then the images are created at 60 FPS and displayed at 60 FPS.

If you play back at a slower frame rate than the one the video was recorded at, the video is shown in slow motion. For example, if I record a video at 60 FPS and then play it back at 30 FPS, it would take two seconds to play each second of the original video.

When you see a “1000 FPS” video, it means that the video was *shot* at 1000 FPS, but is now played back at 30 or 60 FPS, resulting in slow motion.
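To put numbers on it, here is a tiny back-of-the-envelope sketch (just an illustration; the frame rates are example values):

```python
def slowdown_factor(recording_fps: float, playback_fps: float) -> float:
    """How many seconds of playback one second of recorded real time becomes."""
    return recording_fps / playback_fps

# 1000 fps footage played back at 30 fps: roughly 33x slower than real time.
print(slowdown_factor(1000, 30))  # 33.33...

# 60 fps footage played back at 30 fps: 2x slower, as in the example above.
print(slowdown_factor(60, 30))    # 2.0
```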

You record the film at 1000 FPS but replay it at, say, 24 FPS. That makes it slow motion. In games, you have to scale motion and animation by how much real time has passed since the last frame, and you usually do that calculation in each iteration of the game loop. Otherwise varying FPS would indeed result in varying motion and animation speed.
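A rough sketch of what that looks like in a game loop (not from any particular engine; the names, speed, and structure are made up for illustration):

```python
import time

PLAYER_SPEED = 5.0  # world units per second, independent of the frame rate

def render(position: float) -> None:
    pass  # placeholder; a real game would draw the frame here

def game_loop() -> None:
    position = 0.0
    last_time = time.monotonic()
    while True:
        now = time.monotonic()
        dt = now - last_time  # real time elapsed since the previous frame
        last_time = now

        # Scale movement by the elapsed time, so the player covers the same
        # distance per real second whether the loop manages 60 or 250
        # iterations per second.
        position += PLAYER_SPEED * dt

        render(position)
```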

Slow-motion videos are recorded at something like 1000 fps and then played back at 24, 30, or 60 fps. As a result, the video looks slow. The difference between the recording and playback rates is what makes it slow motion. In the case of video games, there is no “recording”, so the game will only look like slow motion when it deliberately wants to show you slow motion.

If a video is recorded at 1000 fps, and you play it back at 1000 fps, it won’t be slow-motion.

However, that’s not what happens. The point of recording at such high framerates is that you can then play it back at something like 60 fps, and see a smooth video in which time progresses much more slowly than in reality. For instance, if you played it back at 50 fps, then one second in real recorded time would be stretched over 20 seconds of playback time. This is how you get the slow-motion effect: things that took 1 second in real life now take 20 seconds.

You need the high recording frame rates for the video to look smooth when it’s played back slowly. For instance, say you recorded at 60 fps and then played the video back at 3 fps. Again, this would mean that 1 second of recorded time would be stretched over 20 seconds of playback time. Only now, every second you play back only contains 3 frames, so it won’t look smooth at all. Instead, it will look like a fast slide show. And you don’t really get the benefit of slow motion letting you see fast movements slowed down, because anything that happened in between those 3 frames is either not captured or smeared out as motion blur in each frame.

When you play a game, the fps is simply how many frames are drawn every second. At 30 fps, the game world is rendered 30 times per second. At 200 fps, the game world is rendered 200 times per second. But the rate at which time progresses in the game world isn’t changed. It’s just a matter of how many “glimpses” you get of the game world in each second. There’s no recording and playback. Or, if you prefer, you can think of it as recording the game world and playing it back at the exact same time, with the same frame rate for recording and playback.
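A toy illustration of that last point (no real engine involved; the speed and frame rates are arbitrary example values):

```python
def distance_after_one_second(render_fps: int) -> float:
    """Advance a toy game world for one second of real time, updating once per
    rendered frame, and return how far an object moving at 10 units/s travels."""
    speed = 10.0                   # world units per second
    frame_time = 1.0 / render_fps  # real time represented by each frame
    position = 0.0
    for _ in range(render_fps):    # exactly one second's worth of frames
        position += speed * frame_time
    return position

# More fps just means more "glimpses" of the game world; one real second
# still moves the object the same distance.
print(distance_after_one_second(60))   # 10.0 (give or take float rounding)
print(distance_after_one_second(250))  # 10.0
```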

Not sure, but every movement is adapted to the frame rate, and that is what gives you a smoother experience.

The progress of a movement from image to image is smaller at higher frame rates, yet the whole movement takes place in the same amount of time.
In slow motion, footage captured at a high frame rate is played back in real time at a standard frame rate, and that is why it seems slower.

If this makes any sense to you