Persistence of vision. How does it work?


I see a lot that 24 is the standard frame rate for movies and the like, but surely 12 nearly identical frames over a second would trigger PoV a lot better than 24 drastically different images. On that note, how can anyone determine that 24 should be the standard? Surely action films, panoramas, and similar video would need a higher frame rate to look smooth, and slower films could get away with a lower one?

To relate this to games: games like the newer 2D Mario games stop looking choppy for me around 50 fps, whereas first-person games still look choppy up to ~100.


Well, that’s not what happens in action movies. If something moves fast in front of a camera, it’s captured on the film (or digital frame) as a blur. The eye knows about blur and reads it as motion.
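To put rough numbers on that (these figures are my own illustration, not anything from the answer above): a film camera with a typical 180-degree shutter exposes each 24 fps frame for about 1/48 s, so anything moving fast smears across the frame during that exposure. A quick Python sketch:

```python
# Back-of-the-envelope sketch (illustrative numbers, assumed 180-degree shutter):
# each frame is exposed for (shutter_angle / 360) of the frame interval,
# so a moving object smears across the frame during that exposure.

def blur_length_px(speed_px_per_s: float, fps: float, shutter_angle_deg: float = 180.0) -> float:
    """Length of the motion-blur streak, in pixels, within one exposure."""
    exposure_s = (shutter_angle_deg / 360.0) / fps  # e.g. 0.5 / 24 = 1/48 s
    return speed_px_per_s * exposure_s

# An object crossing a 1920 px frame in one second, filmed at 24 fps:
print(blur_length_px(1920, 24))  # -> 40.0 px of smear per frame
```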

The sensory cells in the eye are sensitive to flicker, and that is why your 12 fps constant scene wouldn’t read as persistent. It would just look like flickering, and that’s unsettling/annoying to humans.

Edison did a lot of flicker experiments, with different frame rates, before picking 24. Since then it has stuck as “good enough”. Experiments with higher frame rates (I’m looking at you, Hobbit) have not shown that most people perceive it as “better”.

Games and PC CGI are a completely different matter. The moving contents of frames aren’t motion-blurred, which causes “stepping”. That’s also not something the eye likes. The high frame rates are needed so the steps blur together inside the eye.
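As a hypothetical illustration of that “stepping” (my own numbers, not from the answer): a game draws each frame perfectly sharp, so an object simply jumps from one position to the next, and the jump only shrinks as the frame rate climbs:

```python
# Hypothetical sketch: without motion blur, an object on screen jumps by
# (speed / fps) pixels between consecutive, perfectly sharp frames.
# Higher fps just makes each jump small enough for the eye to blend.

def step_size_px(speed_px_per_s: float, fps: float) -> float:
    """Distance a sharp, unblurred object jumps between consecutive frames."""
    return speed_px_per_s / fps

for fps in (24, 60, 120):
    print(f"{fps:3d} fps -> {step_size_px(960, fps):5.1f} px jump per frame")
# 24 fps  -> 40.0 px (clearly visible steps)
# 60 fps  -> 16.0 px
# 120 fps ->  8.0 px (small enough to read as smooth motion)
```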

12FPS would trigger POV a lot *worse* than 24, not better. Our brains only hold the images for a small period of time; you need to flash another image during that brief window for the persistence effect to manifest.

For film, the reason 24 fps became popular was that sound became a thing in movies. To keep spoken dialogue matched to the people seen speaking, at least 18 fps was needed, and 24 was chosen as a sort of safety buffer. Earlier films ran at lower frame rates.

All this is because, back then, film stock cost a lot of money. Any way to save it added up to big savings, so you wanted to shoot at the lowest fps you could get away with.

You can tell the limits of fps by tracking a moving object on screen. So yeah, without movement you can track, the difference between low and high fps fades. Films also use motion blur to hide their choppiness: if the screen becomes a total mess whenever something moves fast, it’s hard to tell what the fps even is.
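To sketch why that cheat works (assumed numbers, same 180-degree shutter as above): the blur streak inside each frame covers a good chunk of the gap to the next frame’s position, so tracked motion reads as a continuous smear rather than discrete jumps:

```python
# Illustrative sketch: compare the jump between consecutive frame positions
# with the blur streak inside a single exposure (assumed 180-degree shutter).

def gap_and_smear(speed_px_per_s: float, fps: float, shutter_angle_deg: float = 180.0):
    gap = speed_px_per_s / fps                                  # jump between frames
    smear = speed_px_per_s * (shutter_angle_deg / 360.0) / fps  # streak within one frame
    return gap, smear

gap, smear = gap_and_smear(960, 24)
print(f"jump per frame: {gap:.0f} px, blur streak: {smear:.0f} px")
# -> jump per frame: 40 px, blur streak: 20 px
# Each frame's streak bridges half the gap to the next position, so the eye
# sees a chain of smears instead of a sharp object teleporting 40 px at a time.
```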

For games, though, it’s distracting and unhelpful if the screen turns into a mess. You’d much rather actually be able to track things moving around your screen, so the cheat films use doesn’t bail games out nearly as well.