There is no clear line, and it varies from person to person – it even varies within a single person's field of vision.
Our vision is continuous, more like an analog signal than a digital one, interpreted by our fancy brains. At some point the individual images "overlap" into one another enough to look like continuous motion; that's roughly where cinema has settled (traditionally 24 fps), and we got so used to it that a movie in 60 fps looks weird.
Going higher and higher, the differences get smaller and smaller, but they don't disappear. As a viewer, the higher you go, the more clearly you can follow fast and large motion. As an extreme case: a bird that appears only partially on screen for a single frame at 30 fps could cross the screen smoothly over many frames at a much higher frame rate.
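To put rough numbers on that bird example, here is a quick sketch of how big the jump between consecutive frames is at different frame rates. The screen width and crossing time are assumptions picked just for illustration.

```python
# Rough illustration with assumed numbers: how far a fast-moving object
# jumps between consecutive frames at different frame rates.

screen_width_px = 1920      # assumed horizontal resolution
crossing_time_s = 0.25      # assume the bird crosses the screen in a quarter second

for fps in (30, 60, 120, 240):
    frames_visible = crossing_time_s * fps          # frames during the crossing
    px_per_frame = screen_width_px / frames_visible # jump size between frames
    print(f"{fps:>3} fps: ~{frames_visible:.0f} frames on screen, "
          f"~{px_per_frame:.0f} px per step")
```

At 30 fps the bird teleports roughly 256 px per frame and is only on screen for about 8 frames; at 240 fps it moves in ~32 px steps over ~60 frames, which reads as smooth motion.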
Since you mentioned monitors, system latency, and how consistent it is, plays a role too. The reaction time of an average person sits somewhere around 150-250 ms, but that has little bearing on the differences we can perceive in things that change continuously. Differences down to a single millisecond, and even below that, can register intuitively – that's (partially) how our ears work out where a sound is coming from.
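As a sanity check on that last point, here is a back-of-the-envelope calculation of the largest timing difference between the two ears, which sound localisation partly relies on. The ear spacing is an assumed round number.

```python
# Back-of-the-envelope: maximum interaural time difference (ITD), i.e. how much
# later a sound from directly off to one side reaches the far ear.

speed_of_sound_m_per_s = 343   # in air at roughly room temperature
ear_spacing_m = 0.20           # assumed ~20 cm between the ears

max_itd_ms = ear_spacing_m / speed_of_sound_m_per_s * 1000
print(f"Maximum interaural time difference: ~{max_itd_ms:.2f} ms")   # ~0.58 ms
```

So the brain routinely works with timing cues well under a millisecond, even though our conscious reaction time is hundreds of times slower.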