I haven’t seen this posted yet – it’s the refresh rate. It has nothing to do with the number of pixels; it just happens that most 4K TV sets refresh at a higher rate than 48 Hz.
The magic number is around 50-60 frames per second. Below that threshold, your eyes think they are looking at a still image, so your brain doesn’t update the image as often, which means it doesn’t notice all the details.
Above that rate, your eyes start to notice that the image is changing quickly. Due to the physics of how your eyes work, the cone structure in your retina waves back and forth at around 60 Hz. Each cone has several light-detecting sensors, or pixels, that report when they see a change. Each wave is effectively one scan, so each cone sends signals to your brain at that rate. If a cone sees a difference on its next pass, it sends a signal. If not, nothing happens; the cone stays quiet and waits.
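A toy way to picture that "scan and report only on change" idea (purely an illustrative sketch of the description above, not a model of real retinal physiology – the `Cone` class and the scan-per-wave framing are assumptions):

```python
# Toy sketch of the "scan and report only on change" idea described above.
# Illustrative only; not real physiology.

class Cone:
    """Samples a light value on every 'wave' and fires only when it changes."""

    def __init__(self):
        self.last_sample = None

    def scan(self, light_value):
        changed = light_value != self.last_sample
        self.last_sample = light_value
        return changed  # True = "send a signal", False = "stay quiet"


cone = Cone()
print(cone.scan(0.8))  # True  (first sample is always new)
print(cone.scan(0.8))  # False (same image, nothing is reported)
print(cone.scan(0.3))  # True  (the image changed, so the cone fires)
```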
In a 24 fps movie, the frame has not changed by the time your eyes make a second “pass.” The cones don’t see any difference; it’s the exact same image, and nothing has moved or changed. So those details don’t get highlighted. Ironically, this means your brain ignores the problems, so you don’t notice the cheap set, or the makeup, or the fake clothes.
At a higher refresh rate, the cones see something different on every “wave,” which gives you effectively higher resolution. More cones are activated, and your brain can pull more data out of the image. Now suddenly the cheap fabric looks gaudy and tacky.
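To make the 24 fps vs. higher-frame-rate comparison concrete, here is a rough back-of-the-envelope sketch. It takes the ~60 Hz "scan" rate claimed above as a given and simply counts how many scans in one second land on a frame that differs from the previous scan:

```python
# Rough sketch: at a 60 Hz "scan" rate, how many scans per second see a frame
# that is different from the one seen on the previous scan?

def changed_scans_per_second(content_fps, scan_hz=60):
    changed = 0
    last_frame = -1
    for scan in range(scan_hz):
        t = scan / scan_hz            # time of this scan, in seconds
        frame = int(t * content_fps)  # which source frame is on screen
        if frame != last_frame:
            changed += 1
        last_frame = frame
    return changed

print(changed_scans_per_second(24))  # 24 -> only 24 of 60 scans see something new
print(changed_scans_per_second(60))  # 60 -> every scan sees a new frame
```

With 24 fps content, more than half of those hypothetical scans land on an unchanged frame and report nothing; with 60 fps content, every scan has something new to report.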
You have thousands upon thousands of cones doing this all the time. Your brain (as well as your eyeball itself) is using stochastic resonance to build the image: by taking many noisy copies of a signal and layering them atop one another, the noise averages out and you are left with the signal.
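What that paragraph describes – stacking many noisy copies of the same signal so the noise averages out – can be sketched in a few lines. This is a toy averaging example, not a claim about how neurons actually implement it:

```python
# Toy sketch of "layer many noisy copies and the noise averages out."
# Illustrative only; not a model of actual retinal or cortical processing.
import numpy as np

rng = np.random.default_rng(0)

t = np.linspace(0, 1, 500)
clean_signal = np.sin(2 * np.pi * 5 * t)  # the "real" detail in the scene

# Thousands of noisy copies, like thousands of cones reporting the same scene.
noisy_copies = clean_signal + rng.normal(0, 1.0, size=(5000, t.size))

recovered = noisy_copies.mean(axis=0)     # layer them atop one another

# The average is far closer to the clean signal than any single noisy copy.
print("error of one noisy copy:", np.abs(noisy_copies[0] - clean_signal).mean())
print("error of the average:  ", np.abs(recovered - clean_signal).mean())
```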
Your brain does that. It takes thousands of inputs, decides what to ignore and what’s real, then passes the completed, processed signal to the next part of your brain. It’s just a matter of chance and biology that some footage falls below this threshold and some does not. Below the threshold, your brain fills in the details and shows you what it thinks the image should be. Above it, your eyes catch enough detail that your brain handles the image differently.
Source: an Ars Technica article I read a few years ago and haven’t been able to find since