How do HD photos and videos seem to look better than our own naked eyes?



HD photos and videos seem to look better than the pictures formed by our own eyes. Why is that? I didn’t know which flair would fit so I chose biology.

In: Technology

The eye is really good at some things but not so good at others.

Meanwhile, cameras can be good at what our eyes are bad at, and that’s what really stands out.

For instance, really high dynamic range: seeing pitch black right next to bright white.

Our eyes struggle with this because of the way light scatters and reflects inside the eye. Plus, our eyes are constantly adjusting to whatever we’re looking at.

A camera can capture the highs and lows all at once without those adjustments. Then, when we see it as a reproduction instead of actual light, our eyes are able to see it in that static state.

It’s not that it’s better; it’s just different from the way we normally see, and that makes it pleasant.

The number of colors available in a photo or video is smaller than what your eyes can see. Because of this, choices are made about which colors to reproduce so that the image looks its best on a device or fits the meaning being communicated.
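To put a rough number on “smaller than what your eyes can see”: most consumer photos store 8 bits per color channel. A quick back-of-envelope calculation (the bit depths here are standard, but the comparison is only illustrative since eyes don’t work in discrete levels):

```python
# How many distinct colors an 8-bit-per-channel RGB image can encode.
levels_per_channel = 2 ** 8            # 256 levels each for R, G, and B
total_colors = levels_per_channel ** 3
print(total_colors)                    # 16,777,216 distinct RGB triplets

# A 10-bit-per-channel "deep color" format, for comparison:
deep_colors = (2 ** 10) ** 3
print(deep_colors)                     # 1,073,741,824
```

Sixteen million colors sounds like a lot, but it’s why you sometimes see visible “banding” in smooth gradients like skies: the steps between adjacent stored colors are coarser than what your eye can distinguish.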

Genuine question: have you had your eyes tested recently? When I first got glasses it literally felt like the difference between standard definition and HD screens. You often don’t really know what you’re missing.

The sun is pretty damn bright. The difference between the brightest sunlit and darkest shadowed areas is far greater than what our eyes or cameras can capture at once, and greater still than what we can print. Our eyes and our video cameras deal with it by continuously adjusting the “size” of the iris, or rather the pupil/aperture hole at its center.
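Photographers measure this brightness range in “stops,” where each stop is a doubling of light. A tiny sketch of that math, using made-up but order-of-magnitude-plausible luminance values:

```python
import math

# Illustrative luminance values in cd/m^2 (assumed, order of magnitude only).
sunlit_surface = 10_000.0   # a bright sunlit wall
deep_shadow = 10.0          # a dark shadowed corner in the same scene

# Each "stop" doubles the light, so the range in stops is a base-2 logarithm.
stops = math.log2(sunlit_surface / deep_shadow)
print(f"{stops:.1f} stops")   # about 10 stops between these two areas
```

For reference, a single camera exposure typically covers somewhere around 10-14 stops, while a real outdoor scene with direct sun and deep shade can exceed that, which is why the camera has to choose what to sacrifice.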

This usually means we can see clearly in bright areas but not in shadows, or clearly indoors and in shade while the bright areas around us are blinding. This is easy to notice with cameras that have poor auto-exposure: when you pan to a brighter or darker area, the image stays washed out or muddy until the exposure readjusts.

A high dynamic range (HDR) photo is altered so that overall contrast is reduced (the brightest and dimmest areas are brought closer together) while strong contrast is preserved at smaller scales. Everything in the image is clearly visible, with nothing crushed to black or blown out to white. This is typically done by shooting the scene several times, at least once exposed for the bright areas and once for the dark areas, then using image editing software to blend the results.
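The blending step can be sketched in a few lines. This is a toy version of the “exposure fusion” idea (real HDR software uses calibrated camera response curves and proper tone mapping, and these pixel values and weights are just assumptions for illustration): each pixel is weighted by how well-exposed it is, i.e. how close it sits to mid-gray.

```python
import math

def well_exposedness(v, mid=0.5, sigma=0.2):
    """Weight that peaks at mid-gray and falls off toward pure black/white."""
    return math.exp(-((v - mid) ** 2) / (2 * sigma ** 2))

def fuse(under, over):
    """Blend an underexposed and an overexposed frame, pixel by pixel,
    favoring whichever frame is better exposed at each pixel."""
    fused = []
    for u, o in zip(under, over):
        wu, wo = well_exposedness(u), well_exposedness(o)
        fused.append((wu * u + wo * o) / (wu + wo))
    return fused

# Grayscale pixel values in [0, 1] for the same three scene points:
under = [0.02, 0.10, 0.45]   # dark frame: shadows crushed, highlights preserved
over  = [0.30, 0.60, 0.98]   # bright frame: shadows visible, highlights clipped
print(fuse(under, over))
```

In the crushed shadow (first pixel), the bright frame dominates the blend; in the clipped highlight (last pixel), the dark frame does. That’s the whole trick: take detail from whichever exposure has it.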

Don’t get this confused with high-definition (HD) video, which is just either of two resolutions (720p and 1080p) larger than standard definition but smaller than 4K. HD describes resolution only; it says nothing about dynamic range.

Some super-expensive video cameras (the kind studios can only afford to *rent*) can capture a wider contrast range than our monitors can display, so video editors can make the same kind of HDR adjustments that photographers do.