Why do cameras need to change brightness so drastically to make clear photos, but our eyes don’t?


So for example, if you try to capture a bright area with a camera, the surrounding area can become too dark because the camera “dilates” (really, stops down) so the bright area isn’t overexposed, but our eyes can look at the same scene just fine.

Our eyes dilate too, but not nearly as much as a camera’s aperture does.


6 Answers

Anonymous

There are 2 parts to how you see (brain and eyes) and 2 parts to how the iPhone sees (hardware and software).

The hardware opens and closes a hole (this is true for both: the pupil, or the aperture) that physically lets in more or less light.

The software goes, hey, this is low light, let’s switch to dark mode (rods in your eye, and some analogous circuitry in a camera) and open the aperture, but also let’s tell the processing layer (your brain, or the camera’s algorithms; it’s essentially the same thing, since your brain has computational ability) to fill in the blanks of the grainy picture and say, hey, that’s a door, even though the edges aren’t there and it’s pixelated.
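(A rough sketch of what that software side might do, in generic terms. This is not Apple’s actual code; the 18% grey target is a common metering convention, and the 0.5 step size is a made-up damping factor for illustration.)

```python
import numpy as np

def auto_expose(frame, exposure, target=0.18, gain=0.5):
    """One step of a generic auto-exposure loop.

    frame: 2-D array of pixel brightness in [0, 1]
    exposure: current exposure multiplier
    Nudges exposure so the average scene brightness moves
    toward `target` (18% grey; `gain` is a made-up damping factor).
    """
    measured = frame.mean()
    # Move the exposure proportionally toward the target brightness.
    error = target - measured
    return exposure * (1.0 + gain * error)

# Example: a mostly-bright frame drives the exposure down,
# which is exactly why the darker surroundings get crushed.
bright_frame = np.full((100, 100), 0.9)
print(auto_expose(bright_frame, exposure=1.0))  # prints 0.64, i.e. less exposure
```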

If the light is too bright, your light sensors (cones in people, and essentially the same components in cameras) bleach.

And all the calculations that let you see contrast and make adjustments are overloaded and can’t work.
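(You can see the digital version of that overload with a toy example. The numbers below are hypothetical; they just illustrate what clipping does to a sensor reading.)

```python
import numpy as np

# A hypothetical scene: a dim doorway (values around 30) next to
# a very bright window (true brightness around 800, far beyond
# what an 8-bit sensor can record).
scene = np.array([30, 35, 40, 800, 820, 790])

# An 8-bit sensor clips everything above 255.
recorded = np.clip(scene, 0, 255)
print(recorded)  # [ 30  35  40 255 255 255]

# The three window pixels all read 255: their real differences
# are gone, so no amount of processing can recover that contrast.
```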

If you get enough of that, you permanently damage the physical sensor (retina, film, chip), as with snow blindness.

The iPhone camera is a bit worse than your eye-and-brain system at dealing with variations in light. So if there is a bright spot, like a super bright reflection, it may adjust to the brightest (or darkest) thing less gracefully than your eyes do.
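(One way to put rough numbers on “variations in light”: photographers count dynamic range in “stops,” where each stop is a doubling of light. The ratio below is a ballpark figure for a harsh scene, not a measurement of any particular camera.)

```python
import math

def stops(brightest, darkest):
    # Each "stop" is a factor-of-2 difference in light.
    return math.log2(brightest / darkest)

# A sunlit window next to a shadowed interior can span
# a brightness ratio on the order of 100,000:1.
print(round(stops(100_000, 1), 1))  # 16.6 stops, about 17
```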

But there is a limit to what your eyes can do also.

Reflections of headlights are small but still mess up your vision. People who work with lasers deal with light that is small in area but very powerful, so they wear eye protection.
