There are 2 parts to you seeing (eyes and brain) and 2 parts to the iPhone seeing (hardware and software).
The hardware opens and closes a hole (this is true for both: the pupil in your eye, the aperture in the camera) that physically lets in more or less light.
The software goes, hey, this is low light, let's switch to dark mode (rods in your eye, higher sensitivity in the camera) and open the aperture. But let's also tell the processing side (brain or algorithm, it is essentially the same thing, since your brain has computational ability) to fill in the blanks of the grainy picture and say, hey, that is a door, even though the edges aren't there and it is pixelated.
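If it helps, here is that decision written out as a toy bit of Python. It is not Apple's actual pipeline; the brightness threshold, the numbers, and names like `capture` and `fill_in_the_blanks` are all made up just to show the idea of "open up, crank sensitivity, then smooth over the grain":

```python
# Toy sketch of the "dark mode" decision an eye/camera makes.
# All numbers and function names are invented for illustration only.

def capture(scene_brightness: float) -> dict:
    """Decide how to capture a scene given its brightness (0 = black, 1 = blinding)."""
    settings = {"aperture_open": 0.3, "gain": 1.0, "denoise": False}

    if scene_brightness < 0.2:           # "hey, this is low light"
        settings["aperture_open"] = 1.0  # physically let in more light (pupil / aperture)
        settings["gain"] = 8.0           # crank up sensitivity (rods / sensor gain) -> grainy
        settings["denoise"] = True       # let the "brain" fill in the blanks afterwards

    return settings


def fill_in_the_blanks(pixels: list[float], denoise: bool) -> list[float]:
    """Crude stand-in for the brain/software guessing what the grainy picture shows:
    average each pixel with its neighbours to smooth over missing detail."""
    if not denoise:
        return pixels
    smoothed = []
    for i, p in enumerate(pixels):
        left = pixels[max(i - 1, 0)]
        right = pixels[min(i + 1, len(pixels) - 1)]
        smoothed.append((left + p + right) / 3)
    return smoothed


if __name__ == "__main__":
    settings = capture(scene_brightness=0.1)   # a dim room
    noisy_row = [0.1, 0.9, 0.1, 0.8, 0.1]      # a grainy row of pixels
    print(settings)
    print(fill_in_the_blanks(noisy_row, settings["denoise"]))
```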
If there is too much light, your light sensors (cones in people, and basically the same thing in cameras) bleach.
And all the calculations that let you see contrast and make adjustments are overloaded and can’t work.
If you get enough of that, you permanently damage the physical sensor (retina, film, chip), as with snow blindness.
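In camera terms, that "bleaching" shows up as clipping: once a pixel hits the sensor's maximum value, everything brighter records as the same number, so the contrast math has nothing left to work with. A made-up sketch of the idea, not how any real sensor is coded:

```python
# Toy model of sensor overload: anything above the sensor's maximum all clips
# to the same reading, so contrast between bright things is lost.
SENSOR_MAX = 255  # hypothetical 8-bit sensor

def read_pixel(true_light: int) -> int:
    """Whatever the real light level is, the sensor can only report up to SENSOR_MAX."""
    return min(true_light, SENSOR_MAX)

# Three genuinely different brightness levels...
bright_scene = [300, 900, 5000]
# ...all come out identical once the sensor saturates: no contrast survives.
print([read_pixel(v) for v in bright_scene])  # [255, 255, 255]
```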
The iPhone camera is a bit worse than your brain at dealing with variations in light. So if there is a bright spot, like a super bright reflection, it may adjust to the brightest (or darkest) thing in the scene less well than your eyes do.
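One way to picture why a tiny bright spot throws the camera off: if the exposure for the whole frame is set from its brightest pixel, one blinding reflection drags the setting down and everything else goes dark. This is a made-up metering rule for illustration, not Apple's actual auto-exposure:

```python
# Toy auto-exposure: pick one exposure scale for the whole frame based on its
# brightest pixel. One tiny, super-bright reflection drags everything else down.
def auto_expose(frame: list[float], target_peak: float = 1.0) -> list[float]:
    scale = target_peak / max(frame)   # expose so the brightest pixel isn't blown out
    return [p * scale for p in frame]

normal_frame = [0.4, 0.5, 0.6, 0.5]    # evenly lit scene
glare_frame  = [0.4, 0.5, 0.6, 10.0]   # same scene plus a headlight reflection

print(auto_expose(normal_frame))       # scene stays bright and readable
print(auto_expose(glare_frame))        # everything except the glare turns nearly black
```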
But there is a limit to what your eyes can do also.
Reflections of headlights are small and still mess up your vision. And for people who work with lasers, the amount of light is small but powerful, so they wear eye protection.