Eli5: why can’t phone cameras catch the colors of sunrise and sunset?



I’ve noticed that whenever I try to take photos, especially of the sunrise, the colors come out really different from what I’m seeing (pinks and purples turn into orange and green?). It’s not just my current phone either; I feel like no phone actually captures things like this!

It obviously happens with a lot of other things, so I was wondering if there was an explanation!

In: Technology

I believe the camera’s dynamic range isn’t wide enough to capture the differences in light and color with the same detail your eye sees.

This produces a very blown-out image compared to the detail you see when looking at the scene yourself. In some cases you can meter on a darker part of the landscape before taking the photo, allowing the camera to capture just the light. 🤓

Phones calculate exposure to make the whole frame uniformly well lit. With a bright sky in frame, that significantly overexposes the lit areas and shifts colors once some of the color-filtered pixels saturate.

Check out manual exposure control for your phone. You might be able to tap the screen and say “use this point for exposure control”, allowing the dark regions to go dark.
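The color shift from saturation can be sketched with some made-up numbers. This is just an illustration (the values are hypothetical, not from any real sensor): once one channel hits the sensor’s clip level, the ratios between channels change and the hue moves away from what you saw.

```python
import numpy as np

# Hypothetical linear-light RGB for a pink patch of sky, expressed
# relative to the sensor's saturation point (1.0). Red is well over
# the clip level because the sky is overexposed.
true_rgb = np.array([1.6, 0.7, 0.9])

# Each channel clips independently at the sensor's maximum, so the
# recorded value can't exceed 1.0 no matter how bright the light was.
captured = np.clip(true_rgb, 0.0, 1.0)

print(captured)  # [1.  0.7 0.9]
# The true R:B ratio was 1.6:0.9 (strongly pink); after clipping it's
# 1.0:0.9, so the pink collapses toward a washed-out near-white.
```

The same thing happens per color channel on a real sensor, which is why an overexposed pink or purple sky often comes back as pale orange or white.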

It’s automatic color balancing; you need to turn that off if you want accurate colors.

The longer version: many camera apps, especially on cell phones, look at the picture as a whole and check whether there’s an overall tint (like everything being tinted red because there’s a red light in the room). They then subtract the guessed tint to make it look like the light was white all along. That helps get orange and red casts out of indoor pictures taken under off-white lights. The problem is that when you take a picture of a genuinely tinted scene, like a red sun, the camera subtracts out the color and screws everything up.
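One common version of this idea is the “gray-world” algorithm: assume the average color of the scene should be neutral gray, and scale each channel until it is. Here’s a minimal sketch (not how any specific phone actually implements it) showing how that assumption wrecks a scene that really is red:

```python
import numpy as np

def gray_world(img):
    # Gray-world white balance: assume the scene's average color
    # should be neutral, and scale each channel's gain so the
    # per-channel averages all match.
    means = img.reshape(-1, 3).mean(axis=0)
    gains = means.mean() / means
    return np.clip(img * gains, 0.0, 1.0)

# A frame that genuinely is mostly red/orange, like a sunset.
sunset = np.full((4, 4, 3), [0.9, 0.4, 0.3])

balanced = gray_world(sunset)
print(balanced[0, 0])  # ~[0.533 0.533 0.533] -- the red cast is gone
```

Because the whole frame is tinted, the algorithm can’t tell a red light source from a red scene, so it “corrects” the sunset into gray. That’s exactly the failure mode described above.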