TL;DR: most cameras and screens aren’t designed to capture every color we can see. RGB color is an elaborate illusion, and it doesn’t always work well.
On colors specifically: digital cameras and screens define every color as a mix of some amount of red, green, and blue light. The exact shade of “red” or “blue” most screens use dates back to CRT TVs – the reddest red the phosphors of the day could produce was as red as it could get. This choice of “what counts as 100% red, 100% green, and 100% blue” is called a colorspace, and screens and cameras can only represent the colors inside their colorspace.
Some real-life “reds” are redder than CRT-phosphor red, so we can’t photograph them and then show them faithfully on a screen. Other pure colors lie slightly outside the red–green–blue triangle (the gamut), and no combination of RGB light can reproduce them.
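You can see this with a little arithmetic. The sketch below converts a color’s CIE xy chromaticity into linear sRGB using the standard XYZ→linear-sRGB (D65) matrix; the chromaticity value for monochromatic ~520 nm green is an approximation picked for illustration. A color outside the triangle comes out with negative RGB components – a physical impossibility for a display, since a pixel can’t emit negative light:

```python
# Standard XYZ -> linear sRGB (D65) conversion matrix
M = [
    [ 3.2406, -1.5372, -0.4986],
    [-0.9689,  1.8758,  0.0415],
    [ 0.0557, -0.2040,  1.0570],
]

def xy_to_linear_srgb(x, y):
    # Recover XYZ from the chromaticity, fixing luminance Y = 1
    X, Y, Z = x / y, 1.0, (1 - x - y) / y
    return [row[0] * X + row[1] * Y + row[2] * Z for row in M]

# Approximate chromaticity of a pure ~520 nm (laser-pointer) green
r, g, b = xy_to_linear_srgb(0.074, 0.833)
print(r, g, b)  # r and b come out negative: no mix of sRGB light makes this color
```

Real imaging pipelines handle such colors by clipping or remapping them into the gamut, which is why a laser-green scene looks duller in a photo than in person.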
HDR cameras and screens use more modern, wider colorspaces such as Rec. 2020 (in addition to distinguishing more brightness levels) – so a sunset looks better in HDR.
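How much wider is a modern colorspace? The primaries are published chromaticity coordinates, so a simple point-in-triangle test shows that every Rec. 2020 primary sits outside the classic sRGB/Rec. 709 triangle – a quick sketch, with the primary coordinates taken from the two standards:

```python
def side(p, a, b):
    # Cross-product sign: which side of edge a->b the point p lies on
    return (p[0] - b[0]) * (a[1] - b[1]) - (a[0] - b[0]) * (p[1] - b[1])

def in_triangle(p, tri):
    a, b, c = tri
    d1, d2, d3 = side(p, a, b), side(p, b, c), side(p, c, a)
    # Inside (or on an edge) iff all signs agree
    return not ((d1 < 0 or d2 < 0 or d3 < 0) and (d1 > 0 or d2 > 0 or d3 > 0))

# xy chromaticities of the red, green, blue primaries
SRGB = [(0.64, 0.33), (0.30, 0.60), (0.15, 0.06)]       # sRGB / Rec. 709
REC2020 = {"red": (0.708, 0.292), "green": (0.170, 0.797), "blue": (0.131, 0.046)}

for name, p in REC2020.items():
    print(name, "inside sRGB gamut:", in_triangle(p, SRGB))  # False for all three
```

Since all three Rec. 2020 primaries fall outside the old triangle, an HDR screen can show saturated reds and greens that simply don’t exist in the sRGB world.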
Another subtle thing: people have individual differences in color perception, so the same mix of red and green light may look like one shade of yellow to me and a slightly different yellow to you. If I tune a camera to produce colors that look realistic to me, you might not agree at all.