Everything looks different to a camera than to an eye. Perfect color reproduction doesn't really exist in photography. The Bayer mask in your camera doesn't filter the spectrum the same way the cones in your eyes do, and the display you're looking at doesn't even emit the same frequencies the object in the picture was illuminated by.
And the thing you were looking at most likely reflects light, while the screen emits light; of course they look different.
Your eye catches a signal, which is encoded and sent to your brain, which interprets it. That interpretation may not be identical between two brains, or between your brain and a device like a camera. (Where, of course, you still have to use your eye and brain to see the result.)
The signal is the same, but the algorithm that turns it into something legible like a picture, whether in your brain or in the camera, is not the same.
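You can sketch that filter mismatch with a toy model. The sensitivity curves below are made-up Gaussians, not measured cone or Bayer-filter data, and the peak/width numbers are pure assumptions; the point is only that two lights tuned to look identical through one set of filters need not look identical through another:

```python
import math

# Hypothetical Gaussian sensitivity curves (peak wavelength in nm, width in nm).
# Real cone and Bayer-filter responses are broader and asymmetric; these
# numbers are illustrative assumptions, not measurements.
def gaussian(wl, peak, width):
    return math.exp(-((wl - peak) / width) ** 2)

def response(spectrum, peak, width):
    """Sum a spectrum (wavelength -> power) weighted by one sensitivity curve."""
    return sum(power * gaussian(wl, peak, width) for wl, power in spectrum.items())

EYE_RED    = (565, 50)   # assumed peak/width for the eye's long ("red") cone
CAMERA_RED = (600, 35)   # assumed peak/width for a camera's red Bayer filter

# A narrowband red LED vs. a broadband reddish surface.
led       = {630: 1.0}
broadband = {wl: 0.1 for wl in range(550, 700, 10)}

# Scale the broadband light so the eye's red cone sees both equally
# (a crude metameric pair): to the eye, they are the same red.
scale     = response(led, *EYE_RED) / response(broadband, *EYE_RED)
broadband = {wl: p * scale for wl, p in broadband.items()}

print(response(led, *EYE_RED), response(broadband, *EYE_RED))        # equal
print(response(led, *CAMERA_RED), response(broadband, *CAMERA_RED))  # not equal
```

With these made-up curves the camera's differently shaped red filter weighs the narrowband LED several times more heavily than the broadband light, even though the eye model sees them as identical.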
This is a noticeable problem with color-mixing LED lights as well. If the LED's wavelength lines up with the peak of the camera's red filter, the camera records that red as a much stronger source than it appears to our eyes. Anyone trying to take phone video of a rock concert has seen it: any blue light on the stage is completely blown out in the video. I imagine it's a similar thing with gemstones.
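The blow-out itself is just clipping. Here's a minimal sketch, with assumed brightness numbers: the phone's auto-exposure targets the scene's average, but a narrowband LED dumps far more energy into one channel than that exposure can represent, so the channel pins at its maximum:

```python
def to_8bit(linear, exposure):
    """Quantize a linear channel response to 8 bits, clipping at full scale."""
    return min(255, round(linear * exposure * 255))

scene_avg = 0.3              # assumed average broadband scene brightness
blue_led  = 2.5              # assumed response of the blue channel to a stage LED
exposure  = 0.5 / scene_avg  # auto-exposure aiming the average at mid-grey

print(to_8bit(scene_avg, exposure))  # lands near mid-grey
print(to_8bit(blue_led, exposure))   # → 255: the channel is clipped, detail is gone
```

Once a channel hits 255, every brighter value maps to the same number, which is why the blue stage lights come out as featureless blobs on the video.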