How does additive color mixing (RGB) on a monitor or LED light etc. simulate different wavelengths (frequencies) of light if it is just mixing different amplitudes of three discrete wavelengths?

Having a background in sound I am probably coming at this all wrong, but if you mix a 1kHz sine wave with a 2kHz sine wave at various amplitudes you will get various different-sounding composite sounds, but at no point will you be able to emulate, say, a 1300Hz sound. How is it that mixing red light at 462 terahertz (or whatever) with green light at 545 terahertz (these are numbers I am just pulling off Google) at the same amplitude can result in a perceived frequency equivalent to 516 terahertz, or as we know it, ‘yellow’?

Is it that the ‘yellow’ we experience from additive colour mixing is not the ‘true’ yellow we see in the rainbow? Is it our eyes that make up the colour based on the input of two discrete light sources interfering with each other?

6 Answers

Anonymous

There’s no interference-pattern stuff going on, but you are correct: mixing red and green lights does not produce yellow light.

However, the way our eyes work is a total hack that we can exploit for fun and profit.

A light receptor that could report the full mix of frequencies hitting it would be complex and expensive and high-bandwidth; we simply never evolved anything that could do the job, and we’d be highly unlikely ever to do so.

Instead, each ‘pixel’ on the retina consists of a quad of different cells: one rod cell that just reports monochrome brightness, and three cone cells – one with a red filter, one with a blue filter, and one with a green filter. (There are real physical blobs of pigment in each cell.)

By comparing the relative brightness of the light reaching each of those cone cells, in combination with the overall brightness from the rod cell, your brain can infer the frequency of the light coming in.

(from an audio perspective, imagine having three different-sized balloons tuned to different points up the scale, playing a sound at them, and recreating the input frequency from the relative amount of resonance on each)

If mostly the red is getting lit up, then the light must be red. If red and green, it’s somewhere in the yellow range; if mostly green, then it’s green; if green and blue, then it’s cyan; if mostly blue, then it’s blue – and every variation along the way covers all the other colours.
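
If code is easier to follow than prose, here’s a minimal sketch of that idea in Python. The bell-curve sensitivities and peak wavelengths are made-up stand-ins (real cone responses are messier and overlap differently), but they show how every wavelength leaves its own distinctive pattern of relative responses:

```python
import math

# Made-up peak sensitivities (nm) -- rough stand-ins, not real cone fundamentals.
CONE_PEAKS = {"red": 565, "green": 535, "blue": 445}

def cone_responses(wavelength_nm, width_nm=60.0):
    """Relative response of each cone type to a single wavelength,
    modelled as a simple bell curve around its peak sensitivity."""
    return {
        name: math.exp(-((wavelength_nm - peak) / width_nm) ** 2)
        for name, peak in CONE_PEAKS.items()
    }

# Each wavelength leaves its own fingerprint of relative responses --
# which is all the brain has to work with.
for wl in (450, 530, 580, 630):
    resp = cone_responses(wl)
    print(wl, {name: round(r, 2) for name, r in resp.items()})
```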

However, you can totally fake those results – shine a red and a green light together, for instance, and it *produces the same response* as a yellow light does, and so you can’t tell the difference.
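
To put rough numbers on that fake yellow, the same toy model can compare a single monochromatic yellow light with a hand-tuned red + green mix. The wavelengths and intensities below are illustrative guesses, but the resulting cone responses come out nearly identical, and that pattern is all the eye ever gets to see:

```python
import math

# Same toy model as above.
CONE_PEAKS = {"red": 565, "green": 535, "blue": 445}

def cone_responses(wavelength_nm, width_nm=60.0):
    return {name: math.exp(-((wavelength_nm - peak) / width_nm) ** 2)
            for name, peak in CONE_PEAKS.items()}

def mixed_response(*lights):
    """Cone response to several lights at once: responses simply add, because a
    cone only reports how much energy it absorbed, not which wavelength delivered it."""
    total = {name: 0.0 for name in CONE_PEAKS}
    for wavelength_nm, intensity in lights:
        for name, r in cone_responses(wavelength_nm).items():
            total[name] += intensity * r
    return total

pure_yellow = mixed_response((580, 1.0))               # one monochromatic yellow light
fake_yellow = mixed_response((630, 1.9), (540, 0.42))  # red + green, intensities hand-tuned

print("pure yellow :", {k: round(v, 2) for k, v in pure_yellow.items()})
print("red + green :", {k: round(v, 2) for k, v in fake_yellow.items()})
# Both come out at roughly the same red:green:blue pattern -- a metamer.
```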

You could shine a red + green light on a daffodil in a dark room – the light would look perfectly yellow, but the flower would look weirdly dark, because there’s no yellow light for it to reflect.
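
Here’s a back-of-the-envelope version of that daffodil experiment, using a made-up petal reflectance centred on yellow. Under a broad ‘white’ light a fair amount of energy comes back; under a red + green light that looks the same colour, there’s almost nothing near 580 nm for the petal to reflect, so the flower goes dark:

```python
import math

def bell(x, centre, width):
    return math.exp(-((x - centre) / width) ** 2)

def petal_reflectance(wavelength_nm):
    """Hypothetical daffodil petal: reflects strongly around 580 nm, little elsewhere."""
    return 0.9 * bell(wavelength_nm, 580, 25)

def reflected_power(illuminant):
    """Total energy the petal sends back, for an illuminant given as {wavelength_nm: intensity}."""
    return sum(intensity * petal_reflectance(wl) for wl, intensity in illuminant.items())

# A crude 'white' light: equal energy every 10 nm across the visible band...
white = {wl: 0.1 for wl in range(400, 701, 10)}
# ...and a red + green light of roughly the same total energy, which looks
# the same yellow to the eye but has nothing near 580 nm.
red_green = {630: 1.5, 540: 1.5}

print("reflected under white light    :", round(reflected_power(white), 2))
print("reflected under red+green light:", round(reflected_power(red_green), 2))
```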

You can cover *almost* all of the visible spectrum this way, producing responses indistinguishable from the real thing, except for a couple of caveats:

* You can’t make a plausible orange. Go find the brightest, most vivid photo of an orange on the internet and compare it to an actual IRL fruit. The onscreen colour is sad and pathetic and dingy by comparison.
* On the other hand, magenta is completely fictional. There is no magenta on the spectrum; a wavelength that somehow hits the red and blue receptors at once, without lighting up the green in the middle… doesn’t exist. You can *only* produce that response by shining red and blue lights together – and that’s why the hue wheel wraps around; you’ve joined the two ends together round the back, as it were.
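
Sticking with the same toy model, a quick scan of the visible band shows why magenta has to be a two-light trick: no single wavelength excites the red and blue cones more strongly than the green one, but a red + blue mix does it at once. (The curves are crude approximations, so treat this as a sketch of the argument rather than a proof.)

```python
import math

# Same toy model as above.
CONE_PEAKS = {"red": 565, "green": 535, "blue": 445}

def cone_responses(wavelength_nm, width_nm=60.0):
    return {name: math.exp(-((wavelength_nm - peak) / width_nm) ** 2)
            for name, peak in CONE_PEAKS.items()}

def looks_magenta(resp):
    """'Magenta' here means red and blue both responding more strongly than green."""
    return min(resp["red"], resp["blue"]) > resp["green"]

# Scan every single wavelength in the visible band: none of them qualifies,
# because green's sensitivity sits between the other two.
hits = [wl for wl in range(400, 701) if looks_magenta(cone_responses(wl))]
print("single wavelengths that look magenta:", hits)  # -> []

# But shine a red light and a blue light together and the pattern appears at once.
red_plus_blue = {name: cone_responses(630)[name] + cone_responses(450)[name]
                 for name in CONE_PEAKS}
print("red + blue mix looks magenta:", looks_magenta(red_plus_blue))  # -> True
```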
