Why does the color accuracy of a display decrease over time and need to be recalibrated regularly?

Also, by how much does it drift (the drift rate) from the target?

Take a light bulb as an example:

After having it for a few years, you start wondering why it’s not as intense any more. Maybe you replaced a bulb in a fixture at the other end of the room and noticed a difference in light intensity?

That’s because a bulb’s *characteristics change with its operational hours.*

Use it, and it sort of changes colour a bit. And intensity. Until it’s worn out.

The same reasoning can *sort of* be applied to LED lights too. (Why it can’t be applied exactly is really not ELI5, so disregard that.)

Modern screens have LEDs that produce the actual light: somewhere behind the panel that forms the image, a handful of LEDs provide the backlight.

Those LEDs deviate from white towards blue over their lifespan, or from white towards a kind of cream-like yellow.

That means the supposedly static source of light the screen uses to produce images is not static at all. It changes over time. That change can be reasonably predicted (and some manufacturers even install sensors to *measure* the deviation!), so the screen can be taught at the factory to assume a gradual change over time.
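
To make that concrete, here’s a toy sketch of what such factory-programmed compensation could look like. It’s a made-up model in Python: the decay rates and the linear-decay assumption are purely illustrative, not taken from any real display’s firmware.

```python
# Toy model of factory-programmed backlight drift compensation.
# All constants are illustrative, not from any real display.

# Assumed per-channel output loss per 1,000 operating hours
# (blue aging fastest here, which would push white towards yellow).
DRIFT_PER_KHOUR = {"r": 0.010, "g": 0.012, "b": 0.018}

def predicted_gain(channel: str, hours: float) -> float:
    """Remaining fraction of original output after `hours` of use,
    under a simple linear-decay assumption."""
    return max(0.0, 1.0 - DRIFT_PER_KHOUR[channel] * hours / 1000.0)

def compensate(rgb, hours):
    """Pre-scale requested RGB so the aged backlight still produces
    (roughly) the intended colour. Channels can only be dimmed, not
    boosted past maximum, so everything scales to the weakest one."""
    gains = [predicted_gain(c, hours) for c in ("r", "g", "b")]
    weakest = min(gains)
    return tuple(v * weakest / g for v, g in zip(rgb, gains))

# After 10,000 hours, "pure white" is sent with slightly reduced red
# and green so the faded blue still balances out: white, but dimmer.
print(compensate((1.0, 1.0, 1.0), 10_000))
```

The catch baked into that sketch is real: a screen can only dim a channel, not push it past its maximum, so compensating for age trades overall brightness for colour accuracy.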

But. If it truly matters to you, professionally, that you get the colours Right^tm, you need to calibrate your screen occasionally and ensure it does what it’s supposed to.

Help it by, literally, telling it how badly it got it wrong.
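
In code terms, that calibration step boils down to something like the toy sketch below (again Python with invented numbers; real calibration tools build full tone curves and ICC profiles, not a single gain per channel): display known test colours, measure what actually comes out with a colorimeter, and derive per-channel corrections.

```python
# Toy version of the calibration step: compare what the screen was
# asked to show with what a colorimeter actually measured, then derive
# per-channel correction gains. Numbers are invented for illustration.

requested = (1.00, 1.00, 1.00)   # we asked for pure white
measured  = (0.98, 0.97, 0.90)   # hypothetical reading: blue has faded

# How much each channel fell short of its target.
gains = [want / got for want, got in zip(requested, measured)]

# Normalise so no channel is asked to exceed 1.0 (no boosting past max).
peak = max(gains)
correction = [g / peak for g in gains]

print(correction)  # ~[0.918, 0.928, 1.0]: dim red/green to rebalance white
```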