They have enough hysteresis that they need a bigger increase in ambient light to turn off than they themselves create. For example, the light turns on when the sensor reads 10% and turns off when the sensor reads 20%, while its own light only changes the reading by 5%.
Or sometimes they don’t, and do get confused by their own light.
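A minimal sketch of that hysteresis logic, using the hypothetical 10%/20%/5% numbers from the example above:

```python
ON_THRESHOLD = 10    # turn on when the ambient reading falls below 10%
OFF_THRESHOLD = 20   # turn off only once it rises above 20%
SELF_GLOW = 5        # the lamp's own light adds ~5% to the reading

def next_state(is_on: bool, reading: float) -> bool:
    """Update the lamp state from the current sensor reading."""
    if not is_on and reading < ON_THRESHOLD:
        return True    # dark enough: switch on
    if is_on and reading > OFF_THRESHOLD:
        return False   # genuinely bright out: switch off
    return is_on       # inside the hysteresis band: hold current state

# Turning on at a reading of 9% only pushes the sensor to 14%,
# still below the 20% off threshold, so the lamp stays on.
assert next_state(False, 9) is True
assert next_state(True, 9 + SELF_GLOW) is True
```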
The simplest answer is that the sensors used to detect the light are sensitive to certain types of light: its direction, intensity, and frequency. The people who design the light make sure that the light it emits and the kind the sensor is sensitive to are different, so it doesn't cause a problem.
Your eyes do not detect brightness in an equal, linear fashion. As things get brighter, your eyes get worse at detecting further changes in brightness.
What this means is that something that's 10 units of brightness might look 10x brighter than something that's 1, but to look 10x brighter again it may need 1,000 or even 10,000 units. Not 100.
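As a rough illustration, here's a sketch assuming Stevens' power law for brightness with a commonly cited exponent around 0.33 (the real exponent varies with viewing conditions):

```python
EXPONENT = 0.33  # commonly cited brightness exponent; varies in practice

def perceived(intensity: float) -> float:
    """Stevens' power law: perceived magnitude ~ intensity ** exponent."""
    return intensity ** EXPONENT

# Under this model, each 10x jump in physical intensity only
# roughly doubles the perceived brightness:
for i in (1, 10, 100, 1_000, 10_000):
    print(f"{i:>6} units -> perceived {perceived(i):.2f}")
```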
The sun is really, really, REALLY bright. Thousands of times brighter than the brightest standard light bulb shining directly on the sensor. Even on a cloudy day. It just doesn't seem that way to our eyes.
Meaning these bulbs' light sensors, which don't have such limitations in their perception, usually have absolutely no problem telling the difference in brightness between the sun and their own light.
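To put rough numbers on that (order-of-magnitude illuminance figures; the lamp's self-contribution and the threshold below are assumptions):

```python
# Typical illuminance, in lux (orders of magnitude only):
DIRECT_SUN = 100_000       # bright direct sunlight
OVERCAST_DAY = 1_000       # heavily overcast daylight
LAMP_SELF_SPILL = 20       # hypothetical: lamp's own light hitting its sensor

DAYLIGHT_THRESHOLD = 300   # hypothetical switch-off level

def is_daytime(ambient_lux: float) -> bool:
    # Even a gloomy sky is far above anything the lamp adds itself,
    # so a plain threshold cleanly separates the two cases.
    return ambient_lux > DAYLIGHT_THRESHOLD

assert is_daytime(OVERCAST_DAY)
assert not is_daytime(LAMP_SELF_SPILL)
```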
LEDs actually emit much less light than other sources of comparable brightness. This is one reason they're so efficient: they literally output way fewer photons for the same apparent brightness.
However, LEDs can be fine-tuned very precisely to output their light only at wavelengths that the human eye is very sensitive to, which makes them seem very bright to us.
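A small sketch of that weighting, using a few approximate sample points from the standard CIE photopic sensitivity curve V(λ):

```python
# Approximate CIE photopic sensitivity V(lambda) at a few wavelengths (nm):
V = {450: 0.038, 500: 0.323, 555: 1.000, 600: 0.631, 650: 0.107, 700: 0.004}

PEAK_EFFICACY = 683  # lumens per watt of radiant power at the 555 nm peak

def lumens(radiant_watts: float, wavelength_nm: int) -> float:
    """Perceived light output = radiant power weighted by eye sensitivity."""
    return radiant_watts * PEAK_EFFICACY * V[wavelength_nm]

# One watt near the eye's peak looks vastly brighter than one watt
# of deep blue or deep red:
print(lumens(1.0, 555))  # ~683 lm
print(lumens(1.0, 450))  # ~26 lm
print(lumens(1.0, 700))  # ~3 lm
```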
The simplest way to make sure they don't detect their own light is to just have the sensor trigger on a part of the spectrum that the LED is not producing.
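A minimal sketch of that idea (the band edges below are assumptions; daylight sensors are often biased toward near-infrared, which white LEDs barely emit):

```python
# Hypothetical band edges, in nanometres:
LED_EMISSION = (440, 660)      # typical white-LED visible output (assumption)
SENSOR_PASSBAND = (800, 1000)  # near-infrared daylight sensor (assumption)

def bands_overlap(a: tuple[int, int], b: tuple[int, int]) -> bool:
    return a[0] < b[1] and b[0] < a[1]

# Sunlight carries plenty of near-infrared but the LED emits none there,
# so the sensor is effectively blind to its own lamp:
assert not bands_overlap(LED_EMISSION, SENSOR_PASSBAND)
```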