Recently Google ran commercials demonstrating that their cameras do a better job of accurately photographing people with darker skin tones. IIRC, a professor on Twitter also noted that Zoom’s background-blur feature was blurring out his Black colleague’s head (presumably because the AI mistook it for part of the background?).
I’ve heard this difficulty also applies to dark-colored pets like black cats and black Labs.
So, why are people with darker skin harder to photograph?
Film stocks and digital sensors each have a particular response curve, and the eye and brain do a lot of processing on top of that. Film engineers and developers tuned those curves for light skin tones: https://99percentinvisible.org/episode/shirley-cards/ and https://www.npr.org/2014/11/13/363517842/for-decades-kodak-s-shirley-cards-set-photography-s-skin-tone-standard
This article explains it pretty well: https://www.wired.co.uk/article/inclusive-cameras-apple-google It touches on the classic metering approach of exposing for an average 18% gray brightness target. This one, on the other hand, argues that the 18% gray story is a myth: https://www.allure.com/story/photographing-darker-skin-tones
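If the 18% gray account does hold, here’s a toy numeric sketch of why it would hurt darker faces specifically. This is purely illustrative (the function name, reflectance values, and simple average meter are all made up; real cameras use much smarter matrix/evaluative metering):

```python
import numpy as np

# Toy sketch of classic average metering: pick an exposure gain so the
# *average* scene luminance lands at middle gray (~18% reflectance).
def average_meter_gain(scene_reflectance, target=0.18):
    """Exposure gain that maps the scene's mean reflectance to the target."""
    return target / scene_reflectance.mean()

# A bright background (0.6 reflectance) with a darker-skinned face (0.1):
scene = np.array([0.6] * 9 + [0.1])   # background dominates the average
gain = average_meter_gain(scene)

print(f"gain = {gain:.2f}")
print(f"face renders at {0.1 * gain:.3f} (vs. the 0.18 mid-gray target)")
# The bright surroundings pull the scene average up, so the meter dials
# exposure down and the face lands far below mid gray: underexposed.
```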
Coded Bias on Netflix covers the AI side of this; the problems largely trace back to the training data the algorithms are built from. Blink-detection algorithms, for example, throw false positives on Asian faces.
Photography is all about light. Darker colors are darker simply because they reflect less light, and cameras have always struggled to properly capture dark subjects (try taking pictures at night).
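A quick sketch of the underlying physics: photon arrivals follow Poisson statistics, so a patch that reflects fewer photons has a worse signal-to-noise ratio (the photon counts below are illustrative, not from any real sensor):

```python
import numpy as np

rng = np.random.default_rng(0)

# Photon shot noise: for a Poisson process with mean N photons,
# the noise is sqrt(N), so SNR = N / sqrt(N) = sqrt(N).
def snr(mean_photons, trials=100_000):
    counts = rng.poisson(mean_photons, trials)
    return counts.mean() / counts.std()

print(f"light patch (1000 photons): SNR ~ {snr(1000):.1f}")
print(f"dark patch  (  40 photons): SNR ~ {snr(40):.1f}")
# Fewer photons -> noisier measurement, which is why dark subjects
# (and night scenes) come out grainy at the same exposure settings.
```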
One method to better capture dark subjects is to keep the “shutter” open longer, so more photons reach the film/sensor. Two problems then arise: first, any motion of the camera or subject causes blur; second, while a long exposure helps capture darker subjects, it overexposes lighter ones (sky, white objects, etc.). It’s a balancing act.
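Here’s a minimal sketch of that balancing act, assuming an idealized linear sensor that saturates at 1.0 (the reflectance values are made up):

```python
import numpy as np

# Doubling exposure time doubles the photons collected everywhere,
# lifting a dark subject but clipping bright areas once they hit the
# sensor's saturation limit (here, 1.0).
def expose(reflectance, exposure):
    return np.clip(reflectance * exposure, 0.0, 1.0)

scene = np.array([0.05, 0.2, 0.9])   # dark subject, midtone, bright sky

for exposure in (1.0, 2.0, 4.0):
    print(exposure, expose(scene, exposure))
# At 4x, the dark subject finally reaches a usable level (0.2),
# but the sky clipped to 1.0 back at 2x and has lost all detail.
```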
One solution modern software-driven cameras might take is to use a longer exposure, but “stop” the exposure on the brighter objects sooner, letting the darker subjects keep gathering light on the sensor.
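That idea, sketched as a simple two-exposure merge (a toy version of HDR fusion, not any specific camera’s pipeline; the clip threshold and exposure ratio are arbitrary):

```python
import numpy as np

# Keep the long exposure wherever it hasn't clipped; elsewhere fall back
# to the short exposure, scaled up to the long exposure's brightness scale.
def merge(short, long, ratio, clip=0.98):
    return np.where(long < clip, long, short * ratio)

scene = np.array([0.05, 0.2, 0.9])    # dark subject, midtone, bright sky
short = np.clip(scene * 1.0, 0, 1)    # 1x exposure
long_ = np.clip(scene * 4.0, 0, 1)    # 4x exposure (sky clips to 1.0)

print(merge(short, long_, ratio=4.0))
# The dark subject comes from the long exposure (0.2 instead of 0.05),
# while the sky keeps its detail from the short exposure (0.9 * 4 = 3.6
# in linear units, instead of clipping at 1.0).
```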