eli5: Why is it that 10 years ago, when an (at least average for its time) camera took a photo of a computer or TV screen, the photo would have weird lighting issues making the screen extremely bright and the text difficult to read, but now even cheap phones take pictures of screens just fine?

2 Answers

Anonymous

If you’re talking about a CRT monitor or TV, the image is created by an electron beam scanning across one row at a time. If you don’t synchronize your shutter and match the exposure time to the monitor’s refresh, you’ll likely catch the refresh partway through, and the illuminated part of the photo will be whichever rows the beam scanned while the shutter was open.
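
Here’s a rough way to put numbers on that; a minimal sketch, assuming a typical 60 Hz CRT with 480 visible scanlines (those figures are assumptions, not from the answer above):

```python
# Sketch: how much of a CRT image a camera catches for a given exposure.
# Assumed values: 60 Hz refresh, 480 visible scanlines.

REFRESH_HZ = 60
LINES = 480
PERIOD_MS = 1000.0 / REFRESH_HZ  # ~16.7 ms for the beam to paint the full screen

def lines_captured(exposure_ms: float) -> int:
    """Scanlines painted while the shutter is open, capped at a full screen."""
    fraction = min(exposure_ms / PERIOD_MS, 1.0)
    return round(fraction * LINES)

for exposure in (1.0, 4.0, 16.7, 33.3):
    print(f"{exposure:5.1f} ms exposure -> {lines_captured(exposure):3d} of {LINES} lines lit")
```

Anything shorter than one full refresh period (~16.7 ms) catches only a bright band of rows, which is why matching the exposure to the refresh mattered so much with CRTs.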

Newer monitors typically keep the previous image displayed and illuminated for the full duration of a refresh, then transition directly to the next frame, so there’s no big jump in brightness where the screen is being updated unless the image content itself changes brightness dramatically. You would still likely see artifacts if you photographed an LCD alternating between fully black and fully white.
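
To see why a steady image photographs cleanly while a blinking one doesn’t, here’s a minimal sketch of that sample-and-hold behavior; the 60 Hz rate, instant pixel transitions, and simple time-averaging sensor are all assumptions:

```python
# Sketch: an LCD holds each frame's value for the whole refresh period,
# so a capture averages whichever frames overlap the exposure window.
# Assumed: 60 Hz refresh, instant transitions, brightness in [0, 1].

PERIOD_MS = 1000.0 / 60

def captured_brightness(frames, exposure_ms, steps=1000):
    """Average brightness a sensor pixel sees over the exposure."""
    total = 0.0
    for i in range(steps):
        t = exposure_ms * i / steps
        total += frames[int(t // PERIOD_MS) % len(frames)]
    return total / steps

static   = [1.0, 1.0]  # unchanging white image
blinking = [0.0, 1.0]  # alternating fully black / fully white

print(captured_brightness(static, 33.3))    # ~1.0: steady image, no artifact
print(captured_brightness(blinking, 33.3))  # ~0.5: washed-out gray averaging artifact
```

On a real phone camera the rolling shutter exposes rows at slightly different times, so the blinking case tends to show up as light and dark bands rather than a uniform gray, but the averaging idea is the same.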
