Why do cameras still need ND filters?

Obviously it was a thing for film cameras, but now that everything is digital, something like “just make the picture darker” seems extremely easy to do in software.

Quick edit: I know what ND filters are for and how to use them, no need to explain. It just seems to me that cameras could be engineered in a way that doesn’t require them, which is what I’m asking about.

26 Answers

Anonymous 0 Comments

About as ELI5 as I can make it:

Digital camera sensors effectively collect light, transform it into electrons, and collect those electrons in a “bucket” for a certain time (the exposure time) until the image is read out. If the light level is very low and the bucket is almost empty, you can amplify the signal, and the limit is set by noise, which gets worse the more you amplify. But if you have a lot of light, the limit is the size of the bucket: once it’s full, it “overflows” and you read it as clipped white.
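
A toy numerical sketch of that bucket idea, with completely made-up numbers (no real sensor's specs):

```python
import numpy as np

# Illustrative, invented sensor parameters; real sensors vary a lot
FULL_WELL = 50_000   # "bucket" size: the most electrons one pixel can hold
READ_NOISE = 3.0     # electrons of noise added each time the bucket is read out

def expose_pixel(photon_rate, exposure_s, gain=1.0):
    """Toy model of one pixel: collect electrons, clip at the full well, amplify."""
    rng = np.random.default_rng()
    electrons = rng.poisson(photon_rate * exposure_s)  # photons arrive randomly (shot noise)
    electrons = min(electrons, FULL_WELL)               # bucket overflows -> clipped white
    # Amplifying a weak signal also amplifies the noise floor
    return gain * (electrons + rng.normal(0.0, READ_NOISE))

dim    = expose_pixel(photon_rate=2_000,   exposure_s=1 / 60, gain=16)  # near-empty bucket, gain helps
bright = expose_pixel(photon_rate=200_000, exposure_s=1.0)              # overflows: reads as clipped white
```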

If you want a long exposure (for creative or aesthetic reasons: maybe you want some motion blur) but the scene is too bright, you have to use an ND filter to reduce the amount of light hitting the sensor so the bucket doesn’t overflow (or, in some cases, you can take multiple shorter pictures and merge them in software).
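
A rough sketch of that “merge multiple pictures in software” route, reusing the hypothetical expose_pixel from the snippet above (real stacking tools also have to deal with alignment, readout gaps, and so on):

```python
def fake_long_exposure(photon_rate, total_s, n_frames):
    """Sum many short exposures so that no single frame overflows the bucket."""
    sub_exposure = total_s / n_frames
    frames = [expose_pixel(photon_rate, sub_exposure) for _ in range(n_frames)]
    # Summing recovers the total light of the long exposure without clipping,
    # at the cost of n_frames readouts (each one adds a little read noise).
    return sum(frames)

# A single 1 s exposure of this scene would overflow (200,000 e- > FULL_WELL),
# but ten 0.1 s frames collect ~20,000 e- each and merge cleanly in software.
stacked = fake_long_exposure(photon_rate=200_000, total_s=1.0, n_frames=10)

# An ND filter solves the same problem optically: a 3-stop ND cuts the light
# by 2**3 = 8 before it reaches the sensor, so one long exposure fits the bucket.
nd_filtered = expose_pixel(photon_rate=200_000 / 8, exposure_s=1.0)
```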

Now, why can’t they make the bucket bigger? A single pixel contains both the bucket and the photodiode (which converts light into electrons to fill the bucket), so for a given pixel size, if you want a bigger bucket, your photodiode becomes smaller and your pictures get worse when the scene is very dim.
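
A crude way to picture that tradeoff, again with numbers invented purely for illustration (real pixel designs are far more involved):

```python
PIXEL_AREA_UM2 = 16.0  # hypothetical 4 um x 4 um pixel

def split_pixel(storage_fraction):
    """Toy tradeoff: bucket size grows with storage area, light-gathering with photodiode area."""
    bucket_electrons = 5_000 * storage_fraction * PIXEL_AREA_UM2        # made-up electrons per um^2
    photons_per_sec = 1_000 * (1 - storage_fraction) * PIXEL_AREA_UM2   # made-up sensitivity
    return bucket_electrons, photons_per_sec

# Giving more of the pixel to the "bucket" handles bright scenes better,
# but the smaller photodiode collects less light, so dim scenes get noisier.
big_bucket = split_pixel(0.6)   # 48,000 e- well, but weak light gathering
big_diode  = split_pixel(0.2)   # 16,000 e- well, but gathers light faster
```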

In reality, sometimes the photodiode IS also the bucket, but things can get pretty complicated pretty quickly…
