There are a few systems, but the easiest to understand is contrast detection. It is very common on any digital camera that shows a “live view” of what the sensor sees. The camera looks for parts of the image where color or brightness changes and adjusts the optics until those edges are as sharp as possible. This is why, if you try to focus on a featureless area, e.g. snow or a cloud, the lens will keep hunting: there are no well-defined edges to sharpen.
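The idea can be sketched in a few lines. This is a toy model, not real camera firmware: the one-dimensional "sensor", the `capture()` function, and the blur model are all made up for illustration. The contrast score is the sum of squared differences between neighbouring pixels, and the camera simply tries lens positions and keeps the one with the highest score.

```python
def sharpness(pixels):
    # Contrast metric: sum of squared differences between neighbouring
    # pixels. Sharper focus -> harder edges -> higher score.
    return sum((a - b) ** 2 for a, b in zip(pixels, pixels[1:]))

def contrast_autofocus(capture, positions):
    # Sweep the lens through candidate positions and keep the one
    # where the contrast score peaked (a simple "hunt").
    best_pos = positions[0]
    best_score = sharpness(capture(best_pos))
    for pos in positions[1:]:
        score = sharpness(capture(pos))
        if score > best_score:
            best_pos, best_score = pos, score
    return best_pos

# Hypothetical 1-D scene: a hard edge, blurred more the further the
# lens position is from the (made-up) true focus at position 5.
def capture(pos):
    edge = [0.0] * 10 + [100.0] * 10
    r = abs(pos - 5)  # defocus radius: 0 means perfectly in focus
    return [sum(edge[max(0, i - r): i + r + 1])
            / len(edge[max(0, i - r): i + r + 1])
            for i in range(len(edge))]

print(contrast_autofocus(capture, list(range(11))))  # best focus: 5
```

Note that on a featureless scene (`sharpness([5.0] * 20)` is 0 at every lens position) the score never peaks, which is exactly why the lens hunts back and forth.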
Since all this requires is firmware (the image sensor is already there), it is the cheapest approach. Other systems use specialized, dedicated sensors so the camera can measure how out of focus the incoming light is and *predict* how far, and in which direction, to move the optics; this is why the autofocus on a nice DSLR is very fast. This is called phase detection.
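The key difference can also be sketched. In a phase-detect system, light from two sides of the lens lands on a pair of small sensor strips; when the image is out of focus, the two strips see the same pattern shifted apart. The sketch below is a simplified illustration of that principle (the signals and the `phase_offset` helper are invented for the example): finding the shift that best aligns the two strips tells the camera, in one measurement, how far and which way to move the lens, with no hunting.

```python
def phase_offset(left, right, max_shift=5):
    # Find the shift (in samples) that best aligns the two sub-images.
    # The sign of the offset indicates the direction of the focus error,
    # its magnitude indicates how far off focus the lens is.
    def correlation(shift):
        pairs = [(left[i], right[i + shift])
                 for i in range(len(left))
                 if 0 <= i + shift < len(right)]
        return sum(a * b for a, b in pairs)
    return max(range(-max_shift, max_shift + 1), key=correlation)

# Two made-up sensor strips seeing the same bright feature,
# displaced by 2 samples because the lens is out of focus.
left  = [0, 0, 1, 3, 7, 3, 1, 0, 0, 0]
right = [0, 0, 0, 0, 1, 3, 7, 3, 1, 0]

print(phase_offset(left, right))  # 2: move the lens to close a 2-sample gap
```

Contrast detection has to *try* positions and compare scores; phase detection computes the answer directly from one exposure, which is why it focuses so much faster.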