Colors are a continuous phenomenon. The exact wavelength (or combination of wavelengths) of the light coming off the thing you are looking at registers as a particular color. Even a slight change in wavelength means you’re looking at a slightly different color.
But computers do everything in discrete intervals. Say you measured the brightness of a red pixel on a scale from 1 to 10. The pixel cannot have a brightness of “5.5.” If you want more than 10 values, you can create a larger scale, but eventually you have to stop somewhere. The most common system in digital color uses 256 values (0–255) for each color channel.
As you tick from one brightness setting to the next, you’re theoretically skipping over an infinite range of actual colors that occupy the space in between. In practice, this is rarely an actual problem. The RGB system can represent about 16.8 million colors (256 values each for red, green, and blue), and one of them will be very close to the “true” color of interest. It’s mostly interesting as a piece of trivia, or to people who are *really* invested in color fidelity across a range of digital and non-digital applications (e.g. Pantone and their corporate clients).
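To make the arithmetic concrete, here’s a minimal Python sketch. The `quantize` helper is a hypothetical illustration (not from any real graphics library) of snapping a continuous brightness onto one of 256 discrete steps, and the total-color count just multiplies the three channels together:

```python
def quantize(brightness: float, levels: int = 256) -> int:
    """Map a continuous brightness in [0.0, 1.0] to the nearest of
    `levels` discrete steps (0 .. levels - 1). Hypothetical helper
    for illustration only."""
    return round(brightness * (levels - 1))

# Three 8-bit channels (red, green, blue), 256 values each:
total_colors = 256 ** 3
print(total_colors)  # 16777216 -- the "about 17 million" figure

# A brightness "between steps" still lands on a whole step:
print(quantize(0.55))  # 140
```

The in-between value gets rounded to the nearest representable step, which is exactly why the gaps between steps rarely matter in practice: the nearest of 256 levels is visually indistinguishable from the “true” value for most purposes.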