*TL;DR* We got to a point where computers could show more shades of color than most people's eyeballs can tell apart. After that, it just kinda stopped being useful as an advertising bullet point.
The advertised bits are the "[bit-depth](https://en.wikipedia.org/wiki/Color_depth)" of the color space: basically, how many individual colors the computer can display. When we had 8-bit color, you could give every pixel on the TV screen a binary number that was its color, anything from 00000000 to 11111111. That makes 256 numbers in total. [Here is a picture of all of them.](https://upload.wikimedia.org/wikipedia/commons/9/93/256colour.png)
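If you like seeing the counting made concrete, here's a tiny Python sketch of the same arithmetic (nothing fancy, just built-in binary formatting):

```python
# Every 8-bit pixel value is one of the 2**8 binary patterns
# between 00000000 and 11111111.
for value in (0, 1, 128, 255):           # a few sample pixel values
    print(f"{value:3d} -> {value:08b}")  # show the 8-digit binary form

print("total colors:", 2 ** 8)           # 256
```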
Same with a 16-bit image. More bits allow for many more colors, with finer differences between them. A 16-bit number is not just twice as big, it's twice as many binary digits long, so the old count gets multiplied by itself: 256 × 256 = 65,536 colors. [Here is a picture of all of them](https://upload.wikimedia.org/wikipedia/commons/d/d1/RGB_16bits_palette.png).
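That "twice as many digits" point is easy to check in code. The sketch below also shows how 16-bit color was commonly carved up in practice: 5 bits of red, 6 of green, and 5 of blue (green gets the spare bit because our eyes are most sensitive to it). The `pack_rgb565` helper name is just made up for illustration:

```python
# Twice as many binary digits squares the count instead of doubling it.
print(2 ** 16)        # 65536
print((2 ** 8) ** 2)  # also 65536

# 16-bit "High Color" was typically split 5-6-5 across red, green, blue.
# Illustrative helper: pack the three channels into one 16-bit number.
def pack_rgb565(red5: int, green6: int, blue5: int) -> int:
    return (red5 << 11) | (green6 << 5) | blue5

print(f"{pack_rgb565(31, 63, 31):016b}")  # pure white: all sixteen bits set
```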
At 24-bit color we start hitting the edge of what computers could do for most of their history. We'd like 32-bit color, but we need a few of those last bits to carry other information (usually transparency). Not to worry: even at 24-bit, that's 16,777,216 individual colors. [Here is a picture of all of them.](https://upload.wikimedia.org/wikipedia/commons/e/e9/16777216colors.png)
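Here's what "carry other information" looks like in the common 32-bit layout: 24 bits of red, green, and blue, with the leftover 8 bits holding transparency (alpha). Again, the helper name is just for illustration:

```python
def pack_argb(alpha: int, red: int, green: int, blue: int) -> int:
    """Pack four 8-bit channels into one 32-bit integer (ARGB order).

    Only 24 of the bits are color; the top 8 carry the "other
    information" (transparency) mentioned above."""
    return (alpha << 24) | (red << 16) | (green << 8) | blue

print(2 ** 24)                         # 16777216 colors from the RGB part
print(hex(pack_argb(255, 255, 0, 0)))  # 0xffff0000: fully opaque pure red
```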
Around about this time, the top option in display settings menus was just labeled "millions of colors" (that was classic Mac OS's wording; Windows called the same setting "True Color"). After that, we did start using 30 bits for color, 10 bits per channel, allowing for 1,073,741,824 colors. (No picture this time.) We are at a point where most monitors can't display such fine differences in color, and 99.9999% of people couldn't tell, or wouldn't care that much, even if they could.
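And for completeness, the 30-bit ("deep color") arithmetic:

```python
# 30-bit "deep color": 10 bits for each of red, green, and blue.
shades_per_channel = 2 ** 10     # 1024 shades of each primary
print(shades_per_channel ** 3)   # 1073741824 total combinations
print(2 ** 30)                   # same number, counted as one 30-bit value
```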