Screens have millions of pixels and need to be updated at least 50 times per second. It is possible to connect a CPU directly to an HDMI cable (I have done that), but that doesn't really leave the CPU much time to do any other work.
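To get a feel for the numbers, here is a rough back-of-the-envelope calculation (the resolution, refresh rate, and bytes-per-pixel below are just illustrative assumptions, not exact HDMI figures):

```python
# Rough estimate of the raw pixel data a display consumes every second.
# The numbers are illustrative assumptions, not exact HDMI figures.
width, height = 1920, 1080   # pixels per frame
refresh_rate = 60            # frames per second
bytes_per_pixel = 3          # 8-bit red, green, blue

pixels_per_second = width * height * refresh_rate
bytes_per_second = pixels_per_second * bytes_per_pixel

print(f"{pixels_per_second:,} pixels/s")      # ~124 million pixels per second
print(f"{bytes_per_second / 1e6:.0f} MB/s")   # ~373 MB every second, forever
```

If the CPU had to compute and push every one of those bytes itself, on a tight schedule, there would be very little time left over for actually running your programs.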
For that reason, computers have had dedicated graphics chips for a very long time. In the early days those were fairly simple chips that just shared memory with the CPU. The CPU would put instructions like "blue 8×8 pixel square goes here" or "Pac-Man goes there" into that memory, and then the graphics chip would send the right amount of electricity to the monitor at the right time.
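In very loose sketch form, that division of labour looked something like the snippet below. The command names and layout are made up for illustration; real chips each had their own registers and formats:

```python
# Illustrative sketch of the old shared-memory arrangement.
# Nothing here matches a real chip; it just shows the division of labour.

WIDTH, HEIGHT = 160, 120
framebuffer = [[0] * WIDTH for _ in range(HEIGHT)]  # one number per pixel

# CPU side: write a few cheap, high-level commands into shared memory,
# then get back to running the game logic, input handling, etc.
command_list = [
    ("FILL_RECT", 16, 16, 8, 8, 3),     # blue 8x8 square goes here
    ("FILL_RECT", 100, 50, 13, 13, 5),  # stand-in for "Pac-Man goes there"
]

# Graphics chip side: every frame, turn the commands into actual pixels,
# then clock them out to the monitor with the right timing.
def draw_frame(commands):
    for op, x, y, w, h, color in commands:
        if op == "FILL_RECT":
            for row in range(y, y + h):
                for col in range(x, x + w):
                    framebuffer[row][col] = color

draw_frame(command_list)
```

The key point is that the CPU only describes *what* should be on the screen; the graphics chip does the repetitive per-pixel work.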
These graphics chips have become more and more advanced, and about 25-ish years ago they were rebranded as GPUs. Nowadays they are quite generic and can run complicated calculations at phenomenal speeds.