Think of it this way.
The CPU is like a chef in a restaurant. It sees an order come in for steak, potatoes, and a salad, and it gets to work. It starts the steak in a pan, and it has to watch the steak carefully and flip it at the right time. The potatoes have to be partially boiled in a pot, then finished in the same pan as the steak.
Meanwhile, the CPU delegates the salad to the GPU. The GPU is a guy who operates an entire table full of salad chopping machines. He can only do one thing: chop vegetables. But he can stuff carrots, lettuce, cucumbers, and everything else into all the machines at once, press the button, and watch them spit out perfect results, far faster than the chef ever could.
Back to the programming world.
The CPU excels at processing the main logic of a computer program. The result of one computation is usually needed as the input to the next step, so the work mostly has to happen in order, and it can only do a few things at once.
The GPU excels at taking in a ridiculous amount of data and doing the same processing on ALL of it at the same time. It is particularly good at the kind of math that arranges thousands of little triangles in just the right way to look like a 3D object.
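If you're curious what that looks like in code, here's a minimal sketch (a hypothetical example, not anything from a real program): the CPU version scales a big array one element at a time in a loop, while the GPU version launches one tiny thread per element and scales them all at once, just like stuffing every machine on the salad table at the same time.

```cuda
#include <cstdio>
#include <cuda_runtime.h>

// GPU: every thread scales ONE element; thousands of threads run at once.
__global__ void scale_gpu(float *data, float factor, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) data[i] *= factor;
}

// CPU: one worker walks through the elements in order.
void scale_cpu(float *data, float factor, int n) {
    for (int i = 0; i < n; ++i) data[i] *= factor;
}

int main() {
    const int n = 1 << 20;                 // about a million elements
    float *host = new float[n];
    for (int i = 0; i < n; ++i) host[i] = 1.0f;

    scale_cpu(host, 2.0f, n);              // sequential pass on the CPU

    float *dev;
    cudaMalloc(&dev, n * sizeof(float));
    cudaMemcpy(dev, host, n * sizeof(float), cudaMemcpyHostToDevice);

    int threads = 256;
    int blocks = (n + threads - 1) / threads;
    scale_gpu<<<blocks, threads>>>(dev, 2.0f, n);   // parallel pass on the GPU
    cudaDeviceSynchronize();

    cudaMemcpy(host, dev, n * sizeof(float), cudaMemcpyDeviceToHost);
    printf("first element: %f\n", host[0]);         // 4.0 after both passes

    cudaFree(dev);
    delete[] host;
    return 0;
}
```

Both versions do the exact same math; the difference is that the CPU handles the elements one after another, while the GPU spreads them across thousands of simple workers that all run simultaneously.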