In addition to what others have said, CPUs are good at things like:
– Compare the coordinates of the bullet object and the opponent object.
– If they are the same, then:
* Read the score stored at a certain location in memory.
* Add 10 to it.
* Write the number back to the memory location where the score is stored.
* Look up the memory location where the start of the “show opponent dying animation” routine is stored.
* Remember what part of the program we’re currently at.
* Temporarily go to the “dying animation” part of the program we found earlier.
And so on, and so on, and so on. CPUs are really, *really* good at doing relatively complicated steps like each of the above. But because each step can involve lots of nitty-gritty details, each one takes real work for the CPU to carry out. (Read about [instruction pipelining](https://en.wikipedia.org/wiki/Instruction_pipelining) if you want to go down the rabbit hole of how complicated a modern CPU actually is behind the scenes.)
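If you wrote that bullet-hit logic out as actual code, it might look something like this. This is just a hypothetical C++ sketch: every name in it (Bullet, Opponent, score, playDyingAnimation) is invented for illustration, not taken from any real game.

```
struct Position { int x = 0; int y = 0; };
struct Bullet   { Position pos; };
struct Opponent { Position pos; };

int score = 0;

void playDyingAnimation() {
    // The CPU remembers where it was, runs this whole routine,
    // then picks up again right after the call.
}

void checkHit(const Bullet& bullet, const Opponent& opponent) {
    // Compare the coordinates of the bullet and the opponent.
    if (bullet.pos.x == opponent.pos.x && bullet.pos.y == opponent.pos.y) {
        score += 10;           // read the score, add 10, write it back
        playDyingAnimation();  // jump to the routine, then come back here
    }
}
```

Each of those few lines turns into loads, compares, branches, calls, and returns: exactly the “remember where we were and come back” bookkeeping the list above walks through.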
GPUs can’t do anything nearly that complicated. Their “programs” are more like:
– Find the chunk of memory starting at a particular location.
– Add 3 to the first 1,000 numbers you find there.
Or:
– Here’s a list of 10,000,000 decimal numbers, like 2.3 and 4.7. Add each pair of numbers and divide the sum by 2, and put the results in another list. Oh, and if it lets you go a little faster to pretend that 2.3 is really 2.2999999987, go for it: raw speed is more important than perfect math here.
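Written as actual GPU code, those two “programs” might look something like the following hypothetical CUDA sketch (the kernel names and launch sizes are invented for illustration):

```
#include <cuda_runtime.h>

// “Find the chunk of memory at this location; add 3 to the first n numbers.”
__global__ void addThree(float* data, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;  // which element is mine?
    if (i < n) data[i] += 3.0f;                     // one tiny step per thread
}

// “Add each pair of numbers, divide the sum by 2, and put the results in
// another list.” Building with nvcc’s --use_fast_math flag is the
// “2.2999999987 is close enough” trade: raw speed over perfect rounding.
__global__ void averagePairs(const float* a, const float* b, float* out, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) out[i] = (a[i] + b[i]) * 0.5f;
}
```

You’d launch one of these as, say, `addThree<<<(1000 + 255) / 256, 256>>>(d_data, 1000);` and a whole grid of threads would each do their one tiny step at the same moment.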
They can’t do things like make complicated decisions or jump around to some other part of their program. They don’t have the circuitry to do that stuff. But those simple little instructions like the ones I described? They’re smoking fast at those, and at doing *a whole awful lot* of them at the same time. A CPU can do all the same things a GPU can, but it doesn’t have the circuitry for those “do this one thing a gazillion times” kinds of operations.
Or TL;DR:
– A CPU is like having a mathematician sitting at her desk solving hard problems.
– A GPU is like having a thousand kindergartners counting to 10 on their fingers, but all at the same time.