Why do computers need GPUs (integrated or external)? What information is the CPU sending to the GPU that it can’t just send to a display?

To put this into perspective, a relatively low-resolution monitor is 1920×1080 pixels. That is over 2 million pixels, each of which needs three numbers (red, green, and blue values) for every frame. One gigahertz is roughly 1 billion operations per second. Rendering 60 frames per second is 60 frames × 3 color values × 2 million pixels = 360 million operations per second, about a third of 1 GHz, and that is just writing the final colors. On top of that, graphics depend on tons of other operations, like geometry transforms, lighting, and antialiasing, that need to happen for every frame that is displayed.
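
If you want that back-of-the-envelope math spelled out, here is a tiny sketch (host-side code only, nothing GPU-specific happens yet; the numbers are just the ones from above, not tied to any real monitor):

```c
#include <stdio.h>

int main(void) {
    /* Back-of-the-envelope math from the paragraph above. */
    long long pixels_per_frame  = 1920LL * 1080LL;  /* ~2.07 million pixels */
    long long values_per_pixel  = 3;                /* red, green, blue */
    long long frames_per_second = 60;

    long long values_per_second =
        pixels_per_frame * values_per_pixel * frames_per_second;

    /* ~373 million color values per second, before any of the actual
       rendering work (lighting, antialiasing, ...) is even counted. */
    printf("color values per second: %lld\n", values_per_second);
    printf("as a fraction of 1 GHz:  %.2f\n", (double)values_per_second / 1e9);
    return 0;
}
```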

It becomes clear that raw speed alone is not going to solve the problem. We like fast processors because they are more responsive, just as our eyes like higher frame rates because the motion is smoother. To get smooth, high-frame-rate video, we need specialized processors that can render millions of pixels dozens of times a second. The trick with GPUs is parallelization.

GPUs have relatively low clock speeds (around 1 GHz) compared to CPUs (3–4 GHz), but they have thousands of cores. That’s right, thousands of cores. They also work on wider data per instruction: one instruction is applied to a whole group of values at once, rather than to a single 64-bit value the way a typical CPU instruction is. What this all boils down to is boosting throughput. Computing values for those millions of pixels becomes a whole lot easier when you have 2,000 “slower” cores doing the work all together.
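
To make “thousands of cores doing the work together” a bit more concrete, here is a minimal CUDA sketch of the one-thread-per-pixel idea. The gradient “shading”, the 1920×1080 size, and the 16×16 block shape are just illustrative choices for this toy example, not how a real renderer is structured:

```cuda
#include <cstdio>
#include <cuda_runtime.h>

// One thread per pixel: each thread computes and writes the color of
// exactly one pixel. The GPU spreads these threads across its cores,
// so the frame is filled in parallel rather than one pixel at a time.
__global__ void fillFrame(uchar4* pixels, int width, int height)
{
    int x = blockIdx.x * blockDim.x + threadIdx.x;
    int y = blockIdx.y * blockDim.y + threadIdx.y;
    if (x >= width || y >= height) return;

    // Stand-in "shading": a simple gradient instead of real lighting.
    unsigned char r = (unsigned char)(255 * x / width);
    unsigned char g = (unsigned char)(255 * y / height);
    pixels[y * width + x] = make_uchar4(r, g, 128, 255);
}

int main()
{
    const int width = 1920, height = 1080;

    uchar4* pixels = nullptr;
    cudaMalloc((void**)&pixels, width * height * sizeof(uchar4));

    // Launch roughly one thread per pixel, grouped into 16x16 blocks.
    dim3 block(16, 16);
    dim3 grid((width + block.x - 1) / block.x,
              (height + block.y - 1) / block.y);
    fillFrame<<<grid, block>>>(pixels, width, height);
    cudaDeviceSynchronize();

    cudaFree(pixels);
    return 0;
}
```

That single launch asks for over 2 million threads, and the hardware simply works through them in large batches, which is exactly the throughput-over-latency trade described above.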

The typical follow-up question is “why don’t we just use GPUs for everything, since they are so fast and have so many cores?” Primarily because GPUs are purpose-built for this one kind of work: each individual GPU core is much slower than a CPU core, so branchy, one-thing-after-another tasks would feel sluggish on it. General computing on GPUs is certainly possible, but we humans like computers to be super snappy. Where CPUs can juggle dozens of varied tasks without a hiccup, GPUs are powerhouses for churning through an incredible volume of repetitive calculations.

PS: Some software does take advantage of the GPU for churning through data. Lots of video and audio editing software can leverage your GPU, and CAD programs will use it for physics simulations for the same reason.
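
As a rough illustration of that kind of GPU “data churning” (a toy sketch, not how any particular editor actually does it), here is a CUDA kernel that applies a volume gain to a big buffer of audio samples, one thread per sample; the buffer size and gain value are made up:

```cuda
#include <cstdio>
#include <vector>
#include <cuda_runtime.h>

// Toy GPGPU example: scale every audio sample by a gain factor,
// one thread per sample. Real editing software is far more elaborate,
// but the shape is the same: lots of independent, repetitive math.
__global__ void applyGain(float* samples, int n, float gain)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) samples[i] *= gain;
}

int main()
{
    const int n = 1 << 24;                  // ~16.7 million samples (made up)
    std::vector<float> host(n, 0.5f);       // dummy audio data

    float* dev = nullptr;
    cudaMalloc((void**)&dev, n * sizeof(float));
    cudaMemcpy(dev, host.data(), n * sizeof(float), cudaMemcpyHostToDevice);

    int block = 256;
    int grid = (n + block - 1) / block;
    applyGain<<<grid, block>>>(dev, n, 1.5f);  // +50% volume, say

    cudaMemcpy(host.data(), dev, n * sizeof(float), cudaMemcpyDeviceToHost);
    cudaFree(dev);

    printf("first sample after gain: %f\n", host[0]);  // prints 0.75
    return 0;
}
```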
