Why does a computer need a dedicated graphics card for high-graphics video games, but can play 4K quality video without one?

In: Technology

16 Answers

Anonymous 0 Comments

There is a lot of mathematics involved in turning a 3D scene into the 2D image that goes to your monitor.

The most expensive work happens on the GPU: you pass in each object’s vertices (its mesh), textures, and the camera, and the GPU’s vertex shader transforms every vertex using the MVP (model, view, projection) matrices.
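As a rough illustration of that vertex-shader step (a minimal sketch in plain Python, not real shader code; the matrix and vertex values here are made up):

```python
# Sketch of the vertex-shader step: multiply each vertex (in homogeneous
# coordinates) by the combined model-view-projection matrix.

def mat_vec(m, v):
    """Multiply a 4x4 matrix (row-major) by a 4-component vector."""
    return [sum(m[r][c] * v[c] for c in range(4)) for r in range(4)]

# Hypothetical combined MVP matrix: here it just pushes the vertex along z.
mvp = [
    [1.0, 0.0, 0.0, 0.0],
    [0.0, 1.0, 0.0, 0.0],
    [0.0, 0.0, 1.0, 0.5],
    [0.0, 0.0, 0.0, 1.0],
]
vertex = [2.0, 3.0, 1.0, 1.0]  # (x, y, z, w)

clip = mat_vec(mvp, vertex)
print(clip)  # [2.0, 3.0, 1.5, 1.0]
```

A real GPU runs this same multiply for every vertex of every mesh, in parallel, every frame.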

Then the GPU does all that math, plus some magic with triangles (rasterization), and produces the frame’s pixels along with other relevant information. For each of those pixels, the pixel shader (also called a fragment shader) then computes the lighting, shadows, and other effects, once per pixel.
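To make the per-pixel idea concrete, here is a tiny sketch (plain Python, with made-up pixel data) of the simplest lighting a pixel shader might do, Lambert diffuse shading, where brightness is the cosine of the angle between the surface normal and the light direction:

```python
# Sketch of the per-pixel (fragment) step: each pixel's surface normal
# is combined with a light direction to get a diffuse brightness.

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def lambert(normal, light_dir):
    """Diffuse brightness: cosine of the angle, clamped to [0, 1]."""
    return max(0.0, dot(normal, light_dir))

# Hypothetical 2x2 "frame": each pixel carries a unit surface normal.
pixels = [
    [(0.0, 0.0, 1.0), (0.0, 1.0, 0.0)],
    [(1.0, 0.0, 0.0), (0.0, 0.0, -1.0)],
]
light = (0.0, 0.0, 1.0)  # light shining along the z axis

shaded = [[lambert(n, light) for n in row] for row in pixels]
print(shaded)  # [[1.0, 0.0], [0.0, 0.0]]
```

On a 4K screen this little calculation (usually a much fancier one) runs over eight million times per frame, which is exactly the kind of repetitive, independent work GPU cores are built for.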

In modern games this whole process runs several times to generate a single frame (and we expect 60 fps and up nowadays). Every light that casts shadows has to be rendered separately and then mixed together in a final pass. After that come post-processing passes such as bloom, anti-aliasing, ambient occlusion, and color correction.
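The "render each light, then mix" idea can be sketched like this (plain Python with invented per-pixel brightness values; real engines blend full color buffers, not single numbers):

```python
# Sketch of mixing several per-light passes: each light contributes some
# brightness to every pixel, and the final pass sums and clamps them.

def mix_passes(passes):
    """Sum per-pixel brightness across light passes, clamped to 1.0."""
    return [min(1.0, sum(vals)) for vals in zip(*passes)]

# Hypothetical 4-pixel frame, rendered once per light.
light_a = [0.25, 0.5,  0.75, 0.0]
light_b = [0.25, 0.75, 0.5,  0.125]

final = mix_passes([light_a, light_b])
print(final)  # [0.5, 1.0, 1.0, 0.125]
```

Each extra shadow-casting light means another full pass over the scene, which is part of why frame cost climbs so quickly with scene complexity.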

That’s what the GPU is for: it does heavy math for each vertex, and then for each pixel that goes to the screen, several times per frame. That’s why a GPU has so many cores; it’s a massively parallel job.

A GPU isn’t very good at conditional (branching) logic; it’s engineered for parallel math. The game logic, object transformations, physics, and so on are all handled on the CPU.
