Why do TVs not require graphics cards the same way that computers do?


Let’s use Baldur’s Gate as an example… the majority of the really graphics-intensive parts of the game are the cut scenes and not the actual game. So why would it need a substantial GPU? Isn’t it just playing back a prerecorded video, much like a TV would? Or am I thinking of this wrong?

In: Technology

14 Answers

Anonymous 0 Comments

You’re not thinking of it the right way.
A TV or monitor is a display with a chip that handles HDMI-to-display conversion. That’s all it does: it never produces its own output, it just shows whatever it is given.
A PC or game console (which is still a computer) takes the game’s files, like the scene and the characters, and computes an image from them. This is called rendering. That image is then sent through the HDMI connection to the monitor or TV, which, as I said, only displays it.
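If it helps, here’s a toy sketch of that division of labor (made-up example, not any real engine’s API): the “renderer” computes a finished picture from scene data every frame, and the “display” just shows whatever finished image it is handed, which is all a TV ever does.

```python
# Toy illustration only: a tiny "renderer" that builds each frame from scene
# data, and a "display" that just shows the finished image it receives.
import time

WIDTH, HEIGHT = 40, 10

def render_frame(scene):
    """Turn scene data (object positions) into a finished image (a grid of chars)."""
    image = [["." for _ in range(WIDTH)] for _ in range(HEIGHT)]
    for name, (x, y) in scene.items():
        image[y % HEIGHT][x % WIDTH] = name[0].upper()  # "draw" each object
    return "\n".join("".join(row) for row in image)

def display(image):
    """Stand-in for the TV/monitor: it only shows what it is given."""
    print(image, end="\n\n")

scene = {"hero": (5, 3), "dragon": (20, 7)}
for frame in range(3):
    scene["hero"] = (scene["hero"][0] + 2, 3)  # game logic moves the hero
    display(render_frame(scene))               # renderer computes, display shows
    time.sleep(0.2)
```

In a real game the GPU does the render_frame part millions of times faster, but the display’s job never changes.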
Cutscenes in modern games are usually not pre-recorded. If they were, devs would need to make them in many resolutions, with one file for each of them: a 2 GB 1080p cutscene, a 10 GB 4K cutscene, and all the in-betweens.
With real-time rendering, you need the textures (4 GB), the models (1 GB), and the processing power.
It’s way better to have 5 GB cover 10 cutscenes (because assets can be reused) than to have 4 recordings of up to 10 GB each for every single cutscene.
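For scale, here’s the rough math behind that, using the figures above (illustrative numbers only):

```python
# Back-of-the-envelope comparison using the numbers from the answer above.
shared_assets_gb = 4 + 1           # textures + models, reused by every cutscene
num_cutscenes = 10
rendered_total = shared_assets_gb  # 5 GB covers all 10 cutscenes

resolutions = 4                    # e.g. 1080p, 1440p, 4K, plus an in-between
size_per_recording_gb = 10         # worst case from the answer above
prerecorded_total = num_cutscenes * resolutions * size_per_recording_gb

print(f"Rendered in real time: ~{rendered_total} GB total")
print(f"Pre-recorded video:    up to {prerecorded_total} GB total")
# Rendered in real time: ~5 GB total
# Pre-recorded video:    up to 400 GB total
```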
