Why do TVs not require graphics cards the same way that computers do?

Let’s use Baldur’s Gate as an example… the majority of the really “graphics”-intensive parts of the game are the cut scenes, not the actual gameplay. So why would it need a substantial GPU? Isn’t it just playing back a pre-recorded video, much like a TV would? Or am I thinking of this wrong?

In: Technology

14 Answers

Anonymous 0 Comments

Video games have to render graphics on demand because you can’t predict what the players are going to do, or where they are going to go.

A TV, on the other hand, just plays a recording, so there’s no rendering required. All it has to do is show what’s in the signal.
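Here’s a rough sketch of that difference in Python (all the names here are made up for illustration, not real game or TV code):

```python
import random

# Toy stand-ins: "rendering" is just building a string, "displaying" is printing.
# The point is where each frame comes from, not realism.

def render_frame(player_x):
    # A game has to build the frame from the current world state every time.
    return f"frame built around player at x={player_x}"

def game_loop(steps=3):
    player_x = 0
    for _ in range(steps):
        player_x += random.choice([-1, 0, 1])   # unpredictable "player input"
        frame = render_frame(player_x)          # computed on demand (the GPU's job)
        print("display:", frame)

def tv_loop(recorded_frames):
    for frame in recorded_frames:               # frames already exist in the signal
        print("display:", frame)                # nothing to compute, just show it

game_loop()
tv_loop(["frame 1", "frame 2", "frame 3"])
```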

Anonymous 0 Comments

You’re thinking of it wrong. I haven’t played BG3, but I’m like 99.9% sure it does not have pre-rendered cutscenes. Pre-rendered cutscenes mean EVERYTHING has to be the same every time, and they take up an absolute ton of storage space, while rendering in engine gives you way more flexibility and isn’t jarring from a sudden change in quality. Although these days they often do bump up the graphics quality in cutscenes if they’re close-ups, because with less on the screen, everything can be higher quality.

You’re wildly underestimating how graphically intensive games are.

TVs, particularly 4k TVs, *do* have to do a lot of intensive calculations to decode video. However, because it’s a specific task that they do a lot, they have specific hardware in them to decode video, rather than large general purpose graphics chips. That specific hardware is orders of magnitude faster than general purpose chips at it, and that’s why TVs can play 4k video smoothly even with comparatively very small, low power chips.
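For a rough sense of scale (back-of-the-envelope only, ignoring how compression actually works), here’s how much raw pixel data a 4K/60 decoder has to produce every second:

```python
# Back-of-the-envelope: raw pixel throughput a 4K, 60 fps decoder has to produce.
width, height = 3840, 2160        # 4K UHD resolution
bytes_per_pixel = 3               # 24-bit color, ignoring chroma subsampling
fps = 60

frame_bytes = width * height * bytes_per_pixel
per_second = frame_bytes * fps

print(f"one raw frame: {frame_bytes / 1e6:.1f} MB")           # ~24.9 MB
print(f"raw output per second: {per_second / 1e9:.2f} GB/s")  # ~1.49 GB/s
```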

Anonymous 0 Comments

You’re thinking of this wrong. The game is pretty graphically intensive outside of cutscenes, and the cutscenes are rendered in engine in that game as well, so you’re actively rendering each frame, not playing back a video.

Anonymous 0 Comments

No, cutscenes are not video playback. They are fully rendered, since your characters appear in the cutscene with their equipped armor, unique facial designs, etc., making it impossible for them to be a pre-recorded video. It’s rendered in real time. The part about the TV requiring a GPU makes no sense. TVs are playback devices, not rendering devices. They simply display what has been rendered on another device, like a console or PC.
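A toy illustration of why such a cutscene can’t just be a video file (made-up names, nothing from the actual game): the frame depends on what your particular character happens to look like.

```python
from dataclasses import dataclass

@dataclass
class PlayerCharacter:
    face: str
    armor: str

def render_cutscene_frame(player: PlayerCharacter) -> str:
    # A pre-recorded video would be identical for everyone; this output isn't.
    return f"close-up of a character with a {player.face} face wearing {player.armor}"

print(render_cutscene_frame(PlayerCharacter("scarred elf", "plate armor")))
print(render_cutscene_frame(PlayerCharacter("freckled halfling", "leather armor")))
```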

Anonymous 0 Comments

Software like a video game tells the GPU “hey, draw this picture for me”, which lets the GPU draw (render) whatever the image is.

For a TV, the software (or hardware) takes a picture that has already been drawn and says “hey, hold this picture up for me”, and the TV displays the image.

Drawing the picture takes a lot longer than just holding it up to be displayed.

Anonymous 0 Comments

All of BG3’s cut scenes are actually rendered in real time by the game, not pre-recorded. Also, your game console itself *does* have chips equivalent to a PC’s video card. The TV itself is basically a monitor.

Anonymous 0 Comments

ELI5:

PC, if it was a human child:

Stand here and let’s work on multiplication.

TV, if it was a human child:

Stand here and repeat after me.

One has to think a lot more than the other.

Anonymous 0 Comments

You’re not thinking of it the right way.
A TV or monitor is a display with a chip that handles HDMI-to-display conversion. That’s all it does; it never makes its own output, it just shows what it is given.
A PC or game console (which is still a computer) takes the game’s files, like the scene and the characters, and computes them into an image. This is called rendering the image. That image is then sent through the HDMI connector to the monitor or TV, which, as I said, only displays it.
The cutscenes are almost never pre-recorded in modern games. If they were, devs would need to make them in many resolutions, with one file for each of them: a 2GB 1080p cutscene, a 10GB 4K cutscene, and all the in-betweens.
With rendering, you need the textures (4GB), the models (1GB) and the processing power.
It’s way better to have 5GB for 10 cutscenes (because assets can be reused) than to have 4 recordings of up to 10GB each for a single cutscene.
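The same trade-off as rough arithmetic, using the illustrative sizes above (the in-between resolutions are just assumed):

```python
# Pre-recorded route: one video file per resolution, per cutscene.
recording_sizes_gb = [2, 5, 7, 10]   # 1080p up to 4K, with assumed in-between sizes
num_cutscenes = 10
prerecorded_total = sum(recording_sizes_gb) * num_cutscenes

# Real-time route: ship the shared assets once, render at any resolution.
assets_gb = 4 + 1                    # textures + models, reused by every cutscene

print(f"pre-recorded: {prerecorded_total} GB for {num_cutscenes} cutscenes")
print(f"rendered in engine: {assets_gb} GB total, at any resolution")
```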

Anonymous 0 Comments

Rendering a 3d environment, applying high resolution textures, shading that environment correctly, etc. etc. can be **very** computationally expensive. A TV outputting what was already recorded on the camera doesn’t have to do as much thinking, as the “scene” is already rendered because it’s just real life. You don’t realize or appreciate how difficult it is to replicate a real 3d environment until you try to do it yourself. There is SO MUCH you have to get right to get a realistic picture.
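To put some very rough, assumed numbers on that: even a modest amount of per-pixel shading work adds up fast at 4K and 60 fps.

```python
width, height, fps = 3840, 2160, 60
shader_ops_per_pixel = 500          # assumed: texture lookups, lighting math, etc.

pixels_per_second = width * height * fps
ops_per_second = pixels_per_second * shader_ops_per_pixel

print(f"pixels shaded per second: {pixels_per_second / 1e6:.0f} million")       # ~498 million
print(f"shading operations per second: {ops_per_second / 1e12:.2f} trillion")   # ~0.25 trillion
```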

Anonymous 0 Comments

A TV only displays 2-dimensional pictures. That is really, really simple — we figured out how to do this with purely electrical circuits long before computers as we know them were a thing.

A PC/game console does that, too, but it creates the pictures first, from 3-dimensional data. That is the expensive part. Baldur’s Gate is not like a movie. You can tell by changing your graphics setting: It will change the way that cut scenes look, as well as everything else — so it must all be rendered in real time.
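A minimal sketch of that “create a picture from 3-dimensional data” step: a toy perspective projection of a single point onto screen pixels, with no real graphics API involved.

```python
def project(x, y, z, focal_length=1.0, screen_w=1920, screen_h=1080):
    # Perspective divide: points farther away (larger z) end up closer to the
    # center of the screen, which is what makes the image look 3D.
    ndc_x = (x * focal_length) / z
    ndc_y = (y * focal_length) / z
    # Map from normalized coordinates (-1..1) to pixel coordinates.
    px = (ndc_x + 1) * 0.5 * screen_w
    py = (1 - (ndc_y + 1) * 0.5) * screen_h
    return px, py

print(project(0.5, 0.25, 2.0))   # a point 2 units in front of the camera
print(project(0.5, 0.25, 4.0))   # the same point twice as far away: nearer the center
```

A real renderer does this, plus texturing, lighting, and shadows, for millions of points, many times per second.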

edit: Today’s TV signal often comes in as a digital stream, so TVs include relatively weak computers to decode that. But it remains a constant stream of pre-rendered 2-dimensional pictures. Your PC can do this without breaking a sweat.