Let’s use Baldur’s Gate as an example… the majority of the really “graphics-intensive” parts of the game are the cutscenes, not the actual gameplay. So why would it need a substantial GPU? Isn’t it just playing back a pre-recorded video, much like a TV would? Or am I thinking of this wrong?
You’re thinking of it wrong. I haven’t played BG3, but I’m like 99.9% sure it does not have pre-rendered cutscenes. Pre-rendered cutscenes mean EVERYTHING has to be identical every time (your character, their gear, the lighting), and they take up an absolute ton of storage space, while rendering in-engine gives you way more flexibility and avoids the jarring quality mismatch when a video cuts in. Although these days they often do up the graphics quality in cutscenes if they’re close-ups, because with less on screen, each thing can be rendered at higher quality.
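To put a rough number on the storage claim, here’s a back-of-the-envelope sketch. The bitrate and runtime figures are illustrative assumptions, not BG3’s actual numbers:

```python
# Rough storage estimate for pre-rendered cutscenes.
# Both figures below are assumptions for illustration, not real BG3 numbers.

bitrate_mbps = 50        # assumed bitrate for high-quality 4K video, in megabits/sec
cutscene_minutes = 120   # assumed total cutscene runtime, in minutes

seconds = cutscene_minutes * 60
total_megabits = bitrate_mbps * seconds
total_gigabytes = total_megabits / 8 / 1000  # megabits -> megabytes -> gigabytes

print(f"{cutscene_minutes} min of video at {bitrate_mbps} Mbps ≈ {total_gigabytes:.0f} GB")
# -> 120 min of video at 50 Mbps ≈ 45 GB
```

And that’s for a single version of each scene. Pre-rendered video would need a separate copy for every character/gear/choice variation, while in-engine rendering handles all of that for free.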
You’re wildly underestimating how graphically intensive games are.
TVs, particularly 4K TVs, *do* have to do a lot of intensive calculation to decode video. But because decoding is one specific task they perform constantly, they have dedicated, fixed-function decoder hardware rather than a large general-purpose graphics chip. That dedicated hardware is orders of magnitude faster at decoding than a general-purpose chip, and that’s why TVs can play 4K video smoothly with comparatively tiny, low-power chips.
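Desktop GPUs and laptop chips have the same kind of dedicated decode blocks, which is why video playback barely touches the main GPU cores. If you’re curious which hardware decode paths your own machine exposes, ffmpeg can list them; a minimal sketch, assuming ffmpeg is installed and on your PATH (the output varies by hardware and build):

```python
import subprocess

# List the hardware-acceleration methods this ffmpeg build can use
# (e.g. vaapi, videotoolbox, d3d11va, cuda). Assumes ffmpeg is on PATH.
result = subprocess.run(
    ["ffmpeg", "-hide_banner", "-hwaccels"],
    capture_output=True, text=True, check=True,
)
print(result.stdout)
```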