ELI5: Why does a GPU heat up and need a fan to cool down while processing 8K video on a PC, yet a TV has no fan at all and still plays 8K/4K video?


In short, why doesn’t the TV need a fan to cool down, while a PC requires one, when dealing with high-definition video?


44 Answers

Anonymous 0 Comments

I’m reading the responses and most of them are misleading, if I understood your question correctly. When you refer to your TV, you mean a smart TV playing 8K/4K content from a pen drive, for example, right? If that’s the case, you’re right: 4K/8K video is compressed. Internal SoCs (System on Chip: basically the CPU plus lots of components integrated around it) usually have dedicated engines, i.e. circuits specifically designed to process a particular codec. Doing it with dedicated hardware is more energy efficient and thus generates less heat. GPUs, on the other hand, do support certain codecs, but they’re oriented toward playing games and similar workloads rather than decoding video. Formats that aren’t compatible with the decode engines built into the GPU or CPU have to be processed in software, which uses more energy and therefore generates more heat.
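If you want to see which case applies to a given file, here’s a minimal sketch (it assumes ffprobe is installed; the file name and the set of hardware-decodable codecs are purely illustrative assumptions about one particular device):

```python
# Sketch: read a file's video codec and check it against the codecs a
# device's dedicated decode engine is *assumed* to handle.
import subprocess

# Illustrative assumption: this SoC/GPU decode block supports these codecs.
HARDWARE_DECODABLE = {"h264", "hevc", "vp9", "av1"}

def video_codec(path: str) -> str:
    """Ask ffprobe for the codec name of the first video stream."""
    out = subprocess.run(
        ["ffprobe", "-v", "error", "-select_streams", "v:0",
         "-show_entries", "stream=codec_name",
         "-of", "default=noprint_wrappers=1:nokey=1", path],
        capture_output=True, text=True, check=True,
    )
    return out.stdout.strip()

codec = video_codec("movie_8k.mp4")  # hypothetical file
if codec in HARDWARE_DECODABLE:
    print(f"{codec}: dedicated decode engine handles it -> little heat")
else:
    print(f"{codec}: falls back to software decoding -> more energy, more heat")
```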

Anonymous 0 Comments

I don’t think “painting vs photographing a painting” is a terrible analogy.

A PC just PLAYING a 4K/8K video (media file) would produce heat levels similar to a TV and, with some work (because it’s different tech and the PC has all sorts of other things to worry about), could probably be happily passively cooled.

When the PC is “processing” 4K/8K (as in, in a game), it’s not just showing a picture that moves a lot; it’s doing all the work behind drawing that picture in the first place. Generally 60 to 200 times a second.
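A back-of-the-envelope sketch of that gap (the per-pixel operation counts are purely illustrative assumptions, not measurements):

```python
# Rough arithmetic: rendering a frame vs. just showing a finished one.
WIDTH, HEIGHT, FPS = 7680, 4320, 60        # 8K at 60 frames per second
pixels_per_second = WIDTH * HEIGHT * FPS    # ~2.0 billion pixels/s

SHADER_OPS_PER_PIXEL = 500   # assumed: lighting, textures, effects in a game
DISPLAY_OPS_PER_PIXEL = 1    # assumed: just push the finished value to the panel

print(f"Pixels per second: {pixels_per_second:,}")
print(f"Rendering (game):  {pixels_per_second * SHADER_OPS_PER_PIXEL:,} ops/s")
print(f"Displaying (TV):   {pixels_per_second * DISPLAY_OPS_PER_PIXEL:,} ops/s")
```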

Anonymous 0 Comments

Creating (encoding) an 8K video is much harder than playing (decoding) an 8K video. This is done on purpose, since once made, a video is likely to be played many, many times. Playing back an 8K video on a PC should not generate much heat.
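A rough sketch of where that asymmetry comes from (the numbers are made up but plausible): an encoder searches many candidate ways to compress each block of a frame, while a decoder just follows the one choice the encoder already wrote down.

```python
# Illustrative encode/decode asymmetry for a single 8K frame.
BLOCKS_PER_FRAME = (7680 // 16) * (4320 // 16)   # 16x16 blocks in an 8K frame

ENCODER_CANDIDATES_PER_BLOCK = 200  # assumed: motion vectors/modes tried per block
DECODER_STEPS_PER_BLOCK = 1         # decoder reconstructs the one chosen answer

encode_work = BLOCKS_PER_FRAME * ENCODER_CANDIDATES_PER_BLOCK
decode_work = BLOCKS_PER_FRAME * DECODER_STEPS_PER_BLOCK
print(f"Encoding is roughly {encode_work // decode_work}x the work per frame")
```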

Anonymous 0 Comments

A TV is playing back an already rendered and exported video file. A GPU is actually rendering (generating) the pixels, which takes a lot more power. Once the GPU renders something, you can capture it into a video file and play it back on anything with much less processing power.

Anonymous 0 Comments

It doesn’t; hardware-accelerated video decoding isn’t particularly intense on the GPU. False premise.

Anonymous 0 Comments

A TV is like a picture book: the pictures are already drawn and the TV just needs to show them. A GPU, meanwhile, is like a coloring book: it needs to color the pictures and show them at the same time.

Anonymous 0 Comments

So many answers here that I wouldn’t understand if I was 5.

I’ll give it a try.

A TV just displays images. A GPU paints them and prepares them for display. The GPU works much harder, so it is a lot hotter.

Anonymous 0 Comments

I assume “processing” means playing the video and not using something like HandBrake to encode the video before playing it.

If your GPU heats up when playing an 8K video, one of a few things is going on: your video player has hardware acceleration enabled along with other real-time post-processing effects (think madVR); you have an older GPU that can provide HW acceleration for 8K video, but its onboard decode ASIC is very inefficient or has limited 8K support, so it heats up tremendously as a result; or the GPU has to encode/transcode the file in real time into a format it can hardware-accelerate (which can also come back to an older, inefficient ASIC with limited 8K support). TVs don’t have this capability; they play whatever signal is sent to them as is.

TL;DR: your GPU heats up playing 8K videos because you are using it to play the video, and something is telling it to do computations on the video file you’re watching. TVs cannot do this.
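If you want to see the difference on your own machine, here’s a hedged sketch (it assumes ffmpeg is installed, a GPU with a CUDA-capable decode block, and a hypothetical movie_8k.mp4) that times a pure software decode against a hardware-accelerated one; neither command writes any output, they only decode:

```python
# Sketch: compare software decoding with hardware-accelerated decoding.
import subprocess, time

def timed(cmd):
    start = time.perf_counter()
    subprocess.run(cmd, capture_output=True)
    return time.perf_counter() - start

# Software decode: general-purpose CPU cores do all the work.
software = timed(["ffmpeg", "-i", "movie_8k.mp4", "-f", "null", "-"])
# Hardware decode: the dedicated decode block (e.g. NVDEC) does the work.
hardware = timed(["ffmpeg", "-hwaccel", "cuda", "-i", "movie_8k.mp4",
                  "-f", "null", "-"])

print(f"software decode: {software:.1f}s (more energy, more heat)")
print(f"hardware decode: {hardware:.1f}s (dedicated circuit, less heat)")
```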

Anonymous 0 Comments

It’s like cooking at home (GPU) versus eating out (TV). You skip the prepping and cooking when eating out; it’s less work.

Anonymous 0 Comments

ELI5: If you ask an artist to draw a picture for you, it’s going to take a lot of effort and time. If you ask a photographer to show you the picture their camera just took (assuming digital camera), it’s going to be easy and only take as long as they take to pull the image up.

TVs only show you pictures that were already drawn; GPUs draw those pictures from scratch before showing them to you. To do this fast (because you want a moving video, not a slideshow), they need a LOT of power. TVs can show you images pretty fast without using much power at all.