ELI5: Why does a GPU heat up and need a fan to cool down while processing 8K video on a PC, while a TV has no fan at all and still handles 8K/4K video?

In short, why doesn’t the TV need a fan to cool down while a PC requires one when dealing with high-definition video?

44 Answers

Anonymous 0 Comments

Think of rendering (encoding) a video like writing a thesis: it takes a lot of effort. The playback (decoding) phase is like reading that thesis: it takes almost no effort.
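
You can see the same asymmetry with any compressor. Here’s a rough Python sketch using zlib as a stand-in for a video codec (a real encoder like H.264 is far more lopsided, but the shape is the same):

```python
import time
import zlib

# ~10 MB of somewhat-compressible bytes standing in for raw video frames
data = bytes(range(256)) * 40_000

t0 = time.perf_counter()
packed = zlib.compress(data, level=9)   # encoding: "writing the thesis"
t1 = time.perf_counter()
unpacked = zlib.decompress(packed)      # decoding: "reading the thesis"
t2 = time.perf_counter()

assert unpacked == data                 # lossless round trip
print(f"encode: {t1 - t0:.3f}s   decode: {t2 - t1:.3f}s")
```

On a typical machine the encode side takes many times longer than the decode side, and video codecs push that imbalance much further on purpose.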

Anonymous 0 Comments

In ELI5 terms:

When you’re playing a game, for example, your computer builds the **entire world** (this is rendering the world).

Let’s say you’re in this world you just built, and you take out your phone/camera and record a video. Now, you only see a part of this world through the video you’ve taken, right? You can’t step into the world through a video recording.

Your phone isn’t building the world there; your computer is. The video you’ve taken is only a representation of the world, not the world itself. That’s a lot smaller, since it’s just a representation, and much easier to process.

Your TV kinda works the same way: it only shows a representation of these worlds, not the full thing. Even in 8K it’s just like a really good camera; the recording is still a lot smaller than what would be necessary to actually make the whole world.

If you watch a YouTube video at 4K/8K, it doesn’t actually take a lot of power, and this is kinda why: it’s just smaller.
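
A toy sketch of that difference, if it helps. The “render” here is a made-up one-line shader, nothing like a real game engine, but the gap is the point:

```python
import time
import math

W, H = 640, 360  # a small frame; 8K would be ~130x more pixels

def render_frame(t):
    # "Building the world": compute every pixel from scratch.
    # A real game does vastly more work per pixel than this toy formula.
    return [
        int(127 + 128 * math.sin(0.05 * x + 0.05 * y + t)) & 0xFF
        for y in range(H)
        for x in range(W)
    ]

t0 = time.perf_counter()
frame = render_frame(0.0)
t1 = time.perf_counter()

# "Playing the video": the pixels already exist, just hand them over.
shown = frame[:]
t2 = time.perf_counter()

print(f"render: {t1 - t0:.4f}s   playback (copy): {t2 - t1:.4f}s")
```

Even with this toy, computing the pixels takes far longer than handing over pixels that already exist, and a real game does thousands of times more work per pixel.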

Anonymous 0 Comments

It’s basically the same question as “Why does it take so much longer to draw a picture than it takes to look at one?”

The TV does very little work/processing. There is some scaling and processing, but it isn’t *generating* an image. The GPU actually generates the image and sends it to the monitor. The monitor is effectively a TV.

Anonymous 0 Comments

I think most people aren’t even answering OP’s question. Unless I’m mistaken, OP isn’t talking about rendering an 8K image like your GPU would when playing a video game; they’re talking about playing a pre-rendered video file.

The fact is, your TV does heat up. It has hardware dedicated to decoding video files (similar to what your GPU does when you watch YouTube videos in your browser with hardware acceleration enabled). If you notice, a lot of TVs have gaps or grills in the back to allow for airflow so that these components can be cooled. You don’t hear fans because there are no fans in the TV; the chips are typically designed to be very power efficient. Also, if they heat up past a certain point, the TV will throttle and you might notice stuttering or a reduction in quality.
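
Nobody outside the manufacturers knows the exact firmware logic, but the throttling described above boils down to something like this sketch (the thresholds and mode names here are made up for illustration):

```python
# Hypothetical sketch of fanless thermal throttling; real TV firmware
# is proprietary, and these numbers are assumptions, not specs.
THROTTLE_C = 85   # assumed temperature where throttling starts
CRITICAL_C = 100  # assumed hard limit

def pick_decode_mode(temp_c: float) -> str:
    """Trade quality for heat as the chip warms up with no fan to help."""
    if temp_c >= CRITICAL_C:
        return "pause playback"           # protect the silicon
    if temp_c >= THROTTLE_C:
        return "drop frames / lower res"  # the stutter you might notice
    return "full quality"

for temp in (60, 88, 102):
    print(temp, "->", pick_decode_mode(temp))
```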

Anonymous 0 Comments

If you have an Nvidia GPU, you might have super resolution turned on, which uses a ton of GPU computation. I have to turn super resolution off if I want to play 4K AAA games.

Anonymous 0 Comments

Someone else said you might mean ‘play’ in both cases, and the reason a GPU spins its fans while ‘playing’ a video is that most graphics cards are configured so the fans are always spinning. GPUs are very high-horsepower devices, so they consume a lot of power even when they don’t need to do very much.

TVs also get hot, but they can dissipate that heat passively because they have a big area to shed it from, both behind the panel and in front of it. They’re kinda shaped like a radiator fin.
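
Some back-of-the-envelope math shows why that shape matters. All numbers below are assumed round figures, not measurements:

```python
# Rough watts-per-area comparison: a GPU concentrates its heat in a
# small heatsink, a TV spreads a smaller load over a huge panel.
gpu_watts = 250.0           # assumed high-end GPU under load
gpu_area_cm2 = 30 * 11      # assumed face area of a GPU card's cooler

tv_watts = 150.0            # assumed 65" TV power draw
tv_area_cm2 = 145 * 83 * 2  # assumed 65" panel, front and back surfaces

print(f"GPU: {gpu_watts / gpu_area_cm2:.2f} W/cm^2")  # ~0.76 W/cm^2
print(f"TV:  {tv_watts / tv_area_cm2:.3f} W/cm^2")    # ~0.006 W/cm^2
```

Roughly a hundred times less heat per square centimeter, which is why the panel can get away with no fan.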

Anonymous 0 Comments

Had a TV that started to malfunction intermittently (a Vizio, don’t remember the exact model). I bought it from Costco, and when I called about fixing it they just sent me a new one. So I googled it and saw it was actually a common problem: the processor could overheat, melting some of the solder and causing a short. The solution was to literally pull the board and put it in a toaster oven at 400°F for about 10 minutes to reflow the solder. That fixed the problem, and I put a little 12V muffin fan behind the processor and it has worked just fine since.

Anonymous 0 Comments

Professional editing software actually uses the CPU more, unless you are using GPU-accelerated special effects.

GPUs are good at things like floating-point calculations, done in enormous parallel batches that a CPU can’t keep up with. That parallelism is mostly used to draw shadows and 3D models in games, where the computer is producing every little detail: the CPU offloads a huge pile of very, very simple math problems that it can’t churn through fast enough on its own. Floating-point numbers are approximations, not exact values, so 1/3 gets stored as something like 0.33333334 rather than exactly one third. That’s bad for encoding video files, which needs bit-exact results, but great for casting shadows and scattering light naturally. Though yes, a modern gaming GPU meant for a lot of floating-point math in your games is going to do its job of putting stuff on your screen fine.
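
You can watch that inexactness happen in any Python session (`Decimal` here is only used to print the value the float actually stores):

```python
from decimal import Decimal

# 1/3 isn't stored as exactly one third; the float is the nearest
# binary fraction. Close enough for shading a pixel, but encoders
# lean on bit-exact integer math so every device decodes identically.
third = 1 / 3
print(third)             # 0.3333333333333333 (the repr rounds it)
print(Decimal(third))    # the exact value actually stored
print(0.1 + 0.2 == 0.3)  # False: the classic rounding artifact
```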

Rendering pre-recorded clips and audio into a file is actually what a CPU is really good at. A GPU is not meant for precise calculations. Having a dedicated GPU is not going to improve performance on things that are easily handled by integrated graphics, even if the dedicated GPU handles them instead of the integrated graphics in your CPU.

The only computers with really powerful GPUs are built for gaming or very niche work. Not even all photo or video work needs a powerful GPU. They seem to be used just as much for other things, like mining BTC.

Anonymous 0 Comments

The TV is like (or identical to) your computer monitor. All it does is show pictures based on the signal it is given. Generating that signal is the part that requires computation and therefore generates heat.

Anonymous 0 Comments

Because digital compression algorithms are designed to be *easy to decode*, computationally speaking, at the cost of being hard to encode.
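
A toy illustration of where that trade-off comes from, using motion estimation as the example. The sizes are made-up miniatures of what an H.264-style codec does, but the asymmetry is real: the encoder searches many candidates, the decoder just applies the one answer written in the file:

```python
import random

random.seed(0)
prev = [random.randrange(256) for _ in range(64)]  # previous "frame" (1-D toy)
curr = prev[5:37]                                  # current block = old data shifted by 5

def cost(offset):
    # How badly does the block at this offset of the old frame match?
    return sum(abs(curr[i] - prev[offset + i]) for i in range(len(curr)))

# Encoder: try every offset, keep the best -> O(search range * block size)
best = min(range(len(prev) - len(curr) + 1), key=cost)

# Decoder: read "best" from the bitstream and copy -> O(block size)
decoded = prev[best:best + len(curr)]

print("encoder found offset", best, "| decoded matches:", decoded == curr)
```

The decoder never repeats the search; it only pays for the copy. That’s why a tiny fanless chip in a TV can play back what took a hot, fan-cooled machine much longer to produce.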