eli5 Why does a GPU heat up and need a fan to cool down while processing 8K video on a PC, when a TV has no fan at all and still processes 8K/4K video?


In short, why doesn't the TV need a fan to cool down, while the PC requires one, when dealing with high-definition video?


44 Answers

Anonymous 0 Comments

TVs generally don’t process (render) video. They just play it after some other device has done the rendering for them.

Anonymous 0 Comments

Most smart TVs do have heat sinks. The geometry of a flat TV allows for a huge surface area for cooling, so they rarely need fans to increase airflow over the heat sinks.

That being said, decoding and playing a high-resolution video is not a super demanding job for PCs either. The GPU is several times overbuilt for that, and there shouldn't be any particular heat problems unless you have other demanding graphics applications running in the background.

Anonymous 0 Comments

A TV doesn’t process 4K videos, it only plays/shows them, which doesn’t require a lot of power.

Your PC doesn’t heat up when watching a YouTube video either, because it isn’t rendering/processing it.

Anonymous 0 Comments

Playing a video requires the computer/TV to turn a string of data into an image on the screen. That's not trivial at high resolutions (so some old PCs or weaker laptops can struggle), but it's not truly challenging either, because these devices were built explicitly to do that.

Rendering a video, particularly high resolution, requires a computer to CREATE that string of data, pixel by pixel (and the sound), based on all the inputs from the video editor. The more elements added in editing, the more complex the task.

To make some kind of analogy, it’s kind of like the difference between filming something live and just projecting the tape, versus making a stop-motion movie frame by frame.
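
If you want rough numbers on that "pixel by pixel" part, here's a quick back-of-the-envelope sketch (assuming 8K at 60 frames per second – the exact figures don't matter, only the scale):

    # Rough scale of the "create every pixel" job, assuming 8K UHD at 60 fps.
    width, height, fps = 7680, 4320, 60
    pixels_per_frame = width * height              # 33,177,600 pixels per frame
    pixels_per_second = pixels_per_frame * fps     # ~2 billion pixel values per second
    print(f"{pixels_per_frame:,} pixels per frame")
    print(f"{pixels_per_second:,} pixels per second")

Merely looking those values up from an already-made video is easy; computing each of them from scratch is what makes rendering hot work.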

Anonymous 0 Comments

These days devices are equipped with special hardware that deals with video encoded in specific formats. This hardware handles that content very quickly and with very little power. Think about phones: they can play 4K content for hours before running out of battery.
That being said, the process of playing a video is the simple direction: it only involves “decoding,” which is fast. Making videos, i.e. “encoding,” can be complex if you’re dealing with content that is either much bigger in size or in a non-standard format.
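
If you want to see that dedicated decode hardware in action on a PC, here's a minimal sketch (assuming ffmpeg is installed; "input.mp4" is just a placeholder for any local video file). It decodes the same clip once in software and once with whatever hardware decoder ffmpeg can find, so you can compare CPU usage and time yourself:

    # Decode a clip twice with ffmpeg: software-only, then hardware-accelerated.
    # "input.mp4" is a placeholder for any local video file.
    import subprocess, time

    def timed_decode(extra_args):
        start = time.time()
        subprocess.run(
            ["ffmpeg", "-v", "error", *extra_args, "-i", "input.mp4", "-f", "null", "-"],
            check=True,
        )
        return time.time() - start

    print("software decode:", round(timed_decode([]), 1), "s")
    print("hardware decode:", round(timed_decode(["-hwaccel", "auto"]), 1), "s")

On most machines the hardware run barely touches the CPU, which is exactly why a phone or TV can do the same job on a tiny, fanless chip.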

Anonymous 0 Comments

Your GPU isn’t heating up because it’s playing an 8K video; it’s heating up because it’s figuring out what pixels to put where in that 8K video.
It’s the difference between displaying a painting in a gallery (your TV) and painting one by hand (your GPU) – one takes a LOT more work than the other. And with electricity, work = heat generated.

Anonymous 0 Comments

Your TV is like your computer’s monitor. The monitor usually doesn’t have a fan blowing; it’s receiving the signal from the computer and just displaying it. In the case of a video game, the computer is generating the graphics from the game code and user inputs. With your TV, generating the image was done somewhere else and the result was just transmitted to your TV. Your cable box sometimes does some work receiving and decoding the signal, depending on your setup.

Anonymous 0 Comments

OP used “process” in an unfortunate attempt to mean “play,” and now everyone is talking about rendering.

Anonymous 0 Comments

The TV shows you the picture it was sent as a string of 0s and 1s encoding almost directly which pixel on the screen gets what color.

The PC/GPU gets told something akin to “there’s a tree with 921368 leaves on the left, below it a couple is having a picnic, the sun at the top right is out but there are a few clouds around, a small family of four is having a stroll with their Yorkshire terrier on a leash, and grass is everywhere else. Draw that, you have 1/60th of a second. And don’t forget the shadows!”
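
For scale, here's a rough sketch of how much raw “which pixel gets what color” data an 8K/60 picture amounts to, compared with an assumed compressed bitrate (the 100 Mbit/s figure is just an illustrative guess, not from the answer above):

    # Raw pixel data rate for 8K at 60 fps, 3 bytes (24-bit color) per pixel,
    # versus an assumed 100 Mbit/s compressed stream (illustrative only).
    width, height, fps, bytes_per_pixel = 7680, 4320, 60, 3
    raw_bits_per_second = width * height * bytes_per_pixel * 8 * fps
    compressed_bits_per_second = 100_000_000
    print(f"raw:        {raw_bits_per_second / 1e9:.1f} Gbit/s")
    print(f"compressed: {compressed_bits_per_second / 1e9:.1f} Gbit/s")
    print(f"ratio:      ~{raw_bits_per_second / compressed_bits_per_second:.0f}:1")

The TV's decoder chip only has to unpack the compressed stream back into pixels; the GPU in the rendering case has to invent every one of those pixels from a scene description in the first place.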

Anonymous 0 Comments

TVs are designed specifically to do one thing and do it well, while PCs are very much general-purpose machines. A PC can do anything “computer” that you ask of it; a TV can’t. That’s why every now and then you have to throw out your TV, e.g. to go from SD to HD, from HD to 4K, from 4K to 8K, etc. It’s not just a case of changing the screen; the chips chosen simply can’t do it. It’s why apps start to drop support for old TVs (because they can’t decode, say, h265 fast enough) and things like that.

Generally speaking you wouldn’t put a processor in an HD TV that was fast enough to decode 4K… it would be a waste of money and electricity. So you pick a processor that JUST ABOUT decodes all HD streams in real time, and that will be a cheaper, less powerful chip that doesn’t run as hot. Your PC, on the other hand, can do almost anything you throw at it, and just ramps up if it’s not going fast enough, hence the fan. The processor in your computer is probably worth more than some TVs are in their entirety.

Encoding and decoding are also two very different tasks. Decoding is basically following a series of instructions to recreate an image. There’s no “thinking” involved.

But encoding is basically a puzzle to solve: What shortcuts can I take? What parts of the image are similar enough? Where can I drop some information that the viewer won’t notice, in order to get this HUGE signal down into the specification of an HD file or broadcast, where at maximum I’m allowed only a certain number of bits per second to send the information? How can I formulate a puzzle that’s REALLY simple for the decoder, while giving it only the absolute minimum of information without sacrificing too much picture quality for THIS particular video? That’s a hard thing to do, making that puzzle, and it requires enormous resources and very clever software.

It’s why some videos are still horrible and blocky even though they are 4K – someone didn’t have the time, the processing power or the equipment to work out how to shrink that video further without breaching the limit on bandwidth (and if they breached it, the video couldn’t be broadcast, or would be unnecessarily huge). Someone else, though, might have professional video encoding equipment, a very fast PC, better software, or just the ability to leave it running longer, and from the same source video they could make a smaller, higher-quality stream of the exact same thing.
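
To make the “leave it running longer” point concrete, here's a minimal sketch (assuming ffmpeg with libx264 is installed; “source.mp4” is a placeholder, not anything from the answer). It encodes the same clip with a fast preset and a slow preset at the same quality setting, so you can compare how much longer the slow run takes and how much smaller the file comes out:

    # Encode the same clip twice with libx264: minimal effort vs maximum effort.
    # The preset controls how hard the encoder searches for shortcuts; CRF holds
    # visual quality roughly constant. "source.mp4" is a placeholder.
    import os, subprocess, time

    def encode(preset, out):
        start = time.time()
        subprocess.run(
            ["ffmpeg", "-v", "error", "-y", "-i", "source.mp4",
             "-c:v", "libx264", "-preset", preset, "-crf", "23", "-an", out],
            check=True,
        )
        return time.time() - start, os.path.getsize(out)

    for preset in ("ultrafast", "veryslow"):
        seconds, size = encode(preset, f"out_{preset}.mp4")
        print(f"{preset:>9}: {seconds:6.1f} s, {size / 1e6:.1f} MB")

Typically the veryslow run takes many times longer and produces a noticeably smaller file at roughly the same visual quality – that’s the encoder spending more effort solving the puzzle.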