ELI5: Why does a GPU heat up and need a fan to cool down when processing 8K video on a PC, while a TV has no fan at all and still handles 8K/4K video?


In short, why doesn't the TV need a fan to cool down, while the PC requires one, when dealing with high-definition video?


44 Answers

Anonymous 0 Comments

Different amounts of computation go on, and as a result your TV creates less heat.

It’s like comparing the time and effort it takes to do a math test by solving the questions (think GPU) versus looking up the answers and copying them down (think TV).

Anonymous 0 Comments

The computer gets hot while creating the graphics. Once they're created, it's easy to show them on a TV or monitor.

Anonymous 0 Comments

Imagine a video is a series of pictures, or frames.

A GPU is hand drawing (rendering) each individual picture in order, really fast and usually using math to do so.

Then all the hand drawn pages are stapled together into a book. This is actually creating the video.

A TV is taking that picture book and flipping the pages fast to create motion. This is playing back the video.

A TV only has to flip the pages presented to it; the GPU has to hand-draw each one.

The GPU has to do a lot more work and thus consumes more power.
A GPU is also tiny compared to a TV. The actual brain doing the work, the die, is super tiny, roughly an inch by an inch.
That tiny little thing can easily have 100-300+ watts of power coursing through it.
That means 100-300+ watts of heat inside a 1”x1” area.
You need a heat sink and fans to cool it or else it’s gonna destroy itself.

Meanwhile TVs can be small or huge. Like a 24” monitor or hell even a 100” TV.
Most of the power a TV consumes is on the panel, the actual display where the picture is.
A 24” monitor or 100” TV has a lot more surface area to dissipate heat.
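
To put rough numbers on that (the figures below are illustrative assumptions, not measurements), here's a quick back-of-the-envelope comparison of watts per unit of surface area:

```python
# Back-of-the-envelope power density (illustrative numbers, not measurements).
gpu_watts = 250              # assumed high-end GPU under full load
gpu_die_cm2 = 2.54 * 2.54    # a die roughly 1 inch x 1 inch

tv_watts = 150               # assumed large TV, mostly the panel/backlight
tv_panel_cm2 = 110 * 62      # front surface of roughly a 50-inch panel

print(f"GPU die:  {gpu_watts / gpu_die_cm2:.1f} W/cm^2")   # ~38.8 W/cm^2
print(f"TV panel: {tv_watts / tv_panel_cm2:.3f} W/cm^2")   # ~0.022 W/cm^2
```

Hundreds of watts squeezed into a few square centimetres needs a heat sink and fans; a fraction of a watt per square centimetre can just radiate off the panel.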

Anonymous 0 Comments

TVs generally just play incoming signals as a display, and when they do frame interpolation it happens at the hardware level, not in software. A GPU rendering video is outright assembling assets into a brand-new file that needs to be written to a drive. The TV does not do that, even when interpolating frames.

Anonymous 0 Comments

Your TV is doing the exact same job as your monitor. Your monitor doesn't overheat and need fans when displaying Ultra HD video, does it?

Anonymous 0 Comments

You're confusing rendering with playback/streaming.

Rendering uses intensive graphical computation to create digital products such as finished/edited videos, photos, etc. Another example is gaming. Rendering typically requires a lot of power and a beefy GPU chip, scaling with detail and depth (more complex math = more power drawn and more heat output).

Playback/streaming takes an already-rendered product and displays it for consumption. This still takes a graphics processing unit to manage smart-TV features and some proprietary color/pixel management in your TV, but it is far less demanding.

To actually answer your question:

TVs actually do generate a good amount of heat in the form of the light (nits) they output. Some are better than others, but you typically don't need fans on a TV because it sits in a larger room (more passive thermal capacity, and your AC can cool the room if it gets too hot) and has far more surface area than a computer's GPU.
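
If it helps to see the shape of that difference, here's a toy Python sketch (a made-up gradient "shader" standing in for rendering; nothing like a real GPU pipeline or TV, just an illustration of where the work happens):

```python
import time
import numpy as np

height, width, frame_count = 1080, 1920, 30

# "Rendering": compute every pixel of every frame from scratch (a toy gradient shader).
start = time.perf_counter()
ys, xs = np.mgrid[0:height, 0:width]
frames = [(xs * f + ys) % 256 for f in range(frame_count)]
print(f"render:   {time.perf_counter() - start:.3f} s")

# "Playback": the frames already exist; a player just hands each finished buffer to the display.
start = time.perf_counter()
for frame in frames:
    _ = frame  # stand-in for copying the frame to the screen
print(f"playback: {time.perf_counter() - start:.3f} s")
```

The playback loop does almost nothing, because all the expensive work was already done when the frames were created.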

Anonymous 0 Comments

A TV just shows video data that was already created and is simply being passed to it. Video cards look at code and CREATE the video data in real time. It's that creation that is so computationally complicated. It requires a LOT of processing power, which generates a lot of heat.

Anonymous 0 Comments

For a slightly easier explanation: think of a landscape that someone wants to put on a screen. A TV/cable box/Blu-ray player is like a photographer. The image is created elsewhere (in nature), and they just have to feed it to the display. The GPU is more like a painter: all the info is provided, but the painter has to process what it sees and turn it into an image to hand to the display.

Anonymous 0 Comments

Also, the TV has a huge area behind the panel where it can put large aluminium radiators to cool its chips down.

Anonymous 0 Comments

Common video codecs (for broadcast and streaming) are highly asymmetrical: it takes way more effort to create the stream (or file) than it does to play it. That's why a TV can play content without getting too hot, while generating the content produces far more heat.
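
A rough way to feel that asymmetry (zlib is a general-purpose compressor, not a video codec, but the encode/decode imbalance has the same shape):

```python
import os
import time
import zlib

# Stand-in for raw footage: a few megabytes of mildly repetitive bytes.
data = os.urandom(1024) * 4096  # ~4 MB

start = time.perf_counter()
compressed = zlib.compress(data, level=9)  # "encoding": search hard for a compact representation
encode_s = time.perf_counter() - start

start = time.perf_counter()
zlib.decompress(compressed)                # "decoding": just follow the stored instructions
decode_s = time.perf_counter() - start

print(f"encode: {encode_s:.3f} s   decode: {decode_s:.3f} s")
```

On most machines the encode side comes out noticeably slower than the decode side, which is the whole point: your TV only ever has to do the cheap half.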