ELI5: Why does a GPU heat up and need a fan to cool down when processing 8K video on a PC, while a TV has no fan at all and still handles 8K/4K video?


In short, why doesn’t the TV need a fan to cool down while the PC requires one when dealing with high-definition video?


44 Answers

Anonymous 0 Comments

Assuming by “processing” you mean playing a video, your GPU is probably faulty or the manufacturer skimped and didn’t include a fan-stop mode, so the fan keeps spinning even during very lightweight work like video playback. Both the PC GPU and the TV have special hardware that’s super efficient at playing videos.

the ELI5 is:

Both the TV and the GPU have a person, let’s call them “Vidmaster,” who is very smart, capable, and efficient at playing videos. Vidmaster can do the work required to play videos without breaking a sweat, so they can work comfortably even without the air conditioner turned on.

The problem is that some offices (in your case, the PC) can’t turn off the air conditioning. So while Vidmaster is working, they’re stuck with the air conditioner running.
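If you want to see this on your own machine, here’s a rough sketch (assuming an NVIDIA card with `nvidia-smi` on the PATH; other vendors have similar tools): play a video and poll the GPU. The dedicated decoder block does the work while the general-purpose part of the GPU, and with fan-stop the fan, stay near idle.

```python
import subprocess

# Rough sketch, assuming an NVIDIA GPU with nvidia-smi available.
# While a video is playing, the dedicated decoder block (utilization.decoder)
# is busy, but the general-purpose part of the GPU and the fan stay near idle.
fields = "utilization.gpu,utilization.decoder,fan.speed,temperature.gpu"
out = subprocess.run(
    ["nvidia-smi", f"--query-gpu={fields}", "--format=csv,noheader"],
    capture_output=True, text=True, check=True,
).stdout.strip()
print(out)  # e.g. "2 %, 18 %, 0 %, 41" -- decoder working, shaders and fan idle
```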

Anonymous 0 Comments

Video decoding can be done one of two ways: in hardware (specialized circuits) or in software (raw computing power).

A lot of less powerful devices (compute-wise: TVs, cellphones, Roku and Amazon streaming boxes, probably cable boxes) use hardware decoders, i.e. chips (or portions of a processor) made specifically to handle certain types of video (or more accurately, their “codecs”).

Generally, hardware decoders are going to be for older codecs. Newer codecs are going to need to be run in software.
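As a quick way to feel the difference (just a sketch, assuming ffmpeg is installed and `input.mp4` stands in for some video file): decode the same file once in pure software and once through a hardware decode API, and watch how much lighter the second run is on the CPU.

```python
import subprocess

# Sketch only, assuming ffmpeg is installed; "input.mp4" is a placeholder file.
# Decode the video as fast as possible and discard the frames (-f null), once
# in software and once through a hardware decode API (VAAPI here; on NVIDIA
# you might pass "cuda" instead). The hardware run barely touches the CPU.
software = ["ffmpeg", "-i", "input.mp4", "-f", "null", "-"]
hardware = ["ffmpeg", "-hwaccel", "vaapi", "-i", "input.mp4", "-f", "null", "-"]

for cmd in (software, hardware):
    subprocess.run(cmd, check=True)
```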

It’s sort of like emulating old console video games. The old consoles were hardware-based engines: specific circuits designed for specific data processing that do little or nothing else.

It can take a lot of resources to emulate old hardware in software because CPUs and GPUs are made to be flexible; they can’t be as efficient as dedicated hardware unless they also have access to those dedicated circuits.

Some computer components used to have some of these specialized circuits for older and very mainstream formats/codecs, but I don’t know if it is still the case on the very latest cards.

Apparently, something serving that dedicated purpose *is* on current gen cards from AMD.

_____

From here on is more detail if anyone is interested in specifics or examples; sorry for the formatting, I can’t be assed:

https://www.amd.com/en/products/graphics/amd-radeon-rx-7800-xt

> Supported Rendering Format
> HDMI™ 4K Support: Yes
> 4K H264 Decode: Yes
> 4K H264 Encode: Yes
> H265/HEVC Decode: Yes
> H265/HEVC Encode: Yes
> AV1 Decode: Yes
> AV1 Encode: Yes

Last gen’s 6500

> Supported Rendering Format
> HDMI™ 4K Support: Yes
> 4K H264 Decode: Yes
> 4K H264 Encode: No
> H265/HEVC Decode: Yes
> H265/HEVC Encode: No
> AV1 Decode: No

I don’t know if that’s because it is a lower-end card, or just how long it takes different codecs to become mainstream. H264 has been around a very long time.
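If you’re curious which of these hardware paths your own machine exposes, here’s a small sketch (assuming ffmpeg is installed; the decoder names you’ll actually see, like `h264_cuvid` or `hevc_vaapi`, depend on your GPU vendor and how ffmpeg was built):

```python
import subprocess

# Sketch, assuming ffmpeg is installed. Lists the decoders this ffmpeg build
# knows about and keeps only the hardware-backed ones. The suffixes below are
# common vendor APIs (NVIDIA cuvid, generic VAAPI, Intel QSV, Apple
# VideoToolbox); what you actually get depends on your GPU and build.
out = subprocess.run(
    ["ffmpeg", "-hide_banner", "-decoders"],
    capture_output=True, text=True, check=True,
).stdout

for line in out.splitlines():
    if any(api in line for api in ("cuvid", "vaapi", "qsv", "videotoolbox")):
        print(line.strip())
```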

https://en.wikipedia.org/wiki/Advanced_Video_Coding
>Advanced Video Coding (AVC), also referred to as H.264 or MPEG-4 Part 10, is a video compression standard based on block-oriented, motion-compensated coding.[2] It is by far the most commonly used format for the recording, compression, and distribution of video content, used by 91% of video industry developers as of September 2019.[3][4] It supports a maximum resolution of 8K UHD.[5][6]

>Version 1 (Edition 1): (May 30, 2003) First approved version of H.264/AVC containing Baseline, Main, and Extended profiles.

Anonymous 0 Comments

A lot of the answers are just plain wrong. Here’s the simplest answer: GPUs need to be able to play 1,000 different games. TVs only need to be able to play 1 game.

Let’s expand on that…

* GPUs need to be able to play 1,000 different *3D* games.
* TVs only need to be able to play a single *2D* game.

Just having to do 3D math instead of 2D math increases the processing requirements dramatically.

* GPUs need to be able to play 1,000 3D games at any frame rate.
* TVs need to be able to play 1 2D game at a fixed 60 frames per second.

Life is so easy for a TV…

* GPUs need to play 1,000 3D games at any frame rate, and at any resolution.
* TVs need to play 1 2D game at 60 frames per second, and at a single fixed resolution.

A TV really isn’t going to break a sweat with the work it has to do.

* GPUs need to be able to play 1,000 3D games at any frame rate, at any resolution, and they need to continuously switch gears based on user input.
* TVs need to play 1 2D game at 60 frames per second and at a single resolution, and they only have 1 or maybe 2 gears, which rarely change, if at all.

So everything a TV needs to do is fixed, limited, and well known beforehand. When you have a fixed input and a fixed output, it is very easy to optimize the work and bake it into the hardware. That is what is called hardware acceleration: the video and audio decoders are literally etched into the TV’s processor chip. GPUs, on the other hand, have no fixed input or output, so they have to calculate a billion different things every second and need to be as beefy as possible.
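To make the “fixed job is easy to bake in” point concrete, here’s a purely conceptual Python sketch (nothing to do with real decoder code): the TV-style path can set everything up once because its parameters never change, while the GPU-style path has to redo that setup for whatever it’s handed.

```python
# Purely conceptual sketch -- not real decoder or renderer code.

# TV-style fixed pipeline: one resolution, one frame rate, known up front,
# so the frame buffer can be allocated once and reused forever.
FIXED_WIDTH, FIXED_HEIGHT, FIXED_FPS = 3840, 2160, 60
frame_buffer = bytearray(FIXED_WIDTH * FIXED_HEIGHT * 3)

def tv_show_frame(decoded_bytes: bytes) -> None:
    # The only job: drop the already-decoded frame into the one fixed buffer.
    frame_buffer[: len(decoded_bytes)] = decoded_bytes

# GPU-style flexible pipeline: resolution and workload can change every frame,
# so buffers are sized on demand and arbitrary work runs each time.
def gpu_render_frame(width: int, height: int, workload) -> bytes:
    frame = bytearray(width * height * 3)   # sized per request, not baked in
    workload(frame, width, height)          # whatever the game/shader asks for
    return bytes(frame)

tv_show_frame(bytes(FIXED_WIDTH * FIXED_HEIGHT * 3))   # trivial "video" frame
gpu_render_frame(1280, 720, lambda frame, w, h: None)  # trivial "game" frame
```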

Anonymous 0 Comments

Your GPU is doing hard math problems.
Your TV just reads the answers provided by the aforementioned GPU.