I’m reading the responses and most of them are misleading, if I understood your question correctly. When you refer to your TV, you mean a smart TV playing 8K/4K content from, say, a pen drive, right? If that’s the case, you’re right: 4K/8K video is compressed. Internal SoCs (System on Chip: basically the CPU plus lots of components integrated into it) usually have dedicated engines (circuits designed specifically to process one particular codec). Doing the work in dedicated hardware is more energy efficient and thus generates less heat. GPUs, on the other hand, do support certain codecs, but they’re oriented toward playing games and the like, not video decoding. Formats that aren’t compatible with the decode engines built into the GPU or CPU have to be processed in software, which uses more energy and therefore generates more heat.
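If you’re curious which of those dedicated engines your own machine exposes, here’s a minimal sketch, assuming ffmpeg is installed and on your PATH (it just lists the hardware-acceleration methods your ffmpeg build can use; nothing here is specific to any one vendor):

```python
# Minimal sketch: list the hardware-acceleration methods this machine's
# ffmpeg build can use (e.g. cuda, vaapi, videotoolbox).
# Assumes ffmpeg is installed and on PATH.
import subprocess

result = subprocess.run(
    ["ffmpeg", "-hide_banner", "-hwaccels"],
    capture_output=True, text=True, check=True,
)
print(result.stdout)
# If none of these methods can handle your video's codec (say HEVC or AV1),
# playback falls back to software decoding on the CPU: more energy, more heat.
```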
I don’t think “painting vs photographing a painting” is a terrible analogy.
A PC just PLAYING a 4K/8K video (a media file) would produce similar levels of heat to a TV, and with some work (because it’s different tech and the PC has all sorts of other things to worry about) it could probably be happily passively cooled.
When the PC is “processing” 4K/8K (as in, in a game), it’s not just showing a picture that moves a lot; it’s doing all the work of drawing that picture in the first place. Generally 60 to 200 times a second.
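To put numbers on that, here’s a toy sketch (plain arithmetic, assuming nothing beyond the 60 to 200 figure above) of the time budget the GPU gets to draw each frame from scratch:

```python
# Toy illustration of the frame-time budget: at N frames per second,
# the GPU must finish ALL rendering work for one frame within 1/N seconds.
for fps in (60, 120, 200):
    budget_ms = 1000 / fps
    print(f"{fps:>3} fps -> {budget_ms:.1f} ms to draw each frame from scratch")
```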
I assume “processing” means playing the video, not using something like HandBrake to encode it before playing it.
If your GPU heats up when playing an 8K video, one of these is usually the cause:

1. You have hardware acceleration enabled in your video player together with other real-time post-processing effects (think madVR), and those effects run on the GPU’s general-purpose cores.

2. You have an older GPU whose onboard decode ASIC can accelerate 8K video, but inefficiently or with only partial support, so it heats up tremendously when used for 8K.

3. The GPU has to transcode the file in real time into a format its decode ASIC can actually handle (which again often comes back to an older ASIC with inefficient or limited 8K support).

TVs don’t have this capability; they play whatever signal is sent to them as is.
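One rough way to check whether hardware decoding is actually kicking in is to time a decode with and without it. A minimal sketch, again assuming ffmpeg is on your PATH; "sample_8k.mp4" is a hypothetical placeholder for your own clip:

```python
# Minimal sketch: decode the same file twice, once in software and once
# letting ffmpeg pick a hardware engine, and compare wall-clock time.
# Assumes ffmpeg on PATH; SAMPLE is a hypothetical placeholder filename.
import subprocess, time

SAMPLE = "sample_8k.mp4"  # placeholder: substitute your own clip

def decode(extra_args):
    start = time.perf_counter()
    subprocess.run(
        ["ffmpeg", "-hide_banner", "-loglevel", "error",
         *extra_args, "-i", SAMPLE, "-f", "null", "-"],
        check=True,
    )
    return time.perf_counter() - start

sw = decode([])                    # default: software decode on the CPU
hw = decode(["-hwaccel", "auto"])  # hardware engine, if one exists
print(f"software: {sw:.1f}s  hardware: {hw:.1f}s")
```

If the two times are close, your GPU’s decode engine probably isn’t being used, or isn’t helping much.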
TL;DR: your GPU heats up playing 8K videos because you are using it to play the video, and something is telling it to do computations on the video file you’re watching. TVs cannot do this.
ELI5: If you ask an artist to draw a picture for you, it’s going to take a lot of effort and time. If you ask a photographer to show you the picture their camera just took (assuming digital camera), it’s going to be easy and only take as long as they take to pull the image up.
TVs only show you pictures that were already drawn; GPUs draw those pictures from scratch before showing them to you. To do this fast (because you want a moving video, not a slideshow), they need a LOT of power. TVs can show you images pretty fast without using much power at all.