Imagine a video is a series of pictures, or frames.
A GPU is hand drawing (rendering) each individual picture in order, really fast, using a ton of math to do it.
Then all the hand drawn pages are stapled together into a book. This is actually creating the video.
A TV is taking that picture book and flipping the pages fast to create motion. This is playing back the video.
A TV only has to flip the pages presented to it; the GPU has to hand draw each one.
The GPU has to do a lot more work and thus consumes more power.
A GPU is also tiny compared to a TV. The actual brain doing the work, the die, is super tiny. Like an inch by an inch.
That tiny little thing can easily have 100-300+ watts of power coursing through it.
That means 100-300+ watts of heat inside a 1”x1” area.
You need a heat sink and fans to cool it or else it’s gonna destroy itself.
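To put rough numbers on that (these are ballpark assumptions, not the specs of any particular card):

```python
# Rough back-of-the-envelope heat flux for a GPU die.
# Assumptions: ~300 W flowing through a ~1 in x 1 in die under load.
die_side_cm = 2.54                 # 1 inch in centimeters
die_area_cm2 = die_side_cm ** 2    # about 6.5 cm^2
gpu_power_w = 300                  # assumed high-end load

gpu_flux = gpu_power_w / die_area_cm2
print(f"GPU die: ~{gpu_flux:.0f} W/cm^2")   # roughly 45-50 W/cm^2
```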
Meanwhile, TVs and monitors can be small or huge. Like a 24” monitor or, hell, even a 100” TV.
Most of the power a TV consumes goes into the panel, the actual display where the picture is.
A 24” monitor or 100” TV has a lot more surface area to dissipate heat.
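Running the same rough math for a 24” monitor (the ~30 W figure and the "all the power turns into panel heat" simplification are just assumptions to show the scale of the difference):

```python
# Rough heat flux for a 24" 16:9 panel, assuming ~30 W of draw.
import math

diag_in, aspect_w, aspect_h = 24, 16, 9
width_in = diag_in * aspect_w / math.hypot(aspect_w, aspect_h)
height_in = diag_in * aspect_h / math.hypot(aspect_w, aspect_h)
panel_area_cm2 = (width_in * 2.54) * (height_in * 2.54)   # ~1600 cm^2
tv_power_w = 30                                           # assumed draw

tv_flux = tv_power_w / panel_area_cm2
print(f'24" panel: ~{tv_flux:.2f} W/cm^2')   # roughly 0.02 W/cm^2
```

That works out to hundreds to thousands of times less heat per square centimeter than the GPU die, which is why the panel can shed its heat passively while the GPU needs a heat sink and fans.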