Hi 🙂
The TV doesn’t receive heavily compressed (lossy) video over the cable (i.e. not MP4/H.264 like you’re used to when watching internet videos), so the video signal needs much more bandwidth.
If you attach a Blu-ray player, media stick, or camera, you don’t want to add (more) compression artifacts.
Plus there’s always some overhead, and the standards plan ahead for higher resolutions.
To add to what u/schorhr said, the actual data rate for a 60fps 1080p video stream is somewhere around 4 or 5 gigabits per second through an HDMI cable. 60fps 4k is four times that.
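If you want to see roughly where those numbers come from, here’s a back-of-the-envelope sketch. It assumes standard CEA-861 pixel clocks (which include blanking intervals), 8-bit color, and HDMI’s 8b/10b TMDS encoding; the exact figures shift with bit depth and timing, so treat it as an estimate, not a spec quote.

```python
# Rough HDMI TMDS link-rate estimate (assumes 8-bit color, 8b/10b TMDS encoding).
# Pixel clocks are the standard CEA-861 values, which already include blanking.
def tmds_link_rate_gbps(pixel_clock_mhz, bits_per_channel=8):
    channels = 3                  # R, G, B on three TMDS lanes
    encoding_overhead = 10 / 8    # TMDS 8b/10b encoding
    bits_per_pixel = channels * bits_per_channel * encoding_overhead
    return pixel_clock_mhz * 1e6 * bits_per_pixel / 1e9

print(f"1080p60: {tmds_link_rate_gbps(148.5):.2f} Gbit/s")  # ~4.46 Gbit/s
print(f"4K60:    {tmds_link_rate_gbps(594.0):.2f} Gbit/s")  # ~17.82 Gbit/s
```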
If you’re watching a DVD/Blu-ray or streaming video, the video source handles decompression and each frame is sent over the cable as an uncompressed image. If you’re playing a video game, the video card in your computer or console sends each frame uncompressed pretty much as soon as it’s done rendering it.
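To put that in scale, here’s a quick comparison of one uncompressed 1080p frame against the per-frame budget of a compressed stream (the 5 Mbit/s streaming bitrate here is just an illustrative assumption, not a quote from any service):

```python
# One uncompressed 1080p frame at 8 bits per color channel
width, height, bytes_per_pixel = 1920, 1080, 3
frame_bytes = width * height * bytes_per_pixel
print(f"Uncompressed frame: {frame_bytes / 1e6:.1f} MB")             # ~6.2 MB

# Per-frame budget of an assumed 5 Mbit/s stream at 60 fps
stream_bits_per_second = 5_000_000
budget_bytes = stream_bits_per_second / 8 / 60
print(f"Compressed budget per frame: {budget_bytes / 1e3:.0f} KB")   # ~10 KB
```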
Because the data stream you get from YouTube or Netflix is compressed and needs to be processed to create an image. A 1080p TV has 2,073,600 pixels it must draw 60 times a second. Each of those pixels has 3 color channels that take 10 bits each to encode. 2,073,600 pixels × 30 bits (pixel color value) × 60 frames per second = 3,732,480,000 bits/second. Divide that by 1,073,741,824 (2^30, the binary definition of a gigabit) and you get about 3.47 gigabits/second just for the raw pixel data of 1080p video at 60 Hz.
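If it helps, here’s the same arithmetic in a few lines of Python, using the assumptions above (10 bits per color channel, 60 fps, and the binary 2^30 definition of a gigabit):

```python
pixels = 1920 * 1080        # 2,073,600 pixels on a 1080p panel
bits_per_pixel = 3 * 10     # 3 color channels at 10 bits each
fps = 60

bits_per_second = pixels * bits_per_pixel * fps
print(bits_per_second)              # 3,732,480,000 bits/s
print(bits_per_second / 2**30)      # ~3.47 "gigabits"/s (binary definition)
```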