Why can’t HDMI 2.0 support 4:4:4 RGB HDR?


If HDMI 2.0 supports 18 Gbps, why is it said to “lack” the bandwidth necessary to stream 4:4:4 RGB HDR video? I know Blu-ray is about 80 Mbps, which is nowhere near 18 Gbps.

In: Technology

4 Answers

Anonymous 0 Comments

4K video at 60 fps with a color depth greater than 8 bits per channel needs roughly 20 gigabits per second on the wire to carry an uncompressed video stream once signalling overhead is counted. That exceeds the bandwidth specified by HDMI 2.0.
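
A quick sanity check of that figure (a rough sketch in Python; it counts pixel data and coding overhead only and ignores blanking intervals, which push the real on-wire rate even higher):

```python
# Back-of-the-envelope: raw pixel data rate for 4K60 at 10 bits per channel,
# then the on-wire rate after HDMI's 8b/10b coding (10 bits sent per 8 bits of data).
width, height, fps, channels, bits = 3840, 2160, 60, 3, 10

pixel_data_gbps = width * height * channels * bits * fps / 1e9
wire_gbps = pixel_data_gbps * 10 / 8  # coding overhead, blanking intervals not included

print(f"pixel data:  {pixel_data_gbps:.1f} Gbit/s")   # ~14.9 Gbit/s
print(f"on the wire: {wire_gbps:.1f} Gbit/s")         # ~18.7 Gbit/s, already past 18 Gbit/s
```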

Anonymous 0 Comments

Why would Blu-ray be 80 Mbps? I think you might be confusing the compressed bit rate (i.e. the speed the compressed data comes off the disc) with the uncompressed one (i.e. the speed at which video data has to be fed to the TV to display without dropping frames).

Even 1080p RGB at 24 fps progressive is 1.19 Gbps uncompressed.

[https://toolstud.io/video/bitrate.php](https://toolstud.io/video/bitrate.php)
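
To see how far apart the two numbers are, here is a rough side-by-side of that uncompressed 1080p figure and a nominal 80 Mbit/s Blu-ray stream (a sketch; actual disc bit rates vary):

```python
# Uncompressed 1080p24 RGB (8 bits per channel) versus a nominal 80 Mbit/s Blu-ray stream.
width, height, fps, channels, bits = 1920, 1080, 24, 3, 8

uncompressed_gbps = width * height * channels * bits * fps / 1e9
bluray_gbps = 80e6 / 1e9

print(f"uncompressed: {uncompressed_gbps:.2f} Gbit/s")   # ~1.19 Gbit/s
print(f"Blu-ray:      {bluray_gbps:.2f} Gbit/s")         # 0.08 Gbit/s
print(f"compression factor: ~{uncompressed_gbps / bluray_gbps:.0f}x")  # ~15x
```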

Anonymous 0 Comments

The Blu-ray is compressed.

There are a variety of ways to compress video, but a simple explanation is that you have a starting picture, and each frame after that can be generated by knowing only the *changes* from the prior frame.

With advanced compression techniques, even if every individual part of the picture changes in some way, you might still be able to record this in a simple way, like “the whole screen gets darker” or “the whole scene moves to the right” (because the camera moved), and the next frame can be generated without needing detail about all of those individual pixels…
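
As a toy illustration of that idea (nothing like the real codecs used on Blu-ray, which are far more sophisticated), imagine storing the first frame in full and then only the pixels that change:

```python
# Toy "store only the changes" compression: keep the first frame in full,
# then for each later frame record only (position, new value) pairs.
def encode(frames):
    key_frame = frames[0]
    deltas = []
    for prev, curr in zip(frames, frames[1:]):
        deltas.append([(i, v) for i, (p, v) in enumerate(zip(prev, curr)) if p != v])
    return key_frame, deltas

def decode(key_frame, deltas):
    frames = [list(key_frame)]
    for delta in deltas:
        frame = list(frames[-1])
        for i, v in delta:
            frame[i] = v
        frames.append(frame)
    return frames

# Three tiny 8-"pixel" frames where only one value changes each time.
frames = [
    [0, 0, 0, 0, 5, 5, 5, 5],
    [0, 0, 0, 0, 5, 5, 9, 5],   # one pixel changed
    [1, 0, 0, 0, 5, 5, 9, 5],   # one more pixel changed
]
key, deltas = encode(frames)
assert decode(key, deltas) == frames    # reconstructs the original frames exactly
print(deltas)                           # [[(6, 9)], [(0, 1)]] -- far less data than full frames
```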

But reading this limited information and translating it into a picture requires a lot of processing power or a processor specialized for this sort of work. A “smart TV” could have this built in, but the HDMI protocol assumes all the decoding is happening in the computer or Blu-ray player. HDMI is configured to send uncompressed video.

Uncompressed video requires 8+ bits of color information (more for HDR) times 3 colors, for every one of *millions* of pixels on the screen (just over 2 million for 1080p, 8+ million for 4K), for every frame each second (often 60)… that’s a ton of data. Much more than the compressed Blu-ray contains!

Anonymous 0 Comments

The 80 Mbps figure for Blu-ray is a compressed data stream, not the uncompressed raw video you should be comparing it to.

If you take 4K at 60 fps with 8-bit RGB, the bit-rate calculation is trivial.

4K is 3840 × 2160 pixels, each with 3 colors at 8 bits per color, and all of that 60 times per second. So you have 3840 × 2160 × 3 × 8 × 60 = 11,943,936,000, or about 11.9 Gbit/s.

HDR is 10 or 12 bits per color, so you get 14.9 Gbit/s at 10 bits and 17.9 Gbit/s at 12 bits.

HDMI 2.0 has an 18.0 Gbit/s link speed, but the video data uses 8b/10b encoding so it can be transmitted reliably even when a large part of the image is pure black (all 0s) or pure white (all 1s); the data has to be encoded in some way so that the bits on the wire keep changing. So the usable video bandwidth of HDMI 2.0 is only 18 × 8/10 = 14.4 Gbit/s.

So the number you should compare against is 14.4 Gbit/s, not 18 Gbit/s, and even 10-bit needs more bandwidth than is available.

On top of that, these numbers do not include any audio data, which also travels over the same link from game consoles and other sources.
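
Putting those numbers together (a rough sketch counting pixel data only, with no blanking intervals or audio):

```python
# Compare uncompressed 4K60 RGB pixel-data rates against HDMI 2.0's usable bandwidth
# after the 10-bits-sent-per-8-bits-of-data coding overhead.
width, height, fps, channels = 3840, 2160, 60, 3
link_gbps = 18.0
usable_gbps = link_gbps * 8 / 10   # 14.4 Gbit/s of actual video data

for bits in (8, 10, 12):
    need_gbps = width * height * channels * bits * fps / 1e9
    fits = "fits" if need_gbps <= usable_gbps else "does not fit"
    print(f"{bits}-bit: {need_gbps:.1f} Gbit/s -> {fits} in {usable_gbps:.1f} Gbit/s")

# 8-bit:  11.9 Gbit/s -> fits
# 10-bit: 14.9 Gbit/s -> does not fit
# 12-bit: 17.9 Gbit/s -> does not fit
```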