Why does HDMI need such high bandwidth compared to Ethernet?


When we stream 4K video from Netflix, for example, they recommend a minimum speed of 25 Mbps. But HDMI 2.0 has a bandwidth of 18.0 Gbit/s.

How can we receive 4K video with a connection speed of only 25 Mbps when we need 18.0 Gbit/s to send the video to our TV?

I think it has to do with compressed vs. raw 4K, but do we really compress it to such an extreme? I would have thought this would result in so much loss that we are nowhere near 4K in the end.

In: Technology

7 Answers

Anonymous 0 Comments

>How can we receive 4K video with a connection speed of only 25 Mbps when we need 18.0 Gbit/s to send the video to our TV?

* Compression.
* You are downloading a map and a set of instructions that tell your device how to recreate each frame of the video.
* Once that frame is recreated, the data for over 8 million pixels needs to get from your device to the display.
* That's on top of up to 8 channels of audio, plus other data; the HDMI spec even includes room for a 100 Mbps Ethernet link embedded inside the HDMI connection (see the rough tally sketched below).
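A rough back-of-the-envelope tally of what an uncompressed HDMI 2.0 signal has to carry each second. This is only a sketch; the real figures depend on the source, so it assumes 4K at 60 Hz with 8-bit color and 8 channels of 24-bit/192 kHz audio:

```python
# Rough tally of uncompressed data an HDMI 2.0 link carries per second.
# Assumes 4K @ 60 Hz, 8-bit color, 8-channel 24-bit/192 kHz audio.

width, height = 3840, 2160
bits_per_pixel = 3 * 8              # RGB, 8 bits per color channel
fps = 60

video_bps = width * height * bits_per_pixel * fps
audio_bps = 8 * 24 * 192_000        # 8 channels, 24-bit samples, 192 kHz
ethernet_bps = 100_000_000          # HDMI Ethernet Channel

total_bps = video_bps + audio_bps + ethernet_bps
print(f"video:    {video_bps / 1e9:.2f} Gbit/s")    # ~11.94 Gbit/s
print(f"audio:    {audio_bps / 1e6:.1f} Mbit/s")    # ~36.9 Mbit/s
print(f"ethernet: {ethernet_bps / 1e6:.0f} Mbit/s")
print(f"total:    {total_bps / 1e9:.2f} Gbit/s")    # still under the 18 Gbit/s cap
```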

Anonymous 0 Comments

Compression. The information sent over a video cable is essentially uncompressed: a full, uncompressed image is transmitted for every single frame.

Anonymous 0 Comments

You stream compressed video over the internet. But then what happens? Your computer decompresses that video and sends whole 4K frames to the screen. As smart as your screen is, it's meant to be a relatively dumb output device, so it can operate independently of whatever video format you're sourcing or how much computing power is required to reconstitute the original frame data.

Anonymous 0 Comments

Netflix, YouTube, etc. all compress the hell out of their videos. Video compression is lossy, so some of the detail in the image goes away when it's compressed; the critical thing for the compression algorithms is to ensure that what goes missing is stuff you won't notice. Nonetheless, if you ran the uncompressed 4K video alongside the streamed version, you would almost certainly be able to tell the difference.

As for the bandwidth requirements, 4K video is 3840×2160 pixels. You need 3 bytes to store each pixel, so each uncompressed video frame is nearly 25 MB of data on its own. If you're sending that video over HDMI at 60 frames per second, you need a minimum bandwidth of nearly 12 Gbps. HDMI 2.0 gives you a fair bit of headroom there, but certainly not so much that you could call 18 Gbps overkill.
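A quick sanity check of those numbers, as a minimal sketch assuming 3840×2160 resolution, 3 bytes per pixel, and 60 frames per second:

```python
# Uncompressed 4K frame size and the bandwidth needed to push 60 of them per second.
width, height = 3840, 2160
bytes_per_pixel = 3                                         # 8-bit R, G and B
frame_bytes = width * height * bytes_per_pixel
print(f"one frame: {frame_bytes / 1e6:.1f} MB")             # ~24.9 MB
print(f"at 60 fps: {frame_bytes * 8 * 60 / 1e9:.2f} Gbps")  # ~11.94 Gbps
```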

Anonymous 0 Comments

The speed for HDMI 2.0 is the maximum throughput, not the minimum throughput. Anything less than 18.0 Gbps can be transmitted over an HDMI 2.0 connection just fine. 25 Mbps is much less than 18 Gbps (18 Gbps is 18,000 Mbps).

Anonymous 0 Comments

4K video is 3840 × 2160 pixels with 3 colors and 8 bits per color.

So you need 3840 × 2160 × 3 × 8 = 199,065,600 bits for a single frame. At 60 frames per second that is 11,943,936,000 bits per second, or roughly 12 Gbit/s.

The first 4K video at 60 Hz on YouTube I found was https://www.youtube.com/watch?v=LXb3EKWsInQ; it is 961.43 MB for a 5:13 video.

The amount of data per second is 961.43 × 1024² × 8 / (5 × 60 + 13) ≈ 25,767,000 bits per second. This includes audio, so the video alone is slightly smaller.

That is a compression ratio of about 463:1, even though audio is included in one figure but not the other.

So the answer is compression, which reduces the data rate by a factor of a few hundred.
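The same arithmetic written out as a small script, a sketch that simply reuses the figures quoted above (961.43 MB for a 5 minute 13 second clip):

```python
# Uncompressed 4K@60 bandwidth vs. the quoted YouTube download size.
raw_bps = 3840 * 2160 * 3 * 8 * 60          # ~11.94 Gbit/s uncompressed
file_bits = 961.43 * 1024**2 * 8            # 961.43 MB clip, in bits
duration_s = 5 * 60 + 13                    # 5:13 running time
stream_bps = file_bits / duration_s         # ~25.8 Mbit/s (includes audio)

print(f"uncompressed: {raw_bps / 1e9:.2f} Gbit/s")
print(f"streamed:     {stream_bps / 1e6:.1f} Mbit/s")
print(f"compression ratio: {raw_bps / stream_bps:.0f}:1")   # ~463:1
```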

Anonymous 0 Comments

Thanks for the replies, that makes sense! I just find it astonishing that the uncompressed data is so much larger.