If all HDMI cables are basically the same design, pinout, etc., how have they been able to double, quadruple, etc. the bandwidth on them over time?


Going from HDMI 1.4 to 2.1 there is a 5x increase in bandwidth. Is it because the cables themselves were never the issue, and it was the connectors/chips in the devices themselves that couldn’t handle it?

I know part of it is the actual quality of the cables themselves (tighter tolerances, more twists in the wires, material purity, etc.), but I can’t imagine that alone would be enough to fully account for this.


22 Answers

Anonymous 0 Comments

Okay, not specific to HDMI, but more communications in general: the sending and receiving sides need to agree in advance on the rate at which they’re going to exchange data (a specific standard is that agreement). You’ll commonly hear a pipe analogy, but in that analogy the pipe (i.e. the physical cable) is oversized, and the limiting factor is that the hardware on either side of the pipe is only set up to handle a certain rate of fluid. Lots of work goes into making sure that the pipe itself doesn’t really matter.
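To make that "agreement" concrete, here is a toy sketch (not the real HDMI handshake; the rate table and function names are made up for illustration): each endpoint advertises the fastest per-lane rate its own silicon supports, and the link runs at the best rate both can handle. The cable is assumed to be "oversized" relative to whatever gets agreed.

```python
# Hypothetical capability negotiation -- an illustration of the idea that
# the endpoints, not the cable, set the data rate. Not the actual HDMI
# protocol; the rates below are just example numbers.

SUPPORTED_RATES_GBPS = [3.4, 6.0, 8.0, 12.0]  # illustrative per-lane rates

def negotiate_rate(source_max_gbps: float, sink_max_gbps: float) -> float:
    """Pick the fastest rate that both the sending and receiving chips support."""
    common = [r for r in SUPPORTED_RATES_GBPS
              if r <= source_max_gbps and r <= sink_max_gbps]
    if not common:
        raise ValueError("no mutually supported rate")
    return max(common)

# An older sink caps the link even though the same cable could carry more:
print(negotiate_rate(source_max_gbps=12.0, sink_max_gbps=6.0))   # 6.0
print(negotiate_rate(source_max_gbps=12.0, sink_max_gbps=12.0))  # 12.0
```

Same cable in both calls; only the agreed rate changes, which is why a newer standard can "unlock" more bandwidth without touching the wire.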

A less ELI5 answer would get into the tech used to support higher bandwidths on crappy cables (e.g. transmit pre-emphasis and receive-side equalization), or into higher-order modulation schemes which pack more data into a given bandwidth. Again, I’m not familiar with the ins and outs of the HDMI standard, so I don’t know exactly what tricks they’re using. But comms problems are comms problems everywhere.
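As a generic illustration of the higher-order modulation point (this is not a claim about what HDMI specifically does): with 2 voltage levels each symbol carries 1 bit, with 4 levels each symbol carries 2 bits, so the bit rate doubles while the symbol rate, and hence the bandwidth demanded of the cable, stays the same.

```python
# Bits per symbol grows with the number of signal levels: log2(levels).
# Doubling the levels from 2 (NRZ) to 4 (PAM4) doubles the bit rate
# at the same symbol rate -- i.e. on the "same" cable.
import math

def bits_per_symbol(levels: int) -> float:
    return math.log2(levels)

def bit_rate_gbps(symbol_rate_gbaud: float, levels: int) -> float:
    return symbol_rate_gbaud * bits_per_symbol(levels)

print(bit_rate_gbps(6.0, levels=2))  # 2-level signalling:  6.0 Gb/s
print(bit_rate_gbps(6.0, levels=4))  # 4-level signalling: 12.0 Gb/s
```

The trade-off is that the levels sit closer together, so the receiver needs cleaner signals or smarter equalization to tell them apart, which is where the cable quality and the transceiver tricks mentioned above come back in.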
