If all HDMI cables are basically the same design, pinout, etc., how have they been able to double, quadruple, etc. the bandwidth on them over time?


Going from HDMI 1.4 to 2.1 there is a 5x increase in bandwidth. Is it because the cables themselves were never the issue, but it was the connectors/chips in the devices themselves that couldn’t handle it?

I know part of it is the actual quality of the cables themselves (tighter tolerances, more twists in the wires, material purity, etc.), but I can’t imagine that alone would be enough to fully account for this.



Anonymous

Think of it like increasing the amount of written information on a line of college-ruled notebook paper. The number of lines on that paper stays the same; that’s the number of wires in the cable.

Your hand is the sender,
the pen is the cable,
and the paper is the receiver.

(In reality, both devices send and receive and all the lines are being written on simultaneously, but try not to waste a visualization on that.)

Another part of the receiving device, which we’ll call “the reader,” takes the paper and makes sense of what was written, then relays it to the computer.

In terms of computer signals, adding more letters to the alphabet you’re using is extremely effective: every time you double the alphabet, each letter carries one more bit of information. The problem is that the letters start looking similar to one another, so you have to be more accurate when writing to make sure they can be read properly. The letters are the individually recognizable signals coming from each wire.
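To put a rough number on the “bigger alphabet” idea, here’s a minimal Python sketch of that relationship. The level counts are illustrative only, not HDMI’s actual encoding:

```python
import math

def bits_per_symbol(levels: int) -> float:
    # Each distinguishable signal level ("letter") carries log2(levels) bits.
    return math.log2(levels)

for levels in (2, 4, 8, 16):
    print(f"{levels:>2} levels -> {bits_per_symbol(levels):.0f} bits per symbol")

# 2 levels -> 1 bit, 4 -> 2, 8 -> 3, 16 -> 4:
# doubling the alphabet adds one whole bit to every symbol sent down the wire.
```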

There IS a limit to how small the pen can write before the ink bleeds enough to make it hard to read (that’s the cable), but that’s usually not the bottleneck, since cables are pretty simple.

There’s a limit to how fast and accurately you can move your hand, and to how much the texture of the paper will impede those finer hand movements; that’s the hardware that gets upgraded the most over time as computer technology advances.

Now, as these components get better and better, we’d stay the same: still running on HDMI 1.4, using the same alphabet, and writing at the same max speeds, so no real change. That’s because we’re following the last set of instructions we were given, the software. Everything has to be in place before we can make use of ANY of the benefits. We can’t speed up just because we’re capable, and we definitely can’t start using new letters out of the blue and expect the computer to understand. The computer will look at it and go, “What? Wait… too fast, and WTF are Wingdings? Nope, denied, I’m not accepting that. HDMI slot, whatever you’re doing, I refuse to cooperate, go back to what you were doing before.”

That’s where the software upgrade comes in. HDMI 2.1 software tells the computer how to make sense of the new letters and to expect the paper drops to happen more quickly, and it goes, “Okay, I understand now. What’s this called… 2.1? Okay, I’ll hereby allow all the 2.1-capable thingies connected to me to talk to me now.” And if everything else is in place, that’s what happens: the hand and paper are allowed to do their new thing.

The alphabet bit might be a little confusing since computers use various alphabets in code, but in this instance there’s a range of electrical signal available (let’s just say 0-10 volts, though I’m not sure what it actually is), and we have to divide that range into a set of recognizable readings (like 0-1 V, 1-2 V, 2-3 V… *totaling 10*). Each chunk has to leave margin for error, because there are a lot of things that can make an initial 4 V signal arrive changed on the other side; maybe it’ll be read at 3.4 V instead.

As components get more accurate, that potential variance gets smaller. Maybe new designs are accurate to within 0.5 V or something, and we then gain the opportunity to divide that range into smaller chunks with an upgrade, without the risk of overlap (adding more “letters”: 0-0.5 V, 0.5-1 V, 1-1.5 V… *totaling 20*). The key is timing the upgrade to make the best use of all the advancements, without waiting too long between versions or jumping the gun before you could have had something better. It’s a process, though, because, like I said, everything needs to be in place before you reap any benefits, so a large investment went into HDMI 2.1.
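Here’s that margin-for-error idea as a tiny, hypothetical Python check. The 0-10 V range, the drift figures, and the “nearest level wins” decoding rule are all made up for illustration; real HDMI signalling uses its own levels and encodings:

```python
# A reading snaps to the nearest target level, so a letter stays readable
# only while noise can't push it past the halfway point to a neighbor.
def always_readable(spacing_v: float, max_drift_v: float) -> bool:
    return max_drift_v < spacing_v / 2

# Made-up 0-10 V range split into 10 letters (1.0 V apart) or 20 (0.5 V apart).
for letters, spacing in ((10, 1.0), (20, 0.5)):
    for hardware, drift in (("old hardware, ±0.4 V drift", 0.4),
                            ("new hardware, ±0.2 V drift", 0.2)):
        ok = always_readable(spacing, drift)
        print(f"{letters} letters ({spacing} V apart), {hardware}: safe = {ok}")

# Only the new, more accurate hardware can safely use the 20-letter alphabet.
```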

There’s a lot more going on with the hardware; it’s not just “more innate accuracy” that new software makes use of. New chips are designed for the impending change and installed alongside a chip that reads signals the old way (or one chip that does both), and the data gets routed through whichever method is relevant, determined by a digital handshake that verifies the type.
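That handshake can be pictured as simple capability negotiation: both ends advertise what they support and settle on the best mode they share. A toy Python sketch of that idea (real HDMI negotiates through its own mechanisms, such as reading the display’s EDID; the mode names here are purely illustrative):

```python
# Modes ordered from oldest to newest.
MODES = ["hdmi_1.4", "hdmi_2.0", "hdmi_2.1"]

def negotiate(source_modes: set[str], sink_modes: set[str]) -> str:
    # Pick the newest mode both ends understand; refuse to talk otherwise.
    common = [m for m in MODES if m in source_modes and m in sink_modes]
    if not common:
        raise RuntimeError("no common mode")
    return common[-1]

# New GPU, old TV: everything falls back to the old alphabet and speed.
print(negotiate({"hdmi_1.4", "hdmi_2.0", "hdmi_2.1"}, {"hdmi_1.4"}))  # hdmi_1.4

# Both ends upgraded: the new letters and speeds get switched on.
print(negotiate({"hdmi_1.4", "hdmi_2.0", "hdmi_2.1"},
                {"hdmi_1.4", "hdmi_2.0", "hdmi_2.1"}))                # hdmi_2.1
```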
