If all HDMI cables are basically the same design, pinout, etc. how have they been able to double, quadruple, etc. the bandwidth on them over time?


Going from HDMI 1.4 to 2.1 there is a 5x increase in bandwidth. Is it because the cables themselves were never the issue, but rather the connectors/chips in the devices themselves that couldn’t handle it?

I know part of it is the actual quality of the cables themselves and tighter tolerances, more twists in the wires, material purity, etc., but I can’t imagine that alone would be enough to fully account for this.


22 Answers

Anonymous 0 Comments

It’s mostly the electronic chips at each end of the cable that have gotten better, not the cable itself.

Anonymous 0 Comments

This is the same as Ethernet Cat5/Cat5e/Cat6 cables.

A cable’s jacket can only say what specification it was tested at, not what future options exist. There are lots of tiny adjustments made over time that do make a difference to the upper performance level: wire gauge, plug connectors, twisting, etc.

A well-made, over-engineered old cable may well meet new specifications if tested; one made just cheaply enough to pass may not.

Anonymous 0 Comments

I used to make fun of Monster cables, but ever since I got a 4K HDR TV and a 4K Apple TV, I’ve had big issues with my old cables.

All the problems went away with new shielded cables.

Still not buying Monster 😀

Anonymous 0 Comments

HDMI uses digital signaling, which is essentially switching between high and low voltage very quickly in certain patterns to represent things like which color and how bright a pixel is.

All wires have properties like impedance and capacitance and the impact those properties have on a signal increases the longer the cable is and the higher the frequency of the signal is.

Impedance causes the signal level to drop, and capacitance causes the transitions between high and low to slow down and smooth out instead of being sharp. If these problems get bad enough, the receiving circuit can’t make sense of the signal anymore.

So if we want to increase the amount of data an HDMI cable can carry, we need to increase the frequency at which we send the signal (more data per second), but if we do that we’ll run into these issues.

So in order to make faster HDMI cables we need to make higher quality cables that have lower impedance and lower capacitance.
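If you want to see that smoothing effect in rough numbers, here’s a quick Python sketch. It treats the cable as a simple first-order RC low-pass filter with a made-up 2 ns time constant and a 0-to-1 V swing; it’s only an illustration of the idea above, not a model of a real HDMI channel.

```python
# Rough sketch: model a cable as a simple RC low-pass filter and watch the
# sharp 0/1 transitions get smoothed out as the bit rate goes up.
import numpy as np

def received_levels(bit_rate_hz, rc_seconds, bits, samples_per_bit=100):
    """Send a bit pattern through a first-order RC filter and return the
    voltage sampled at the middle of each bit period."""
    dt = 1.0 / (bit_rate_hz * samples_per_bit)
    alpha = dt / (rc_seconds + dt)                        # discrete RC smoothing factor
    tx = np.repeat(bits, samples_per_bit).astype(float)   # ideal square wave, 0 V / 1 V
    rx = np.zeros_like(tx)
    for i in range(1, len(tx)):
        rx[i] = rx[i - 1] + alpha * (tx[i] - rx[i - 1])   # filtered (smoothed) signal
    return rx[samples_per_bit // 2::samples_per_bit]      # sample mid-bit, like a receiver

bits = np.array([0, 1, 0, 1, 1, 0, 1, 0])
rc = 2e-9  # pretend the cable behaves like a 2 ns RC time constant (made up)

for rate in (1e8, 1e9, 5e9):  # 0.1 Gb/s, 1 Gb/s, 5 Gb/s over the same "cable"
    levels = received_levels(rate, rc, bits)
    print(f"{rate / 1e9:>4.1f} Gb/s -> mid-bit voltages: {np.round(levels, 2)}")

# At the low rate the samples sit near 0 and 1; at the high rates they crowd
# toward the middle, so the receiver has less margin to tell a 0 from a 1.
```

Better cables (lower capacitance, better impedance control) and smarter transmitter/receiver chips both attack the same problem: keeping those mid-bit samples far enough apart to read reliably.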

Anonymous 0 Comments

Remember, HDMI versions refer to the equipment, not the cables; there is no such thing as an “HDMI 1.4 cable,” etc.

Anonymous 0 Comments

They are not the same: there’s variation in conductor thickness, shielding, and materials, so there are standards that define the maximum bandwidth a given cable supports. While a cable rated for HDMI 2.1 speeds has the same number of wires as one made for 1.4, they are constructed differently and the 2.1-rated cable is of better quality. All of this assumes said cables actually comply with the standard.

Anonymous 0 Comments

To take things back a few years: when I was taking electrical engineering in the mid-1980s, we were taught that, due to the bandwidth of phone lines, there was a physical limit on data speed through a phone line of 1200 baud.

The modems they had in the lab were over $1000 each.

Within a few years they had discovered how to work around the physics and pass 56 kb/s through unchanged phone lines, and a few years later they were using the same phone lines for DSL at 10 Mb/s.

Even though the physics don’t change, we find ways around things!

Anonymous 0 Comments

Piggybacking on the OP: is there a specific reason HDMI cables are usually stiffer and less malleable than other cables?

Anonymous 0 Comments

The quickest and easiest explanation of this is that the endpoints of those wires keep improving so they can send and receive more data at the same time. It’s the same reason the cable internet Comcast put in your neighborhood in the ’90s that ran at 1.2 Mbps can now do 1 Gbps over the same wire. Honestly, it’s cool what we come up with.

Anonymous 0 Comments

Think of it like increasing the amount of written information on a line of college-ruled notebook paper. The number of lines on that paper stays the same; that’s the number of wires in the cable.

Your hand is a sender
The pen is the cable
And the paper is a receiver

(In reality, both devices send and receive and all the lines are being written on simultaneously, but try not to waste a visualization on that.)

Another part of the receiving device, which we’ll call “the reader,” takes the paper and makes sense of what was written; it then relays this to the computer.

In terms of computer signals, adding more letters to the alphabet you’re using is extremely effective, because each letter you write can carry more information (doubling the alphabet adds another bit per letter). The problem is that the letters start looking similar to each other, so you have to be more accurate when writing to make sure they can be read properly. The letters are the individually recognizable signals coming from each wire.

There IS a limit to how small the pen can write before the ink bleeds enough to make it hard to read (that’s the cable), but that’s usually not the bottleneck since cables are pretty simple.

There’s a limit to how fast and accurately you can move your hand, and to how much the texture of the paper impedes those finer hand movements; that’s the hardware that gets upgraded the most over time as computer technology advances.

Now, as these components get better and better, we’d still stay the same, running on HDMI 1.4, using the same alphabet and writing at the same max speeds, so no real change. That’s because we are following the last set of instructions we were given: the software. Everything has to be in place before we can make use of ANY of the benefits. We can’t speed up just because we are capable, and we definitely can’t start using new letters out of the blue and expect the computer to understand. The computer will look at it and go, “What? Wait… too fast, and WTF are Wingdings? Nope, denied, I’m not accepting that. HDMI slot, whatever you’re doing, I refuse to cooperate; go back to what you were doing before.”

That’s where the software upgrade comes in. HDMI 2.1 software tells the computer how to make sense of the new letters and to expect paper drops to happen more quickly, and it goes, “Okay, I understand now. What’s this called… 2.1? Okay, I hereby allow all the 2.1-capable thingies connected to me to talk to me now.” And if everything else is in place, that’s what happens: the hand and paper are allowed to do their new thing.

The alphabet bit might be a little confusing since computers use various alphabets in code, but in this instance there’s a range of electrical signal available (let’s just use 0–10 volts, though I’m not sure what it actually is) and we have to divide that range into a set of recognizable readings (like 0–1 V, 1–2 V, 2–3 V… totaling 10). There has to be margin between them because lots of things can make an initial 4 V signal arrive on the other side changed; maybe it’ll be read as 3.4 V instead. As components get more accurate, that potential variance gets smaller, maybe new designs are accurate to 0.5 V or so, and we then get the opportunity to divide that range into smaller chunks with an upgrade, without the risk of overlap (adding more “letters”: 0–0.5 V, 0.5–1 V, 1–1.5 V… totaling 20).

The key is timing the upgrade to make the best use of all the advancements without waiting too long between upgrades, or jumping the gun before you could have had something better. It’s a process, though, because, like I said, everything needs to be in place before you reap any benefits, so a large investment went into HDMI 2.1.
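To put rough numbers on that levels-and-accuracy idea, here’s a tiny Python sketch. The 0–10 V range and the step sizes are the made-up numbers from the analogy above, not real HDMI signal levels.

```python
# Sketch of the "more letters in the alphabet" idea: a wider alphabet means
# more information per symbol, but only if the hardware can tell levels apart.
import math

def letters_and_bits(voltage_range_v, smallest_reliable_step_v):
    """How many distinct levels ("letters") fit in the range, and how many
    bits each transmitted level can then carry."""
    levels = int(voltage_range_v / smallest_reliable_step_v)
    bits_per_level = math.log2(levels)
    return levels, bits_per_level

for step in (1.0, 0.5):  # older, less accurate hardware vs. newer, more accurate hardware
    levels, bits = letters_and_bits(10.0, step)
    print(f"step {step:.1f} V -> {levels} levels, {bits:.1f} bits per symbol")

# step 1.0 V -> 10 levels, ~3.3 bits per symbol
# step 0.5 V -> 20 levels, ~4.3 bits per symbol
# Halving the step doubles the "alphabet" but adds about one bit per symbol,
# which is why accuracy improvements at both ends matter so much.
```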

There’s a lot more going on with hardware; it’s not just “more innate accuracy” that new software makes use of. New chips are designed for the impending change and are installed in parallel with a chip that reads the old way (maybe one that does both), and the data gets routed through whichever method is relevant, determined by a digital handshake that verifies the type.
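Here’s a very simplified sketch of that handshake idea in Python: each end advertises the signalling modes it supports and they agree on the best one they have in common. The mode names and bandwidth numbers are hypothetical labels for illustration, not the actual HDMI negotiation protocol.

```python
# Simplified capability handshake: pick the fastest mode both ends support.
# Mode names and rates are illustrative, not real HDMI protocol values.

SOURCE_MODES = {"legacy-mode": 10.2, "fast-mode": 48.0}  # Gb/s, hypothetical labels
SINK_MODES = {"legacy-mode": 10.2}                       # e.g. an older display

def negotiate(source_modes, sink_modes):
    """Pick the highest-bandwidth mode both ends claim to support."""
    common = set(source_modes) & set(sink_modes)
    if not common:
        return None
    return max(common, key=lambda mode: source_modes[mode])

mode = negotiate(SOURCE_MODES, SINK_MODES)
print(f"agreed mode: {mode}")  # falls back to "legacy-mode" with the older sink
```

That fallback behavior is why a new source plugged into an old TV still works: both ends just settle on the best thing they both understand.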