If all HDMI cables are basically the same design, pinout, etc. how have they been able to double, quadruple, etc. the bandwidth on them over time?

Going from HDMI 1.4 to 2.1 there is a 5x increase in bandwidth. Is it because the cables themselves were never the issue, but the connectors/chips in the devices themselves couldn't handle it?

I know part of it is the actual quality of the cables themselves (tighter tolerances, more twists in the wires, material purity, etc.), but I can't imagine that alone would be enough to fully account for this.

22 Answers

Anonymous 0 Comments

There are actually four certification tiers for HDMI cables: Standard, High Speed, Premium High Speed, and Ultra High Speed. The 5x increase in speed you mention is the difference between High Speed cables and Ultra High Speed cables. In addition to improved cables, there is also much-improved hardware at both ends to take advantage of the better signal quality. So even if you had had an Ultra High Speed cable ten years ago, you could not get consumer hardware with HDMI ports able to handle those speeds. There were some professional broadcasting standards which could do this, but those were outside the price range of any consumer.

Anonymous 0 Comments

Maybe not in this exact case, but the way I imagine more data going through the same cable is like audio. The basic technique is to put many streams of data on the same wire at the same time.

My analogy is audio: if you were to "listen" to the slow data rate you would hear only one tone, but at a higher data rate you would hear two or more tones at the same time. As the years advance, so does the hardware and its ability to pick out the different tones and separate them back into the data they're actually carrying.
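To make that concrete, here's a rough sketch in Python of that general idea (frequency multiplexing as my own illustration; HDMI itself sends serial digital data, not audio-style tones):

```python
import numpy as np

# One second of a shared "wire", sampled at 8 kHz (illustrative values).
fs = 8000
t = np.arange(fs) / fs

# Two independent streams, each assigned its own carrier tone.
tone_a = np.sin(2 * np.pi * 440 * t)   # stream A rides on 440 Hz
tone_b = np.sin(2 * np.pi * 1000 * t)  # stream B rides on 1000 Hz
wire = tone_a + tone_b                 # both travel on the wire at once

# The receiver picks the tones apart by frequency.
spectrum = np.abs(np.fft.rfft(wire))
freqs = np.fft.rfftfreq(len(wire), d=1 / fs)
print(freqs[spectrum > spectrum.max() / 2])  # [ 440. 1000.] -- both recovered
```

Better hardware at each end means it can keep more tones apart at once, which means more data down the same wire.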

Anonymous 0 Comments

The HDMI specification defines the pinout of the connector, and it includes multiple data lines.

Older specifications didn't use all of these data lines for communication, and they aren't necessarily even connected in a cable built to an older specification.

Newer HDMI specifications do use these additional data lines, so more bandwidth is available for the video signal.
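For a sense of scale (a back-of-the-envelope sketch using the headline figures from the published specs, as I understand them): HDMI 2.0 carries video on three data lanes, while HDMI 2.1's FRL mode also presses the old clock lane into service as a fourth data lane and runs every lane faster:

```python
# Back-of-the-envelope lane math using the specs' headline numbers.
def max_bandwidth_gbps(lanes: int, gbps_per_lane: float) -> float:
    return lanes * gbps_per_lane

hdmi_2_0 = max_bandwidth_gbps(lanes=3, gbps_per_lane=6)   # TMDS: 18 Gbit/s
hdmi_2_1 = max_bandwidth_gbps(lanes=4, gbps_per_lane=12)  # FRL:  48 Gbit/s
print(hdmi_2_0, hdmi_2_1)  # 18 48 -- more lanes AND faster lanes
```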

Anonymous 0 Comments

The point is that any cable certified for HDMI 2.1 will be as good as any other cable that is, provided neither is outright defective. So look for what version of HDMI the cable is built to support, and then pick based on length requirements, color, or material (for aesthetics, not in expectation of better image or sound quality). Overpriced cables that promise better video or audio thanks to their unnecessary thickness or gold connectors are a scam.

Anonymous 0 Comments

There’s a lot going on here, and the explanations go way deeper than a 5 year old would understand. But, I’ll try to explain.

Computers work in 1s and 0s. They see electricity as either on or off. In order for the computer in your TV to understand what’s happening, it needs to have the signal in 1s and 0s.

Wires don’t work in 1s and 0s. Wires work in a (practically) continuous domain. They can be at 7 volts, 4 volts, 0 volts etc.

All data transmission standards have to solve this problem of turning a continuous space into a digital (discrete) space. How do we do that?

We build maps. A simple one would be to copy the same one we use in the computer: if there's ANY electricity in the wire, read it as a 1; if there's none, read it as a 0. (More accurately, computers say anything above some voltage is a 1, and anything below it is a 0.)
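In code, that simplest mapping is just a threshold comparison (the threshold value here is made up for illustration):

```python
def read_bit(voltage: float, threshold: float = 2.5) -> int:
    # Anything above the threshold reads as a 1, anything below as a 0.
    return 1 if voltage > threshold else 0
```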

That's a pretty inefficient mapping; we can communicate a lot more information if we use more voltage levels. If we have a transmitter with a maximum output of 10 volts, we can, for example, make four voltage levels: 0-2.5V is a 00, 2.51-5.00V is a 01, 5.01-7.50V is a 10, and 7.5V+ is a 11. By simply changing how we interpret the information on the wire, we just doubled how many bits we send per reading.
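Extending the same idea to four levels (a sketch of multi-level signaling in general, using the made-up voltage ranges above rather than HDMI's real electrical levels):

```python
# Map a measured voltage to a 2-bit symbol: four levels, two bits per reading.
def decode_symbol(voltage: float) -> str:
    if voltage <= 2.5:
        return "00"
    elif voltage <= 5.0:
        return "01"
    elif voltage <= 7.5:
        return "10"
    else:
        return "11"

print([decode_symbol(v) for v in (1.0, 4.0, 6.2, 9.8)])
# ['00', '01', '10', '11'] -- same wire, twice the bits per measurement
```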

As the gaps between these levels get smaller and smaller, it gets harder and harder to distinguish between the different levels. This is where key innovations have happened over the past 20 years: we have developed denser mappings, and we have developed techniques that allow us to read those denser mappings. As a field, this is known as digital signal processing, and it has gotten orders of magnitude more advanced since the year 2000.
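Here's a toy simulation of why that's hard (my own illustration, not HDMI's actual signaling): with the same noise on the wire, a 2-level receiver almost never misreads, but packing four levels into the same 0-10V range starts producing errors:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000
noise = rng.normal(0, 0.8, n)  # identical channel noise in both cases

# 2-level signaling on a 0-10V wire: levels at 0 and 10, threshold at 5.
sent2 = rng.choice([0.0, 10.0], n)
errors2 = np.mean(((sent2 + noise) > 5.0) != (sent2 > 5.0))

# 4-level signaling on the same wire: levels only ~3.3V apart.
levels = np.array([0.0, 10 / 3, 20 / 3, 10.0])
sent4 = rng.integers(0, 4, n)
received = levels[sent4] + noise
decoded = np.abs(received[:, None] - levels[None, :]).argmin(axis=1)
errors4 = np.mean(decoded != sent4)

print(f"2-level symbol errors: {errors2:.3%}")  # ~0.000% (huge gaps)
print(f"4-level symbol errors: {errors4:.3%}")  # a few percent (small gaps)
```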

Understanding why reading these mappings is more challenging than a simple "look at the map" is best done with an electrical engineering degree. The short of it is: electricity changes as it travels through wires, and those changes muddy the waters. The signal can show up too late, too early, or not at the correct level, all of which places limits on how fast you can send data. We have gotten really good at predicting and designing around this fuzzy nature of electricity, and thus can send data much faster over the same medium.

Anonymous 0 Comments

I'm just annoyed they all look the same, all say they are super duper high speed, and so I don't know if I'm using the right one.

Anonymous 0 Comments

The same reason that modern cars can go faster on a road built to the standards of 1923 than 1923 cars could: because modern cars are faster, and the road isn't limiting them.

Or a more apt comparison: cell phones keep getting faster, but the air they transmit through is the same. It’s because the phones and the base stations are improving, and any limitations of the medium are either not limiting, or they’re able to work around them.

How do they do this? Well, returning to the car analogy, imagine you want to increase throughput on a highway, but you just can't make the road any bigger. And you can't make the cars faster, because they're already moving at the speed of light. What do you do? Maybe make the cars smaller, so you can fit more lanes in. Or stuff more people in each car, so more people get to their destination per unit time. Maybe even make it so cars can drive on top of each other: a three-dimensional road can hold a multiple of what a two-dimensional one can.

Anonymous 0 Comments

Okay, not specific to HDMI, but more about communications in general: the sending and receiving sides need to agree in advance on the amount of data they're going to exchange (a specific standard is that agreement). You'll commonly hear a pipe analogy, but in that analogy the pipe (i.e. the physical cable) is oversized, and the limiting factor is that the hardware on either side of the pipe is only set up to handle a certain rate of fluid. Lots of work goes into making sure that the pipe itself doesn't really matter.
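A cartoon of that agreement in code (the function and numbers are hypothetical; real HDMI negotiates capabilities through mechanisms like EDID rather than a single number):

```python
def negotiate_rate_gbps(source_max: float, sink_max: float) -> float:
    # Both ends advertise what they can handle; the link runs at the lesser.
    return min(source_max, sink_max)

# Hypothetical example: a 48 Gbit/s-capable source driving an 18 Gbit/s TV.
print(negotiate_rate_gbps(48.0, 18.0))  # 18.0 -- the slower end sets the pace
```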

A less ELI5 answer would get into the tech used to support higher bandwidths on crappy cables (e.g. pre-emphasis and post-equalization), or into higher-order modulation schemes which allow more data to be put into a given bandwidth. Again, I'm not familiar with the ins and outs of the HDMI standard, so I don't know exactly what tricks they're using. But comms problems are comms problems everywhere.
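For the curious: transmit pre-emphasis in high-speed links often amounts to a one-tap filter that exaggerates transitions before the cable smears them out. A toy version (the tap weight is made up for illustration; real links tune it per channel):

```python
# Toy transmit pre-emphasis: boost bit transitions so they survive a lossy cable.
def pre_emphasize(bits, tap=0.25):
    out, prev = [], 0.0
    for b in bits:
        level = 1.0 if b else -1.0
        out.append(level - tap * prev)  # repeated levels get de-emphasized,
        prev = level                    # so transitions stand out relatively
    return out

print(pre_emphasize([1, 1, 0, 1, 0, 0]))
# [1.0, 0.75, -1.25, 1.25, -1.25, -0.75] -- transitions are the big swings
```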

Anonymous 0 Comments

Easiest way to explain it is new standards. A newer standard, i.e. 2.1, requires a set of design features to be built into the cable that enable the higher speeds, such as better connectors and better-quality wire. Even though it's the same connector, the cable standard improved enough to allow, say, 8K signals to go through. That's also why you probably won't find a 20 ft cable certified for HDMI 2.1: it wouldn't pass the standard's tests.

Anonymous 0 Comments

Imagine a small stretch of road.
Only one car can travel on it at a time.

Now you have the technology to split it into lanes, so you have cars going in both directions at the same time, just one car per lane.

Now you have the tech to put multiple cars on the road at the same time in each lane.

The road is always the same; you have just learnt to utilise it more efficiently.
Now apply the same concept to cables.