How can old Ethernet cables handle transmitting the data needed for 4K 60Hz video, but we need new HDMI 2.1 cables to do the same thing?



In: Technology

6 Answers

Anonymous 0 Comments

The data on the ethernet cable is compressed. The amount of data going through is typically less than 30 megabits per second; Wi-Fi in 2010 was already faster than that. The image quality is obviously reduced, but compression has been designed to minimize how much your eyes notice it. The average “old” network cable these days is probably Cat 5e, which is good up to 2.5 gigabit, and your CPU or GPU must decompress the data at the other end.

HDMI data is not compressed. Each pixel’s full red, green and blue values are sent one by one: around 8.3 million pixels per frame, 60 frames per second, plus the audio data and optional extras like power on/off signalling for the device. The sheer quantity of data requires the cable to be designed for quality, and it often limits cable length as well, since the signal degrades over longer runs of wire.

That’s why: the amount of data really is different, because of compression. Otherwise even 1080p YouTube would require a gigabit internet connection.
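The answer’s numbers can be checked with some back-of-the-envelope arithmetic. A rough sketch, using the ~8.3 million pixels and ~30 Mbit/s stream figures from the answer (the exact stream bit rate varies by service):

```python
# Back-of-the-envelope: raw 4K60 pixel data vs a typical compressed stream.
# Illustrative only; real HDMI links also carry blanking and encoding overhead.
width, height = 3840, 2160      # 4K UHD, ~8.3 million pixels per frame
bits_per_pixel = 24             # 8 bits each for red, green, blue
fps = 60

raw_bps = width * height * bits_per_pixel * fps
print(f"raw 4K60: {raw_bps / 1e9:.1f} Gbit/s")      # ~11.9 Gbit/s of pixel data

stream_bps = 30e6               # ~30 Mbit/s, a typical compressed 4K stream
print(f"compression ratio: ~{raw_bps / stream_bps:.0f}x")
```

Roughly a 400:1 reduction, which is why a 30 Mbit/s stream fits comfortably through a gigabit cable while the raw signal does not.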

Anonymous 0 Comments

Because HDMI is uncompressed video, while streaming video is compressed. Better compression algorithms and faster chips have made modern video and audio compression very effective.

Anonymous 0 Comments

HDMI 2.1 has a bit rate of 48 Gbit/s. Most ethernet cables are used at 1 Gbit/s, though you can get them to 10 Gbit/s. I have worked with 50 Gbit/s and even 100 Gbit/s ethernet links, but those are not the type you get in a normal shop. So you are right that there is something extra going on.

Whenever you see 4K 60Hz video being transferred over an ethernet cable, it is always compressed. With only a small caveat, HDMI is not compressed at all. HDMI’s bandwidth is so high precisely so that there is no need for any compression: compressing would make the video quality worse and would increase the cost of the monitor and the video output hardware by quite a lot. When the cables are as short as HDMI cables are, it is much better and cheaper to use higher-quality cables and higher bandwidth than to compress the video signal.

Anonymous 0 Comments

Because when you stream/download a video over ethernet, you are downloading compressed data and buffering it before you play.

HDMI 2.0 is 18 times faster than gigabit ethernet over Cat-6, and HDMI 2.1 is 48 times faster. Both can carry live data from your video player much, much faster than ethernet can; HDMI 2.1 can do 4K 120Hz video. A 1 Gbps ethernet link can do 4K **5Hz** video (if that video is uncompressed and not buffered).
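The 5Hz figure above checks out with a quick sketch of how many uncompressed 4K frames fit through a gigabit link each second:

```python
# How many uncompressed 4K frames fit through 1 Gbit/s per second?
bits_per_frame = 3840 * 2160 * 24   # ~199 Mbit per uncompressed 4K frame
link_bps = 1e9                      # gigabit ethernet

fps = link_bps / bits_per_frame
print(f"{fps:.1f} fps")             # ~5.0 fps, matching the 5Hz figure
```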

Anonymous 0 Comments

You’re probably thinking of compressed video.

To get uncompressed 4k60 you need about 18Gbps of bandwidth, which HDMI 2.0 provides (HDMI 2.1 goes up to 48Gbps). Regular old ethernet cable and network cards generally max out at 10Gbps, which is about the same as HDMI 1.4. Not enough for uncompressed 4k60.

But video over a network is almost always compressed, which substantially reduces the amount of bandwidth it needs. That’s probably the 4k60 over ethernet you are thinking of. But that adds compression artifacts and latency, which you don’t want for a monitor output. So HDMI uses uncompressed video.

Now ethernet can go above 10Gbps, but the network cards get very expensive very quickly; to reach 25Gbps you need to spend around $350. So if you wanted to replace HDMI with ethernet cables, you would need to add roughly $350 to the cost of your TV/monitor and another $350 to the cost of every device plugged into it.

That’s why you have HDMI instead: it’s designed to be both high bandwidth and cheap.

Anonymous 0 Comments

Watch out: extra-long 2.1 cables degrade the signal; the picture becomes intermittent or snowy. You are best off getting an 8K lead (2.2).