Why are we forced to use HDMI or Coax for high def tv

I have had both cable and satellite before. In both cases, a coax cable is run from the source to a box that digitally decodes the signal, meaning that coax is transmitting a lot of data, often including high-speed internet. Yet running a coax from the cable box to the TV only results in a standard-def picture. What gives?

9 Answers

Anonymous 0 Comments

That depends on what the TV has built in, too. The TV might come with a tuner and possibly allow "HD" over coax, but only at 720p. Let the boxes handle the work and run HDMI to the TV, as that's the most compatible option, especially since HDMI is an 18 Gb/s cable.

Yes, coax technically brings the signal to the box, but that's also because most homes already have the wiring for it. If everyone were forced to redo their home wiring, or technicians had to do new work and then charge more for it, nobody would adopt it.
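
For a rough sense of scale, here's a back-of-the-envelope sketch of why that 18 Gb/s figure matters (active pixels only, ignoring blanking, audio, and link-encoding overhead, so real link usage is a bit higher):

```python
# Back-of-the-envelope: uncompressed video bit rates vs. HDMI 2.0's 18 Gbit/s
# raw link rate. Active pixels only; blanking and encoding overhead ignored.

def video_gbps(width: int, height: int, fps: int, bits_per_pixel: int = 24) -> float:
    """Uncompressed bit rate in Gbit/s for a given resolution and refresh rate."""
    return width * height * fps * bits_per_pixel / 1e9

FORMATS = {
    "720p60":  (1280, 720, 60),
    "1080p60": (1920, 1080, 60),
    "4K60":    (3840, 2160, 60),
}

for name, (w, h, fps) in FORMATS.items():
    print(f"{name:8s} ~{video_gbps(w, h, fps):5.1f} Gbit/s uncompressed")

print("HDMI 2.0 link: 18 Gbit/s, dedicated to this one stream")
```

That works out to roughly 1.3, 3.0, and 11.9 Gbit/s respectively, which is why a dedicated HDMI link can carry the picture uncompressed while the coax feed relies on heavy compression.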

Anonymous 0 Comments

The engineering required to get high bandwidth out of old coaxial networks is quite astounding. The coax decoder is full of advanced modern technology to achieve this. If you wanted to use the same technology to send the images to your TV, you would need an equally impressive encoder as well, and you would depend on your TV coming with a decoder compatible with the type of signals you generate. So while it is technically possible, and many TVs do come with decoders for some types of coax signals, it would be expensive and unnecessary. It is much cheaper and simpler to just use a high-quality HDMI cable, which does not require the same encoders and decoders to get the same bandwidth out of it.

Anonymous 0 Comments

It’s not the available bandwidth… it’s the security. HD content is almost always encrypted, and while the standard (HDCP: https://en.m.wikipedia.org/wiki/High-bandwidth_Digital_Content_Protection) supports multiple cabling options, adding ports adds cost, which TV makers tend to want to avoid.

Anonymous 0 Comments

Coax between the box and the TV is not HD compatible. Coax is fine for transporting the modulated signal, but when it comes time to decode that signal and present it in an HD format the TV can understand, the box’s coax output isn’t suited for it. The box is what decodes the stream and reassembles it (it arrives in pieces to speed up transmission) and then sends it through an output to the TV. If that output is coax, there’s signal degradation.

In other words, prior to hitting the box, it’s not a TV signal at all.

Anonymous 0 Comments

Why are you “forced”? Because HDMI also uses HDCP end-to-end encryption for copy protection.

The program is only allowed to be output in standard definition when it’s in an analog format, which is more easily captured and recorded.

Anonymous 0 Comments

When you get TV and internet over the same cable, the available bandwidth is shared between two digital signals, one for internet and one for TV. Sometimes it’s even a digital signal for internet alongside an analog TV signal that itself contains many channels. That sharing is called multiplexing.

The TV box filters out the TV signal and uses it, while the internet gateway (the router/modem) takes the internet signal.

When going straight from the TV box to the TV’s digital input, only one channel is being transmitted, at full bandwidth, allowing more information to be transferred at once over one cable, instead of a multiplexed signal with many channels sharing bandwidth being decoded inside your TV.
When the latter happens, you separate the TV signal in the box but still share the cable’s bandwidth with multiple channels, so the resolution has to be lower (no HD).

As for HDMI, it’s the standard for high-speed video transfer, so it’s the easier and more ubiquitous choice in modern devices for this task.
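
To put rough numbers on that sharing (the ~38 Mbit/s figure is typical for one 256-QAM cable channel in a 6 MHz slot; the number of programs packed into that channel is just an assumption for illustration):

```python
# Toy illustration of shared vs. dedicated bandwidth, using assumed but
# typical figures: one 6 MHz RF channel carrying ~38 Mbit/s with 256-QAM,
# statistically multiplexed among several programs, vs. an HDMI link that
# carries a single program.

CHANNEL_PAYLOAD_MBPS = 38.8      # approx. payload of one 256-QAM cable channel
PROGRAMS_PER_CHANNEL = 10        # assumption: programs muxed into that channel

per_program_mbps = CHANNEL_PAYLOAD_MBPS / PROGRAMS_PER_CHANNEL
print(f"Shared coax channel: ~{per_program_mbps:.1f} Mbit/s per program (compressed)")

HDMI_LINK_GBPS = 18              # HDMI 2.0 raw link rate, all for one stream
print(f"Dedicated HDMI link: ~{HDMI_LINK_GBPS * 1000:,} Mbit/s for a single program")
```

The point is the same as above: the coax feed is many programs squeezed together, while HDMI gives one program the whole link to itself.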

Anonymous 0 Comments

* Coax *could* be used as the connection between the cable box and the TV.
* But the movie industry wanted something different for a lot of reasons:
* Less confusion
* The coax coming from the street into your house *must* run through the decoder box from your specific cable company before going to your TV.
* If both connections used coax, people would try to connect it straight to the TV and it wouldn’t work.
* Coax comes in many different types and not all are suitable for the immense amount of data HDMI cables transmit.
* The cable from the street is installed by the cable company, so they know it’s the proper type, but a cable from the decoder to the TV could get swapped out by the user by mistake, and then there are complaints that things don’t work.
* HDMI cables have a minimum spec that means they will all work… for now.
* If your device has an HDMI connector, then you know it works with HDMI.
* Since lots of formats have used a coax connector, you can’t tell just by looking at the device whether it’s compatible with HD video, HDCP, etc.
* Slimmer form factor / easier to connect: HDMI connections don’t take up as much room on a device as coax connectors do, and they are much easier to insert and remove, and harder to break, than coax.
* More control of design: to sell a device with HDMI connectors and the HDMI logo, you need the proper licensing, and that ensures (most) manufacturers follow the rules.
* Anyone can make a device with a coax connector and imply it does what it’s supposed to (like enforce HDCP), but it might not actually do that.

Anonymous 0 Comments

Televisions these days are digital. Coax has the bandwidth, but HDMI carries a digital signal while the coax output on a cable box carries an analog one. Since the analog-to-digital conversion has already been done in the receiver box, there’s no good reason to convert the picture back to analog just to send it to the television and have it converted back to digital again. Instead, use a digital transport like HDMI.
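
A toy illustration of why that extra digital-to-analog-to-digital round trip is wasteful (purely illustrative; the noise level is an arbitrary assumption, not a measurement of any real cable):

```python
import random

# Toy model: take 8-bit pixel values, pretend to send them over an analog
# link that adds a little noise, then re-digitize at the receiving end.
# A digital link skips this round trip and delivers the values intact.

random.seed(0)
pixels = [random.randrange(256) for _ in range(100_000)]   # the "digital" source

NOISE = 1.5  # arbitrary assumed analog noise, in 8-bit code units

def analog_round_trip(value: int) -> int:
    """DAC -> noisy analog transport -> ADC, clamped back to 8 bits."""
    analog = value + random.gauss(0, NOISE)
    return min(255, max(0, round(analog)))

received = [analog_round_trip(p) for p in pixels]
changed = sum(1 for a, b in zip(pixels, received) if a != b)

print(f"{changed / len(pixels):.0%} of pixel values were altered by the round trip")
# A purely digital path (like HDMI) would report 0%.
```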

Old televisions and monitors were natively analog, so using an analog connector like coax or VGA made more sense.

Anonymous 0 Comments

Coax from the pole to your set-top box carries a DIGITAL signal over a cable rated up to 2-3 gigahertz (this rating is printed on the cable sheath).

Regular RG59 TV coax is only shielded for 27 megahertz and carries an ANALOG signal. Hi-def requires greater bandwidth than that.
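
For a sense of why more megahertz means more bits per second, here is a small sketch using the Shannon capacity formula C = B·log2(1 + SNR) (the 6 MHz width is the standard TV channel slot; the 40 dB SNR is just an assumed round number):

```python
import math

def shannon_capacity_bps(bandwidth_hz: float, snr_db: float) -> float:
    """Theoretical maximum bit rate for a channel of given bandwidth and SNR."""
    snr_linear = 10 ** (snr_db / 10)
    return bandwidth_hz * math.log2(1 + snr_linear)

# One 6 MHz TV channel slot at an assumed 40 dB signal-to-noise ratio:
cap = shannon_capacity_bps(6e6, 40)
print(f"~{cap / 1e6:.0f} Mbit/s theoretical ceiling for one 6 MHz channel")

# Uncompressed 1080p60 needs roughly 3,000 Mbit/s, so without either a much
# wider frequency range or heavy compression, HD simply doesn't fit.
```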