Something that has confused me for a while. I’ll try my best to explain my question:
Take any type of cable: coax, Ethernet, even fiber. They're all capable of carrying multiple streams of data on different frequencies.
As an example, take cable TV over coax. The cable can carry both the VHF and UHF bands. And I understand that to receive a "channel" (at least with old analog electronics) it was done with a tuned circuit, basically a variable capacitor and an inductor, that rejected all the unwanted frequencies except the one you wanted to watch.
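That tuned circuit is an LC tank, and turning the variable capacitor's knob moves its resonant frequency, f = 1/(2π√(LC)). A tiny sketch with made-up component values (not any real tuner's parts) shows how sweeping the capacitance sweeps the selected frequency across the VHF band:

```python
import math

def lc_resonant_freq_hz(inductance_h, capacitance_f):
    """Resonant frequency of an LC tank: f = 1 / (2*pi*sqrt(L*C))."""
    return 1.0 / (2 * math.pi * math.sqrt(inductance_h * capacitance_f))

# Hypothetical values: a 0.1 uH inductor with a capacitor swept from
# 5 pF to 50 pF tunes across a chunk of the VHF band (~70-225 MHz).
for c_pf in (5, 15, 50):
    f = lc_resonant_freq_hz(0.1e-6, c_pf * 1e-12)
    print(f"C = {c_pf:2d} pF -> f0 = {f / 1e6:6.1f} MHz")
```

The tank presents a high impedance only near its resonant frequency, which is why everything else on the cable gets rejected.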
So here's where my question comes in: what's actually being transmitted on the cable? If we have 50 channels, all on different parts of the spectrum, is it just one huge composite waveform? Since "frequency" is just the number of voltage changes per second, and "amplitude" is just the strength of those voltages, the single waveform needed to carry all that data must be huge and super dirty! Don't these waves have so many harmonics that they interfere with themselves and corrupt some of the data they're carrying? So assuming it's one composite signal carrying everything, with a filter at the receiving end, how the heck does it handle the self-interference problem?
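The "one huge waveform" intuition is right, and linearity is the reason it doesn't self-destruct: sinusoids at different frequencies simply add, and the sum still contains each one intact, so a filter (or an FFT) can pull them back apart. A scaled-down sketch with made-up channel frequencies and amplitudes:

```python
import numpy as np

fs = 1000.0                  # sample rate in Hz (toy, scaled-down numbers)
t = np.arange(0, 1, 1 / fs)  # one second of samples

# Three "channels" on different carriers: frequency (Hz) -> amplitude
channels = {50.0: 1.0, 120.0: 0.5, 300.0: 0.25}

# The cable carries the plain sum of every carrier: one messy waveform.
composite = sum(a * np.sin(2 * np.pi * f * t) for f, a in channels.items())

# A receiver "tunes" by filtering; here an FFT plays that role and shows
# each carrier survived the summing with its amplitude intact.
spectrum = np.abs(np.fft.rfft(composite)) * 2 / len(t)
freqs = np.fft.rfftfreq(len(t), 1 / fs)
for f, a in channels.items():
    idx = int(np.argmin(np.abs(freqs - f)))
    print(f"{f:5.0f} Hz: sent amplitude {a}, recovered {spectrum[idx]:.3f}")
```

The self-interference you'd expect never happens because the cable (to a good approximation) is a linear medium: superimposed sine waves pass through without creating new frequencies. Harmonics and intermodulation only appear where something nonlinear (an overdriven amplifier, a corroded connector) distorts the signal, which is exactly what cable engineering tries to avoid.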
I actually understand fiber a bit better. Same concept, but with light. DWDM technology lets us "bundle" a bunch of frequencies of IR light (usually in the 1200 nm – 1600 nm range) and run them all through the same cable. But the transmit side needs an SFP for each individual wavelength, so you can easily end up with 8 emitters, with the DWDM device at the other end essentially acting like a prism. And since different wavelengths of light don't interfere with each other, you can happily send multiple IR signals down the same fiber and easily split them apart at the other end.
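Worth noting that the fiber picture and the coax picture are the same trick at different scales: each DWDM "color" is just a carrier frequency too, only up around 200 THz instead of 200 MHz. A quick conversion sketch (illustrative wavelengths, not an actual ITU grid assignment):

```python
C = 299_792_458.0  # speed of light in vacuum, m/s

def wavelength_nm_to_thz(wl_nm):
    """Convert a vacuum wavelength in nm to its optical carrier frequency in THz."""
    return C / (wl_nm * 1e-9) / 1e12

# Around 1550 nm, neighboring wavelengths 1 nm apart sit roughly 125 GHz
# apart in frequency, which is why DWDM can pack channels so tightly.
for wl in (1310, 1550, 1551, 1552):
    print(f"{wl} nm -> {wavelength_nm_to_thz(wl):.2f} THz")
```

So the prism-like DWDM mux is doing for light exactly what the tuner does for coax: separating carriers by frequency.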
It's really just the electrical cables that baffle me.