Shouldn’t any modulation produce at least 1 bit per Hz?
With HSPA the math made sense, but with original UMTS I don’t understand how it only supported 384 kbps in a 5 MHz channel.
With GSM it was close to 1:1, 200 kHz ≈ 270 kbps. And in that case I actually don’t understand how the symbol rate exceeded the frequency. So if you happen to know that answer as well, that would be amazing.
GSM’s carrier frequencies sit at 890–960 MHz, *far* higher than the 270 kbps transmission rate. You’re thinking of the channel *spacing*, which is 200 kHz: adjacent channels are 200 kHz apart, but their actual carrier frequencies are over 3,000 times higher than the bit rate.
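If it helps to see that ratio worked out, here’s a quick back-of-the-envelope sketch in Python (the 270.833 kbps figure is GSM’s gross per-carrier bit rate; the band edge and spacing are from the paragraph above):

```python
# Quick sanity check of the ratio above: GSM carrier frequency vs. gross bit rate.
carrier_hz = 890e6           # lower edge of the classic GSM 900 band
channel_spacing_hz = 200e3   # spacing between adjacent GSM carriers
gross_bit_rate = 270.833e3   # gross bit rate of one GSM carrier, in bits/s

print(f"carrier / bit rate ≈ {carrier_hz / gross_bit_rate:,.0f}x")      # ~3,300x
print(f"carrier / spacing  ≈ {carrier_hz / channel_spacing_hz:,.0f}x")  # ~4,450x
```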
There’s no rule that a modulation gives you 1 bit per Hz; depending on the scheme and the conditions you can land well above or well below that. Also, be careful about which “Hz” you mean: the actual broadcast frequency is the “carrier frequency”, which is what the radio tunes to, and the data is impressed on it by modulating that carrier. What limits the signalling rate isn’t the carrier frequency but the channel’s *bandwidth*: push symbols through faster than the bandwidth allows and they smear into each other and into the neighbouring channels (essentially the same problem as “aliasing”, where a signal changes faster than it can be faithfully captured). So a channel that is 200 kHz wide supports on the order of a couple of hundred thousand symbols per second, whether it is centred at 900 MHz or at 90 MHz.
But that’s not the same as what you can reliably transmit; radio is a noisy transmission medium. Modern modulations use a ton of clever tricks to reduce the error rate and ensure reliable transmission over variable/flaky/interfering/reflecting channels…but that all comes at the cost of transmitting extra bits beyond the actual data. In some cases, a lot of extra bits. So the practical data rate is waaaaay lower than the theoretical one.
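To put rough numbers on that overhead, here is a small sketch using commonly quoted GSM full-rate figures (a gross 22.8 kbps per traffic channel after burst overhead, of which 13 kbps is actual speech data); treat it as an illustration, not a spec reference:

```python
# Illustration of "extra bits beyond the actual data" with rough GSM full-rate numbers.
carrier_gross = 270.833e3   # gross bit rate of one 200 kHz GSM carrier (bits/s)
timeslots = 8               # TDMA timeslots sharing that carrier

per_slot_raw = carrier_gross / timeslots   # raw share per user, ~33.9 kbps
per_slot_net = 22.8e3                      # left after guard periods, training sequences, etc.
speech_payload = 13.0e3                    # full-rate speech codec payload
coding_overhead = per_slot_net - speech_payload

print(f"raw share per timeslot : {per_slot_raw / 1e3:.1f} kbps")
print(f"after burst overhead   : {per_slot_net / 1e3:.1f} kbps")
print(f"actual speech data     : {speech_payload / 1e3:.1f} kbps")
print(f"error-protection bits  : {coding_overhead / 1e3:.1f} kbps "
      f"({coding_overhead / per_slot_net:.0%} of the channel)")
```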
384 kbps is the maximum per-user bit rate, but a 5 MHz UMTS carrier supports 20 such channels, so the whole-carrier maximum is 7.68 Mbps. Moreover, UMTS allows every adjacent tower to broadcast on the same frequencies, whereas a GSM signal must be broadcast on non-overlapping frequencies on adjacent towers. In densely populated areas a GSM tower therefore typically broadcasts on only about 1/7th of the available frequencies.
UMTS broadcasting on the same frequencies across all towers is not without penalty: if you are midway between two towers you’ll never reach peak spectral efficiency, but overall the penalty is not as bad as GSM’s 1/7th frequency-reuse penalty.
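Here is the arithmetic from the two paragraphs above as a small sketch (the 20-channel and 1/7th figures are taken from the answer, not derived):

```python
# Back-of-the-envelope comparison of UMTS and GSM per-tower throughput.
user_rate = 384e3     # maximum per-user bit rate of original UMTS (bits/s)
channels = 20         # parallel 384 kbps channels per 5 MHz carrier (per the answer above)
carrier_bw = 5e6      # UMTS carrier bandwidth (Hz)

umts_carrier_rate = user_rate * channels
print(f"whole-carrier bit rate : {umts_carrier_rate / 1e6:.2f} Mbps")             # 7.68 Mbps
print(f"spectral efficiency    : {umts_carrier_rate / carrier_bw:.2f} bit/s/Hz")  # ~1.54

# GSM under a 1/7 frequency-reuse pattern: each tower only uses ~1/7 of the spectrum,
# which eats most of its headline per-carrier efficiency.
gsm_carrier_rate = 270.833e3   # gross bit rate of one GSM carrier (bits/s)
gsm_spacing = 200e3            # channel spacing (Hz)
print(f"GSM per-tower efficiency ≈ {gsm_carrier_rate / gsm_spacing / 7:.2f} bit/s/Hz")
```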
> 200 kHz ≈ 270 kbps. And in that case I actually don’t understand how the symbol rate exceeded the frequency.
Firstly, 200 kHz is not a frequency but a bandwidth. Secondly, the symbol rate is not what matters here: the achievable bit rate depends on both the bandwidth and the signal-to-noise ratio. See the [Shannon–Hartley theorem](https://en.wikipedia.org/wiki/Shannon%E2%80%93Hartley_theorem). You could increase the GSM symbol rate simply by shortening the symbol duration, but shorter symbols carry less energy each, so the signal-to-noise ratio would drop and the bit rate you can actually push through wouldn’t change.
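A minimal sketch of that theorem, plugging in GSM’s 200 kHz of bandwidth and a few arbitrary example SNR values:

```python
import math

def capacity_bps(bandwidth_hz: float, snr_db: float) -> float:
    """Shannon–Hartley capacity of an AWGN channel: C = B * log2(1 + S/N)."""
    snr_linear = 10 ** (snr_db / 10)
    return bandwidth_hz * math.log2(1 + snr_linear)

# For a fixed 200 kHz of bandwidth, the achievable bit rate is set by the SNR,
# not by how short you make the symbols.
for snr_db in (0, 10, 20, 30):
    c = capacity_bps(200e3, snr_db)
    print(f"SNR {snr_db:>2} dB -> capacity ≈ {c / 1e3:6.0f} kbps ({c / 200e3:.2f} bit/s/Hz)")
```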