[ELI5] For wifi why do we use a whole channel of 20MHz instead of a couple frequencies?


So I get why we can't use ONE single frequency: it's a sine wave, so it's always going to be constant and we can't transfer data through it. But what about using like 10 or 20 frequencies, something like from 2.4 GHz to 2.4 GHz + 10 Hz (wrong syntax but you get what I mean), and using that to transfer the data?

I'm gonna give another potentially wrong example: for radio, don't we use one single frequency? We set our receiver to that frequency and get the transmitted data from the transmitter. Or is it different from wifi because radio is simplex and wifi is half-duplex?

This ended up requiring 2 explanations (wifi and radio), but I'm more interested in why wifi uses a whole 20 MHz channel instead of a smaller number of frequencies. Thanks.


5 Answers

Anonymous

We do: the 2.4 GHz band is actually a range of frequencies, and each Wi-Fi channel in it is a 20 MHz slice. Channel 1, for example, is centered at 2.412 GHz, so it spans roughly 2.402 to 2.422 GHz.

The reason you need a range of frequencies is that the channel capacity (the maximum number of bits per second you can send or receive) is proportional to the bandwidth. If the bandwidth is 0, then the amount of information you can send is 0 bits per second. If you want something to Google, it's called the Shannon-Hartley theorem, and there's some interesting math behind it.
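To make that concrete, here's a rough back-of-the-envelope sketch in Python. The bandwidth and SNR numbers are made up for illustration, not real Wi-Fi figures:

```python
import math

def shannon_capacity(bandwidth_hz: float, snr_linear: float) -> float:
    """Shannon-Hartley limit: C = B * log2(1 + S/N) bits per second,
    where B is bandwidth in Hz and S/N is the linear (not dB) SNR."""
    return bandwidth_hz * math.log2(1 + snr_linear)

snr = 10 ** (25 / 10)  # a healthy 25 dB SNR, as a linear ratio (~316)

# A full 20 MHz channel vs. a 10 Hz sliver vs. zero bandwidth
print(f"20 MHz: {shannon_capacity(20e6, snr) / 1e6:.0f} Mbit/s")  # ~166 Mbit/s
print(f"10 Hz:  {shannon_capacity(10, snr):.0f} bit/s")           # ~83 bit/s
print(f"0 Hz:   {shannon_capacity(0, snr):.0f} bit/s")            # 0 bit/s
```

Even at a good SNR, a slice only a few Hz wide tops out at tens of bits per second; the full 20 MHz channel is what makes Wi-Fi speeds possible.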

It's actually the same for radio. The key insight behind radio is that you're taking a low-bandwidth baseband signal and moving it up to a "channel" around a higher carrier frequency, so it doesn't interfere with another signal of the same bandwidth. Another trick you can pull is to interleave your signal in time with other signals, but this is ultimately the same thing viewed from a different angle (pun intended, for the Fourier nerds).
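Here's a small toy demo of that idea (my own illustration with numpy, not anything Wi-Fi-specific): multiplying a baseband tone by a carrier shifts it up to a slice of spectrum around the carrier.

```python
import numpy as np

fs = 200_000                    # sample rate, Hz
t = np.arange(0, 0.05, 1 / fs)  # 50 ms of samples

baseband = np.cos(2 * np.pi * 1_000 * t)   # message tone at 1 kHz
carrier = np.cos(2 * np.pi * 50_000 * t)   # 50 kHz carrier
mixed = baseband * carrier                 # components at 49 and 51 kHz

for name, sig in [("baseband", baseband), ("mixed", mixed)]:
    spectrum = np.abs(np.fft.rfft(sig))
    freqs = np.fft.rfftfreq(len(sig), 1 / fs)
    peak_hz = freqs[np.argmax(spectrum)]
    print(f"{name}: strongest frequency component near {peak_hz:.0f} Hz")
```

The 1 kHz message ends up occupying spectrum around the 50 kHz carrier, which is exactly why two stations on different carrier frequencies don't stomp on each other.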
