Why is 2.4 GHz Wi-Fi NOT hard-limited to channels 1, 6, and 11? Wi-Fi interference from overlapping adjacent channels is worse than same-channel interference. Channels 1, 6, and 11 are the only ones that don’t overlap with each other. Shouldn’t all modems be allowed to use only 1, 6, or 11?



Edit: Wireless Access Points, not Modems

I read some time ago that overlapping interference is a lot worse, so all modems should use either 1, 6, or 11. But I see a lot of modems in my neighbourhood using every channel from 1-11, causing an overlapping nightmare. Why do modem manufacturers allow overlapping to happen in the first place?

Edit: To clarify my question, some countries allow use of all channels and some don’t. That means the optimal channels are 1, 5, 9, 13 in some countries and 1, 6, 11 in others. Whichever the case, in a given country, all modems manufactured should be hard-limited to those optimal channels only. But modems can use any channel and cause overlapping interference. I just don’t understand why modem manufacturers allow overlapping to happen in the first place. The manufacturers, of all people, should know that overlapping is worse than same-channel interference…

To add a scenario: on a street of closely placed houses, it would be ideal for modems to use 1, 6, 11. So the first house on the street uses channel 1, the next house over uses channel 6, the next uses channel 11, the next uses channel 1, and so on. But somewhere between the houses on channels 1 and 6, someone uses channel 3. This introduces overlapping interference for all three houses using channels 1, 3, and 6. In this case, the manufacturer should hard-limit the modems to only 1, 6, 11 to prevent the overlap from happening in the first place. But they are built to use any channel, and the overlap happens. Why? This is what I am most confused about.



Almost touches on the idea of a prisoner’s dilemma-like situation.

The standard allows the choice of any channel in the range to best suit the user’s wishes. But let’s just say everyone sticks to 1/6/11 and those three bands are heavily congested. Anyone setting up a new radio in a congested area will find a LOT of interference centered around each of those three channels.

Someone else gets tired of it and decides that, to avoid interference, he should select something in the middle of the overlapping bands, like channel 3. Suddenly he has a relatively clear channel, but now 1 and 6 get interference from another channel on top of everyone else already on 1/6.

Other Wi-Fi devices aren’t the only thing you might need to work around; there are other 2.4 GHz devices as well as environmental factors.

Because with channels 1, 5, 9, 13 you get 4 non-overlapping channels. Not all countries allow all channels.

Because in much of the world, you should be using 1, 5, 9, and 13 to get 4 non-overlapping options. The 1-6-11 scheme exists because the U.S. refuses to allow use of channel 13, while Japan allows channel 14, which is fully non-overlapping.

In addition, there are uses for half-overlapping channels. When a large area needs to be covered, you have 3 non-overlapping channels nearby, and further away you use half-overlapping channels, where the weak overlapping signals won’t cause as great a problem.

So if I enable only those channels, will I benefit because other people use the standard settings? If so, how do I do that? Thanks.

Because that isn’t how wireless signal works.

https://cdn.comparitech.com/wp-content/uploads/2018/03/16-2.4G-Channels-1024×587-1024×587.jpg

The farther you are from the center of another network’s broadcast, the less noise that network causes. If you have 9 networks in range, you will do better with partial overlaps on 4 of the other networks, than you will with complete overlap with 2 other networks.
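
A toy calculation illustrates the trade-off that answer describes. This sketch treats each Wi-Fi channel as a flat 20 MHz block (real spectral masks taper off rather than cutting off sharply, so real overlap is fuzzier than this):

```python
# Toy model: fractional spectral overlap between two 2.4 GHz Wi-Fi
# channels, assuming 5 MHz channel spacing and a flat 20 MHz bandwidth.

def center_mhz(ch: int) -> int:
    """Center frequency of 2.4 GHz channels 1-13 (channel 14 is special)."""
    return 2407 + 5 * ch

def overlap_fraction(ch_a: int, ch_b: int, width_mhz: int = 20) -> float:
    """Fraction of one channel's bandwidth occupied by the other."""
    gap = abs(center_mhz(ch_a) - center_mhz(ch_b))
    return max(0.0, (width_mhz - gap) / width_mhz)

print(overlap_fraction(1, 6))   # 0.0  -> no overlap
print(overlap_fraction(1, 3))   # 0.5  -> half of the band shared
print(overlap_fraction(3, 6))   # 0.25 -> partial overlap
```

So a network on channel 3 spills half its energy into channel 1 and a quarter into channel 6, which is the "partial overlap with several networks" situation described above.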

Two reasons: edge cases where it does make sense to deploy on one of the normally overlapping channels (think single AP deployments in odd RF environments), or other countries where you’re allowed to go up to channel 13.

Cause number 1: Freedom.

Number 2: The FCC actually had a rule saying you can only use 1, 6, and 11. But no one had to follow it, and it’s left open to use whatever channel you want because there is/was anticipation of wider-band channels (40 MHz instead of 20 MHz for OFDM), which you can see in the wild if you have a scanning tool. If you’re curious, you can get the Alfa Wi-Fi Scanner software and look at which channels are being used in your area and what their bandwidth is. From that, you can also choose a better channel for your personal device.

This is a really smart question for a five year old.

I know it’s just semantics, but you’re referring to wireless access points (WAPs). Not all modems act as routers and access points.

All these answers and not a single person has stumbled on the correct one: Hindsight is 20/20.

Remember that when the standard was settled upon, the designers had absolutely no idea how ubiquitous WiFi would become. It would be approximately another ten years before WiFi routers would even start to become household appliances. Zip drives were state-of-the-art, laptop thickness was measured in _inches_, and the concept of a smartphone was about a decade away from public consciousness. People rented VHS cassettes to watch movies at home on their rear-projection TVs, and HD television was for the idle rich. Netflix had just started mailing people DVDs via The Postal Service.

Okay, I’m getting a little carried away describing the world of the late 90s, but it’s important to remember the designers of the 802.11 standards had to make choices in a world where households rich enough to even have internet access connected to the internet via dialup. No one even conceptualized a world where routers would be so cheap that every single tenant in an apartment building would have their own radio transmitter sitting in a closet gathering dust out of sight, out of mind. Many of the choices they made for the standard naturally assumed wireless internet access would only really be deployed by professional network admins who would have control of all the other routers in range. Why not let them choose any channel?

Also, why the hell does my router choose the WORST channel when I enable auto channel selection, and basically never choose 1, 6, or 11?

The decision to restrict 802.11 to a 72 MHz range was a stupid one to begin with. The overlapping-channels thing is a mitigation of that. I can imagine the standard changing to what you described, and it would help for some people, but it’s just mitigation on top of mitigation of a screw-up that needs to be fixed directly. Also, restricting to those three channels reduces flexibility for people who need to fine-tune things to avoid interference from things other than WiFi.

I’ll add here that modems (as they’re used in the Internet access sense) are not WiFi devices (however, WiFi must necessarily have modulation and demodulation as part of the signal path).

The box your ISP provided that you’re referring to as a modem is actually several devices inside a single box:

The modem connects to your ISP’s circuit and interfaces between that circuit and the router, as the two have different physical-layer connections.

The router operates at the network layer and its job is to move Internet traffic between your network and your ISP’s network.

The router then connects to an Ethernet switch, which is the foundation of your local area network, or LAN. In addition to a couple of internal ports for the router and the access point (more on that in a second), it also usually has a handful of external ports to connect various network devices.

Then you have your Wi-Fi Access Point – this acts as a bridge that translates the data link layer between Wi-Fi and Ethernet. In a wireless network, the “physical” layer consists of radio waves (I know, they don’t seem very physical, but physics play a huge part here).

Networking is generally described in “layers”, like a burrito. A theoretical model is the OSI model, which has 7 layers. The TCP/IP model is more practical and has 4 layers. Each layer fits inside the data payload of the layer below it.

The Physical Layer (1) consists of bits – 1s and 0s. This can be electrical signals on a wire, electromagnetic radio waves (wireless), or electromagnetic pulses of light (optical). This could even be smoke signals or acoustic waves if you got crazy enough. This is the tortilla – it holds the burrito together.

The Data Link Layer (2) adds structure to those 1s and 0s by defining how a link carries data. This can be Ethernet, Bluetooth, Wi-Fi, or a variety of other ways of transporting data.

Layers 1/2 in the OSI model correspond to the single Network Access layer in the TCP/IP model.

The Network layer (3) is where interesting stuff starts to happen – this defines specific ways devices on the network talk to each other. This is where IP lives. TCP/IP calls this the Internet Layer.

Layers 4-7 deal with the actual user data being sent over the network. This is stuff like HTTP and everything else you do on the internet. The TCP/IP model calls this the Application Layer.

So your internet traffic operates at layer 3, and between you and the server, it goes over a whole variety of layer 1 and layer 2 connections to get there.
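
The nesting that answer describes can be sketched as plain data structures. This is only an illustration of encapsulation, each layer riding in the payload of the layer below; every address, port, and MAC here is made up:

```python
# Each layer's message sits inside the payload of the layer below it.

http_request = "GET / HTTP/1.1"                      # application data (L7)

tcp_segment = {"src_port": 49152, "dst_port": 80,    # transport (L4)
               "payload": http_request}

ip_packet = {"src": "192.168.1.10",                  # network (L3)
             "dst": "203.0.113.5",
             "payload": tcp_segment}

eth_frame = {"src_mac": "aa:bb:cc:dd:ee:ff",         # data link (L2)
             "dst_mac": "11:22:33:44:55:66",
             "payload": ip_packet}

# The receiver unwraps one layer at a time to recover the original request.
assert eth_frame["payload"]["payload"]["payload"] == http_request
```

Swapping the outermost frame (Ethernet for Wi-Fi, say) changes nothing above layer 2, which is exactly why your traffic can cross so many different link types on the way to a server.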

I believe it has to do with WiFi using a spectrum band known as ISM, or “Industrial, Scientific, and Medical”. It is unlicensed, so it costs no money to use (unlike frequencies licensed for cellular use, for example). The caveat is that it significantly limits the power you can transmit, and if you choose to use it, you must be OK with the presence of other users (since it’s unlicensed). Basically, the FCC left these bands for users who behave nicely: you can use them for applications that don’t need to send data very far, or for research, but act nicely; don’t try to overpower anyone else nearby, and don’t whine if someone interferes with your signal within reason.

These rules were written before WiFi ever existed, and WiFi operators started using these frequencies too; they’re now so prevalent that the band is heavily occupied and you experience interference.

The real answer is that the ISM band where wifi operates existed long before wifi existed. The band was split into narrow channels suitable for the various equipment used in it at the time. Wifi needed wider channels.

Wifi channel 6 is centered on ISM channel 6, but occupies ISM channels 4 through 8. The three wifi channels fully occupy the US ISM band.
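
A quick sanity check of the claim above, assuming the usual 5 MHz spacing between channel centers and a 20 MHz-wide Wi-Fi transmission:

```python
# Which 5 MHz-spaced channel centers fall inside a 20 MHz Wi-Fi
# channel centered on channel 6?

def center_mhz(ch: int) -> int:
    return 2407 + 5 * ch      # channel 1 -> 2412 MHz, channel 6 -> 2437 MHz

lo, hi = center_mhz(6) - 10, center_mhz(6) + 10   # 20 MHz around channel 6
covered = [ch for ch in range(1, 14) if lo <= center_mhz(ch) <= hi]
print(covered)   # [4, 5, 6, 7, 8]
```

Channel 6 at 2437 MHz spans 2427-2447 MHz, which is exactly the centers of channels 4 through 8, matching the "occupies ISM channels 4 through 8" statement.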

Different countries have different ISM bands. The channels line up across countries, but the top and bottom channels are not the same. The wifi scheme in these other countries centers on different channels, with the intention of fully using the available band.

The final piece of the puzzle is that manufacturers want to produce one device to serve multiple markets. Which means they have to be able to operate on that market’s scheme.

A modem is a device that converts analog signals to digital, and vice versa. APs are not modems. Switches are not modems. Routers are not modems. Hubs are not modems. Not all router/APs that have coax are modems. Sorry if I come off as a dick, but stop saying that.

OP, your analysis is essentially correct; the extra channels exist for two cases. One, you live isolated and can use wider bandwidth (20/40/80/160 MHz). The other case is countries where some of the spectrum is already in use; the channels available vary by country. For legal reasons, APs ship with different firmware in different countries to avoid causing interference on government spectrum. Japan is the only country with channel 14, and only for 802.11b. You can use custom firmware to use illegal channels if you are isolated enough not to cause issues. In the US you can broadcast 200 ft on government bands if you want.

Typically co-channel interference is preferable to adjacent-channel interference, with a big exception.

Imagine you’re at a big family reunion. I’m talking massive. Like Great Grandpa got back from the war and had 20 kids between 3 wives. And each of those kids and theirs kept up the tradition. Now what if they lived in a country with an awesome medical system that guaranteed a long life span and everyone showed up to the family reunion.

Now the event planner (because 804 people, goddamn) came up with 3 incredibly long tables. To keep things organized, each table had 1 golden token. Whoever held the golden token could say 1 thing and could take as long as they needed.

-This is how WiFi routers handle co-channel interference. Any routers within range (above -80 dBm signal strength) and on the same channel recognize each other and pass a token to decide who is talking.

For most it was fine: they would get the token and say rather mundane things like “pass the salt”. But Great Grandpa, that glorious mothertrucker, would share his war stories, and everyone else would be stuck until he was done or forgot what he was talking about.

– So not only do you have a whole lot of people waiting to talk, but Grandpa takes extra time when it’s his turn. When you have an apartment complex with everyone on those same channels, everyone ends up waiting. Since 2.4 GHz is the oldest band, you could have computers up to 20 years old trying to use the same channels, and they take a long time when it’s their turn.

Now some people get a little impatient and get up and stand between the tables to hold their own conversations. It makes it a little noisier but it speeds up conversation at the main tables and people can catch up.

-This is where adjacent channels are useful. It makes things a little noisier, as they’re talking over the other channels, but it lets the different routers put more tokens in play. The error correction has to work a little harder, but data is passed faster.

Now if too many people get up it descends into chaos and nobody can hear anyone. So your busy body aunty starts going to these little groups and tells them to sit down. Now you figure instead of sitting at the same table as Grandpa you’ll swap tables.

-You’re experiencing bad WiFi because everyone has set custom channels so you call your ISP. The ISP sends out a Tech and they factory reset your router and it defaults back to 1,6, or 11. They tell you next time it gets slow to reboot it again. This forces your router to jump to the least congested of 1,6, or 11.

It’s getting a little late and the bar opens up so all the 20-30 somethings jump over there. There’s a whole lot of small tables and conversations are going smooth.

-5 GHz has now come into play. A whole lot more channels, and the devices aren’t nearly as old. Less congestion and better bandwidth.

Now an alternative to jumping channels would be what I call the “good neighbor policy”. Every router in an apartment complex cuts their transmission strength to 65%. This way the routers don’t see as many networks and reduces the number that share the token. The problem with this is twofold. 1. Everyone has to be in on this, so all the ISPs and customers on site have to agree to reduce their signal strength. 2. Large apartment buildings have a lot of concrete. Concrete kills WiFi, and if you’re reducing your transmission strength, you’re increasing the likelihood of dead spots in your WiFi.
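
A toy model of that "good neighbor" idea: lowering transmit power shrinks the radius within which other APs hear you above the -80 dBm threshold mentioned earlier. The distances, wall counts, and the 12 dB-per-concrete-wall figure below are all made-up illustrative numbers, and free-space path loss ignores most real-world effects:

```python
import math

def fspl_db(distance_m: float, freq_mhz: float = 2437.0) -> float:
    """Free-space path loss in dB at a given distance and frequency."""
    return 20 * math.log10(distance_m) + 20 * math.log10(freq_mhz) - 27.55

def rssi_dbm(tx_dbm: float, distance_m: float, walls: int) -> float:
    """Received signal strength, charging an assumed 12 dB per concrete wall."""
    return tx_dbm - fspl_db(distance_m) - 12 * walls

# Hypothetical neighboring APs: (distance in meters, concrete walls between)
neighbors = [(5, 1), (12, 2), (15, 3), (20, 3)]

def audible(tx_dbm: float) -> int:
    """How many neighbors hear this AP above -80 dBm (i.e. share its token)."""
    return sum(rssi_dbm(tx_dbm, d, w) > -80 for d, w in neighbors)

print(audible(20))   # full power: more neighbors in contention range
print(audible(13))   # reduced power: fewer networks share the channel
```

In this made-up layout, dropping from 20 dBm to 13 dBm takes one neighbor out of contention range, but the same loss applies to your own clients, which is the dead-spot risk the answer above warns about.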

When it comes to WiFi there’s no winning.