ELI5: Why is it that cell phones, with their tiny antennas, are able to download files at LTE speeds inside concrete buildings from cell towers kilometers away, yet getting WiFi to work behind a corner is an exercise in frustration?


23 Answers

Anonymous 0 Comments

It is not that hard to get WiFi to work over several kilometers either, as long as there are no corners. Longer distances actually let the signal travel nearly parallel to the walls, so it passes through as little concrete and rebar as possible. On top of that, most taller buildings concentrate their concrete and rebar in the central columns, which lets signals through the outer walls but not from one side of the building to the other.

If you walk around the corner of a building you are probably no longer connected to the same cell tower. People do not notice this because there are far more cell towers than most are aware of; they are hidden well. While a typical office building needs about four WiFi access points per floor to provide good access, it is typically in range of more than 10 cell towers within a half-kilometer radius. So no matter where you are in relation to the building's columns, you are likely in sight of one of them.

Anonymous 0 Comments

Because they are built on two very different technologies.

They use different ranges of radio frequencies: cell phones operate in the megahertz range, while WiFi operates in the gigahertz range.

Since WiFi uses a much higher frequency, a good WiFi connection can transfer way more information, much faster than cellular is capable of.

The drawback is that because WiFi uses this much higher frequency, it cannot travel as far and is blocked relatively easily.

Cellular systems operate at a lower frequency, meaning the signal can travel farther but has a lower maximum speed of information transfer.

For even more context: broadcast radio, like your car radio, uses an even lower frequency and can travel dozens to hundreds of miles, but can only move enough information to reliably play audio (and maybe some text information about the song).

TL;DR: The higher the frequency, the faster you can transfer information, but the shorter the distance it travels.
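To put that rule of thumb into rough numbers, here is a quick sketch using the standard free-space path loss formula; the frequencies are just representative examples:

```python
import math

def fspl_db(distance_km: float, freq_mhz: float) -> float:
    """Free-space path loss in dB (standard Friis-derived formula)."""
    return 20 * math.log10(distance_km) + 20 * math.log10(freq_mhz) + 32.44

# Same 1 km path at representative carrier frequencies:
for f_mhz in (100, 850, 2400, 5000):  # FM radio, cellular, 2.4 GHz WiFi, 5 GHz WiFi
    print(f"{f_mhz:>5} MHz over 1 km: {fspl_db(1.0, f_mhz):.1f} dB of path loss")
```

Every doubling of frequency costs about 6 dB more loss over the same distance, and that is before any walls get involved.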

Anonymous 0 Comments

All wireless communication suffers from something called “attenuation”. Attenuation means a reduction in signal strength. Many things can cause attenuation, but physical objects are one of the more common causes of attenuation when we’re talking about electromagnetic waves.

A physical object can be anything: a wall, a person, a mountain, you name it. Our world is full of them, and they all attenuate the radio waves we use for wifi and cellular communications, not to mention broadcast television signals and radio stations.

An interesting characteristic of radio waves, though, is that waves of different frequencies are attenuated by different amounts by different physical objects. As a general rule, the higher the frequency, the greater the attenuation a radio wave will suffer from a given object.

WiFi uses 2.4 GHz (2,400 MHz) and 5 GHz (5,000 MHz) bands exclusively. There are no lower frequency bands for it to choose from. Both of these bands are easily attenuated by common building materials, so the walls in your home attenuate your WiFi signal easily. Per the rule of thumb I mentioned earlier, the higher frequency 5 GHz band is more easily attenuated than the lower 2.4 GHz band.

Cellular devices have multiple bands to choose from. Some examples are 700 MHz, 850 MHz, 1,900 MHz (1.9 GHz), and 2,100 MHz (2.1 GHz). Newer wireless technologies like 5G utilize frequencies as high as 52.6 GHz. When you are outside, the device will use a higher frequency band for maximum performance, while switching to lower bands for better indoor performance.
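As a rough illustration of how band choice trades off against attenuation, here is a toy link budget sketch. The per-wall loss figures are made-up placeholders in a plausible ballpark, not measured values:

```python
import math

def fspl_db(distance_km: float, freq_mhz: float) -> float:
    """Free-space path loss in dB."""
    return 20 * math.log10(distance_km) + 20 * math.log10(freq_mhz) + 32.44

# Assumed per-wall penetration loss in dB -- illustrative only; real values
# vary widely with material and thickness.
WALL_LOSS_DB = {700: 5, 2400: 10, 5000: 15}

def received_dbm(tx_dbm: float, distance_km: float, freq_mhz: int, walls: int) -> float:
    """Toy link budget: transmit power minus path loss minus wall losses."""
    return tx_dbm - fspl_db(distance_km, freq_mhz) - walls * WALL_LOSS_DB[freq_mhz]

# A 700 MHz tower 1 km away vs. a 5 GHz router 10 m away, both through 2 walls:
print(f"tower:  {received_dbm(46, 1.0, 700, 2):.1f} dBm")   # 46 dBm is roughly 40 W ERP
print(f"router: {received_dbm(20, 0.01, 5000, 2):.1f} dBm") # 20 dBm = 100 mW
```

Even a hundred times farther away, the tower can arrive stronger: it starts with far more power and its lower band loses less per wall.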

Anonymous 0 Comments

The real answer here is that they were designed that way. WiFi is designed to use public frequency bands at short range so it won't have a large negative effect on other people using those same frequencies for their own purposes. Most people don't want or need their WiFi to span a mile from their house. Cellular technology is built to use reserved frequencies that won't conflict, and it is built to travel long distances.

Anonymous 0 Comments

Power and contention.

1. A cellular tower transmits at a power level several thousand times higher than your wireless access point. A typical cellular tower transmits an ERP (effective radiated power, i.e. the actual transmission power with antenna gain factored in) of 100-500 W, while the maximum power for a 2.4 GHz WiFi access point is 100 mW; the cellular tower's transmission power is 1,000-5,000 times higher. At a range of 1 km, the signal from a 500 W cellular tower falls to the equivalent of 0.5 mW thanks to the inverse square law. Your WiFi access point falls to the same level at a distance of only about 14 m (see the quick check after this list). This is all in open air with no obstructions.

2. Unless you're well out in the countryside, there are likely dozens if not hundreds of WiFi hotspots, and a few times that many client devices, around you fighting for space in the radio spectrum. There will be half that number or fewer cellular clients, and fewer than a dozen cellular towers, and everyone on the cellular network cooperates to share the space through the protocol. Meanwhile, WiFi clients on the same channel but talking to different access points fight with one another, causing collisions and retransmits that slow down throughput.
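Here is the quick check promised in point 1. Received power density falls off with the square of distance, so you can solve directly for where the access point's signal matches the tower's:

```python
# Inverse-square law: received power density ~ P_tx / r^2.
# Find r where a 100 mW access point matches a 500 W tower measured at 1 km:
#   500 / 1000**2 == 0.1 / r**2   =>   r = 1000 * sqrt(0.1 / 500)
r = 1000 * (0.1 / 500) ** 0.5
print(f"{r:.1f} m")  # ~14.1 m
```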

Anonymous 0 Comments

Often, larger buildings will have a DAS (distributed antenna system) installed to provide coverage throughout the whole building. These are small, low-powered, wide-band antennas forming an array inside, for example, large commercial buildings or high-rises. Many malls also have these installed. I have worked on a handful of these installations.

As for why cell providers do it: pressure from large corporate clients, who insist that coverage improve or they will switch their entire company to a provider who delivers it, which would mean millions in lost revenue a year.

Anonymous 0 Comments

Lucky you if you get high-speed downloads indoors at work.

I get zero data connectivity while in my office.

Anonymous 0 Comments

NASA uses a ZigBee radio link (900 MHz) for communicating between their rover and their drone copter on Mars. With the lack of other signals and the thinner atmosphere, they're theoretically able to connect over distances of more than 2,700 feet.

https://www.theverge.com/2021/5/20/22445330/zigbee-on-mars-ingenuity-helicopter-perseverance-rover

Anonymous 0 Comments

Cell towers are way closer than you think. Most of them won't be several kilometers away, but actually within a kilometer or less.

Anonymous 0 Comments

I am a telecoms engineer. I work with Radio Access Networks GSM/LTE/UMTS.

Most of the answers I see here attribute the difficulty in the WiFi case to the higher frequency of the WiFi signal, and hence its lower penetration. That is true, but it is just one of the reasons.

There are several LTE bands, used in different environments depending on what needs to be covered. Lower bands (below 1.8 GHz) penetrate much better than WiFi's 5 GHz, which is great over large distances, though even they won't punch through cement well from many kilometers away.
We actually use higher frequencies (1.8 GHz, 2.6 GHz, etc.) in addition to lower bands to cover dense urban areas precisely because of their inability to penetrate, thus avoiding interference caused by over-shooting signals from kilometers away. This is called a coverage overlay.
This means that while you may THINK the LTE signal is coming from afar, depending on the coverage scenario the station might actually be just a few hundred meters away. This is further supported by the fact that mobile communication channels are actually uplink limited: even if you could receive a signal inside a cement building from an antenna 20 km away, your phone certainly does not have enough transmission power to reach that station from inside the building.
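A toy budget makes that uplink limit concrete. The 23 dBm handset cap is the standard LTE maximum for a typical phone (power class 3); the other numbers are assumptions chosen to show the asymmetry:

```python
TOWER_TX_DBM = 46       # assumed tower ERP (~40 W)
PHONE_TX_DBM = 23       # LTE power class 3 handset maximum (~200 mW)
PATH_LOSS_DB = 145      # assumed: long distance plus building penetration
SENSITIVITY_DBM = -120  # assumed receiver sensitivity at both ends

downlink = TOWER_TX_DBM - PATH_LOSS_DB  # -99 dBm: the phone still hears the tower
uplink = PHONE_TX_DBM - PATH_LOSS_DB    # -122 dBm: below the tower's noise floor
print(f"downlink ok: {downlink > SENSITIVITY_DBM}, uplink ok: {uplink > SENSITIVITY_DBM}")
```

The path loss is identical in both directions; the phone simply has about 23 dB less power to spend, so the uplink fails first.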

Which brings me to another reason. User u/alzee76 explains very well in his answer that the effective radiated power of an LTE antenna is thousands of times higher than your home router's, so the signal is much, much stronger than what you get from a router.

Another reason has to do with the radio wave's physical properties. Every time the signal from the LTE antenna hits an object like a tree, it is reflected into many tiny copies of the same signal, which are then radiated in every direction. This is called scattering. Depending on the environment, there could be hundreds of these tiny signals (called multipath components) reaching your phone, increasing the probability that you receive the original message from the LTE antenna. Factor in all the reflections from the objects around you and the probability of getting a strong signal increases further.
These multipath components carry very little energy, so they dissipate fairly easily, but their effect is noticeable. That's why, inside a cement building, the best signal you can get is usually closest to the window.

Finally, LTE base stations use some DSP trickery that can increase signal reception even further. One of these tricks is copying the signal and sending it multiple times from different antennas (called spatial diversity), or sending it with a few milliseconds' delay (called time diversity), or automatically steering the signal radiation toward the approximate location of the user (called beamforming).
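To see why combining several faded copies helps, here is a minimal Monte Carlo sketch of maximal-ratio combining across four antennas; the channel model and noise level are arbitrary toy choices:

```python
import numpy as np

rng = np.random.default_rng(0)
trials, branches = 10_000, 4
err_single = err_mrc = 0.0

for _ in range(trials):
    # Each branch sees an independent random fade plus noise (Rayleigh-ish toy model).
    h = (rng.normal(size=branches) + 1j * rng.normal(size=branches)) / np.sqrt(2)
    n = 0.3 * (rng.normal(size=branches) + 1j * rng.normal(size=branches))
    r = h * 1.0 + n  # every branch carries the same transmitted symbol, 1+0j

    # Single antenna: trust branch 0 alone.
    err_single += abs(r[0] / h[0] - 1.0)

    # Maximal-ratio combining: weight each copy by its conjugate channel gain,
    # so the signal parts add coherently while the noise adds incoherently.
    err_mrc += abs(np.vdot(h, r) / np.vdot(h, h) - 1.0)

print(f"single antenna avg error: {err_single / trials:.3f}")
print(f"4-branch MRC avg error:   {err_mrc / trials:.3f}")
```

Maximal-ratio combining is just one flavor of this; the same intuition underlies the time-diversity and beamforming tricks above.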

There are more reasons that I may have missed, but if you're looking for a TL;DR version, just remember this:

Mobile communications have been designed from the ground up to work over very large distances, using VERY expensive and smart equipment, a lot of power, and a multitude of locations.