ELI5: Why is it that cell phones, with their tiny antennas, are able to download files at LTE speeds inside concrete buildings from cell towers kilometers away, yet getting Wi-Fi to work behind a corner is an exercise in frustration?



It is not that hard to get Wi-Fi to work over several kilometers either, as long as there are no corners. The longer distances actually let signals travel parallel to the walls, so they pass through as little concrete and rebar as possible. In addition, most taller buildings concentrate the majority of their concrete and rebar in the center columns, which lets signals through the outer walls but not from one side of the building to the other. If you walk around the corner of the building, you are probably no longer connected to the same cell tower. People do not notice this because there are far more cell towers than most are aware of; they are hidden well. While a typical office building needs about four Wi-Fi access points per floor to provide good access, it is typically within reach of over 10 cell towers in a half-kilometer radius. So no matter where you are in relation to the columns of the building, you are likely in sight of one of them.

Because they are working on two very different technologies.

They use different radio frequency bands. Cell phones operate in the hundreds of megahertz; Wi-Fi operates in the gigahertz range.

Since Wi-Fi uses a much higher frequency, when you have a good Wi-Fi connection you can transfer far more information, much faster than cellular is capable of.

The drawback is that because Wi-Fi uses this higher frequency, it cannot travel as far and is blocked relatively easily.

Cellular systems operate at a lower frequency, meaning the signal can travel farther, but with a lower maximum speed of information transfer.

To give even more context: radio, like your car radio, uses an even lower frequency, and can travel dozens to hundreds of miles, but can only move enough information to reliably play audio (and maybe some text information about the song).

TL;DR: The higher the frequency, the faster you can transfer information, but the shorter the distance it travels.
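One standard way to quantify this frequency-versus-distance relationship is the free-space path loss formula, FSPL = 20·log10(4πdf/c). A rough Python sketch (free space only; real walls add further frequency-dependent attenuation on top, and the example frequencies are just illustrative picks):

```python
import math

def fspl_db(distance_m: float, freq_hz: float) -> float:
    """Free-space path loss in dB: 20*log10(4*pi*d*f / c)."""
    c = 3e8  # speed of light, m/s
    return 20 * math.log10(4 * math.pi * distance_m * freq_hz / c)

# Same 100 m distance, three very different frequencies:
for label, f in [("FM radio ~100 MHz", 100e6),
                 ("LTE band ~700 MHz", 700e6),
                 ("Wi-Fi 5 GHz", 5e9)]:
    print(f"{label}: {fspl_db(100, f):.1f} dB of path loss at 100 m")
```

Each 10x increase in frequency costs 20 dB (a factor of 100 in received power) under this model, even before any walls get involved.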

All wireless communication suffers from something called “attenuation”. Attenuation means a reduction in signal strength. Many things can cause attenuation, but physical objects are one of the more common causes of attenuation when we’re talking about electromagnetic waves.

A physical object can be anything: a wall, a person, a mountain, you name it. Our world is full of them, and they all attenuate the radio waves we use for wifi and cellular communications, not to mention broadcast television signals and radio stations.

An interesting characteristic of radio waves though, is that waves of different frequencies are attenuated different amounts by different physical objects. As a general rule, the higher the frequency, the greater the attenuation a radio wave will suffer from a given object.

WiFi uses the 2.4 GHz (2,400 MHz) and 5 GHz (5,000 MHz) bands almost exclusively; there are no lower-frequency bands for it to choose from. Both of these bands are easily attenuated by common building materials, so the walls in your home weaken your WiFi signal considerably. Per the rule of thumb I mentioned earlier, the higher-frequency 5 GHz band is more easily attenuated than the lower 2.4 GHz band.

Cellular devices have multiple bands to choose from. Some examples are 700 MHz, 850 MHz, 1,900 MHz (1.9 GHz), and 2,100 MHz (2.1 GHz). Newer wireless technologies like 5G utilize frequencies as high as 52.6 GHz. When you are outside, the device will use a higher frequency band for maximum performance, while switching to lower bands for better indoor performance.

The real answer here is because they were designed that way. Wifi is designed to use public frequency bands and for short ranges so they won’t have a large negative effect on other people using those same frequencies for their own use. Most people don’t want or need their WiFi to span a mile from their house. Cellular technology is built to use reserved frequencies that won’t conflict and is built to travel long distances.

Power and contention.

1. A cellular tower transmits at a power level several thousand times higher than your wireless access point. A typical cellular tower transmits an ERP (effective radiated power, i.e. the actual transmission power with antenna gain factored in) of 100-500 W, while the maximum power for a 2.4 GHz Wi-Fi access point is 100 mW; the tower's transmission power is 1,000-5,000 times higher. At a range of 1 km the signal from a 500 W cellular tower falls to about 0.5 mW thanks to the inverse square law. Your Wi-Fi access point's signal falls to the same level at a distance of only about 14 m. This is all in open air with no obstructions.

2. Unless you're well out in the countryside, there are likely dozens if not hundreds of Wi-Fi hotspots, and several times that many client devices, around you fighting for space in the radio spectrum. There will be far fewer cellular clients, and fewer than a dozen cellular towers, and everyone on the cellular network cooperates to share the spectrum through the protocol. Meanwhile, Wi-Fi clients on the same channel but talking to different access points fight with one another, causing collisions and retransmits that slow down throughput.
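The power scaling in point 1 can be checked with a quick back-of-the-envelope calculation: under pure inverse-square spreading (idealized, ignoring antennas and obstructions), distances at equal signal level scale as the square root of the power ratio, which puts the Wi-Fi break-even distance near 14 m for these ERP numbers:

```python
import math

# Received power density falls off as 1/r^2 (inverse square law).
# Find where a 100 mW Wi-Fi AP is as weak as a 500 W tower is at 1 km:
tower_erp_w = 500.0
wifi_erp_w = 0.1
tower_dist_m = 1000.0

# tower/r_tower^2 == wifi/d^2  =>  d = r_tower * sqrt(wifi / tower)
d = tower_dist_m * math.sqrt(wifi_erp_w / tower_erp_w)
print(f"Wi-Fi matches the tower's 1 km signal level at ~{d:.0f} m")
```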

Often, larger buildings will have a DAS (distributed antenna system) installed to provide coverage throughout the whole building. These are small, low-powered, wide-band antennas forming an array inside, for example, large commercial buildings or high-rises. Many malls also have these installed. I have worked on a handful of these installations.


As for why cell providers do this: pressure from large corporate clients, who insisted coverage improve or they would switch their entire company to a provider who would deliver it, which would mean millions in lost revenue a year.

Lucky you if you get high-speed downloads indoors at work.

I get zero data connectivity while in my office.

NASA uses ZigBee radios (900 MHz, a short-range protocol related to but distinct from Wi-Fi) for communicating between its rover and its drone helicopter. With the lack of other signals and Mars's thinner atmosphere, they're theoretically able to connect over distances of more than 2,700 feet.


Cell towers are way closer than you think. Most of them won't be several kilometers away, but within a kilometer or less.

I am a telecoms engineer. I work with Radio Access Networks GSM/LTE/UMTS.

Most of the answers I see here attribute the difficulty in the Wi-Fi case to the higher frequency of the Wi-Fi signal, and hence its lower penetration, but that's only part of the answer.

There are several LTE bands, used in different coverage scenarios depending on what needs to be covered. Lower bands (below 1.8 GHz) have plenty of penetration, more than Wi-Fi's 5 GHz, which is great over large distances, but even they wouldn't penetrate cement well from many kilometers away.
We actually use higher frequencies (1.8 GHz, 2.6 GHz, etc.) in addition to lower bands to cover dense urban areas precisely because of their inability to penetrate, thus avoiding interference caused by over-shooting signals from kilometers away. This is called coverage overlay.
This means that while you may THINK the LTE signal is coming from afar, depending on the coverage scenario, the station might actually be just a few hundred meters away. This is further supported by the fact that mobile communication channels are actually uplink limited: even if you could get signal in a cement building from an antenna 20 km away, your phone certainly does not have enough transmission power to reach that station from inside the building.
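The uplink limitation above can be sketched with a toy link budget. All numbers below (transmit powers, wall loss, receiver sensitivity, the 800 MHz band) are ballpark assumptions for illustration, not figures from this thread:

```python
import math

def fspl_db(d_m: float, f_hz: float) -> float:
    """Free-space path loss in dB."""
    return 20 * math.log10(4 * math.pi * d_m * f_hz / 3e8)

phone_tx_dbm = 23     # assumed LTE handset max transmit power (~200 mW)
tower_tx_dbm = 46     # assumed macro-cell transmit power (~40 W)
wall_loss_db = 20     # rough penetration loss for a concrete wall
rx_sens_dbm = -120    # rough LTE receiver sensitivity

for d in (1_000, 20_000):
    loss = fspl_db(d, 800e6) + wall_loss_db
    up = phone_tx_dbm - loss    # what the tower hears from the phone
    down = tower_tx_dbm - loss  # what the phone hears from the tower
    print(f"{d/1000:.0f} km: uplink {up:.0f} dBm, downlink {down:.0f} dBm "
          f"(sensitivity {rx_sens_dbm} dBm)")
```

At 20 km the downlink still has roughly 30 dB of margin while the uplink is scraping the sensitivity floor, which is the asymmetry the comment describes.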

Which brings me to another reason. User u/alzee76 explains very well in his answer, that the effective power of an LTE antenna is thousands of times higher than your home router, therefore the signal is much much stronger than what you get from a router.

Another reason has to do with the radio wave's physical properties too. Every time the signal from the LTE antenna hits an object, say a tree, it is reflected into many tiny copies of that same signal, which are then radiated in every direction. This is called scattering. Depending on the environment there could be hundreds of tiny signals (called multipath components) reaching your phone, further increasing the probability that you receive the original message from the LTE antenna. Factor in all the reflections from the objects around you and the probability of getting a strong signal increases.
These multipath components carry very little energy, so they dissipate fairly easily, but their effect is noticeable. That's why, within a cement building, the best signal you can get is usually closest to the window.

Finally, LTE antennas use some DSP trickery that can increase signal reception even further. One of these tricks is copying the signal and sending it many times from different antennas (called spatial diversity), or sending it with a few milliseconds' delay (called time diversity), or automatically steering the signal radiation toward the approximate location of the user (called beamforming).
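The diversity idea, receiving several independent copies of the same signal and combining them, can be illustrated with a toy simulation. This uses simple equal-gain averaging, which is cruder than what real LTE receivers do, but shows why extra copies help:

```python
import random
import statistics

random.seed(0)
SIGNAL = 1.0  # transmitted symbol amplitude

def noisy_copy() -> float:
    """One received copy: the signal plus Gaussian noise."""
    return SIGNAL + random.gauss(0, 0.8)

# Receive the same symbol over 4 branches and average them
# (the simplest form of diversity combining).
trials = 10_000
single_err = sum(abs(noisy_copy() - SIGNAL) for _ in range(trials)) / trials
combined_err = sum(
    abs(statistics.mean(noisy_copy() for _ in range(4)) - SIGNAL)
    for _ in range(trials)
) / trials
print(f"avg error, 1 branch: {single_err:.3f}; 4 branches: {combined_err:.3f}")
```

Averaging four independent copies halves the noise amplitude on average, since independent noise shrinks with the square root of the number of branches.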

There are more reasons that I may have missed, but if you're looking for a TL;DR version, just remember this:

Mobile Communications have been designed from the ground up to work over very large distances, utilizing VERY expensive and smart equipment, a lot of power and a multitude of locations.

For the same reason you can sit in the shade at the park and still be able to read a book by sunlight but if you were trying to read in a cave with a lantern you would need it right next to you rather than around the nearest corner.

One of those sources is MASSIVELY more powerful despite being quite a bit further away.

Simplified version: you can either have *faster* speeds or a *farther* connection. You sacrifice one for the other.

As you’ve noticed yourself, with cellular you get much farther distance but at the cost of speed.

3 main reasons:

1) you are comparing a use case (wifi) with two small devices with relatively bad antennas transmitting in all directions at low power with another case (4G) where on one side you still have the bad device, but on the other side you have a very effective antenna, transmitting just over a narrow angle and at a power orders of magnitude higher. Obviously the second case works better.

2) 4G frequency is usually lower than the WiFi frequency. And the lower the frequency, the better it travels through stuff. The drawback is that the lower it is, the less data you can transmit (yes, I know, that sentence is almost wrong, it has a lot of shortcuts).

3) 4G frequencies are used by a handful of actors with properly dedicated frequencies, so it is usually a fairly clean environment. On the other hand, the WiFi frequency is basically a free-for-all, with anyone able to transmit over it (under specific conditions, of course). This means you have a lot more devices trying to talk over each other. Your phone is talking to your router, but your neighbour's phone talking to their router is also using the same frequency, and everyone has to share it. There are WiFi mechanisms for this, but if it gets too crowded, it impacts the speed.
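The crowding effect in point 3 can be illustrated with the classic slotted-ALOHA toy model (real Wi-Fi uses the smarter CSMA/CA scheme, but the crowding penalty is similar in spirit): each of N devices transmits in a time slot with probability p, and the slot succeeds only if exactly one device transmits.

```python
def slot_success_prob(n: int, p: float) -> float:
    """Probability that exactly one of n devices transmits in a slot."""
    return n * p * (1 - p) ** (n - 1)

# Even at the per-device rate that maximizes throughput (p = 1/n),
# most slots are wasted once the channel gets crowded; the success
# probability tends toward 1/e (about 0.37) as n grows.
for n in (2, 10, 50):
    print(f"{n} devices: best-case slot success ~{slot_success_prob(n, 1/n):.2f}")
```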

The answer is they don't. Commercial concrete buildings usually have multiple "micro cell" sites installed throughout the building to support cell phone data and voice communication. If this weren't done, the density of cell phones would quickly overwhelm most cell towers in cities.
Cell towers have limits on the number of simultaneous connections. 5G requires even shorter distances than LTE, so the number of cell sites has increased substantially. There are good apps and websites for seeing just how many are around you. You won't believe how many are located at sports stadiums.

Most hotspots have damn weak broadcast power! So the farther away you are, the more likely your data transfer speeds are equal to those of the 1990s.

Cell phone towers are massive structures built to provide cellular signal to large areas of a city. They have a range of miles and use a lot of power.

Your wifi router plugs into the wall and is made NOT to be so powerful that it extends into other buildings and interferes with other wifi routers. Its range is 200-300 feet when there are no walls between the router and the device using it.

So by design they have different ranges because by design they serve different purposes.

There are many good explanations of the different factors at play here (frequencies, number of routers/cell towers, signal power, etc.), but I'd like to clear up a couple of assumptions you made. In urban areas, cell towers are generally not kilometers away. And when cell towers *are* kilometers away, you won't get your typical LTE speeds; it will be slower. There is always a tradeoff between connection speed and signal range: the greater the range, the lower the speed, and vice versa. Each new generation generally increases the speed achievable at a given range, but the tradeoff remains. I mean, think about it: if you didn't have that tradeoff, we would all (probably) be using satellite internet.
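The speed-versus-range tradeoff follows from Shannon's capacity formula, C = B·log2(1 + SNR): as distance grows, path loss rises, SNR falls, and so does the achievable bit rate. A sketch with purely illustrative (assumed) numbers for transmit power, noise floor, and band:

```python
import math

def fspl_db(d_m: float, f_hz: float) -> float:
    """Free-space path loss in dB."""
    return 20 * math.log10(4 * math.pi * d_m * f_hz / 3e8)

tx_dbm = 46          # assumed tower transmit power
noise_dbm = -100     # assumed receiver noise floor
bandwidth_hz = 20e6  # a 20 MHz LTE carrier

for d in (500, 2_000, 8_000):
    snr_db = tx_dbm - fspl_db(d, 1.8e9) - noise_dbm
    snr = 10 ** (snr_db / 10)
    c = bandwidth_hz * math.log2(1 + snr)
    print(f"{d} m: SNR {snr_db:.0f} dB -> capacity ~{c/1e6:.0f} Mbit/s")
```

Real networks never hit these ideal numbers, but the downward trend with distance is exactly the tradeoff described above.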

Also, concrete buildings don't necessarily let cell signals through. What most concrete buildings do have, though, are windows, and those do let signals through. But if you're in a concrete building without any windows, and there's nothing bringing the cell signal inside, chances are you won't get any reception.

Cell towers are absolutely BLASTING their signal; they have permission from the FCC to do this, and it works. WiFi is meant to be a personal or smaller-scale thing and works at far lower power levels. As a result, the signals are easier to block.

If you want a fun read, google “cantenna” for a cool way to boost your signal.

I wasn’t satisfied with the other answers. This is my best Eli5. *Source: used to work at AT&T.*

– Your phone usually hums a low tone (under 1 GHz) which can penetrate most walls with ease and reach a cell tower that's listening miles away. Your phone has to compete with all of the other phones and cell towers talking within miles of you, but luckily every phone and tower abides by strict rules and has a ton of space to adjust its pitch, so everyone knows who is talking.
– With Wi-Fi and Bluetooth, on the other hand, your phone has to use a shrill voice which barely reaches the other room to talk to your router (2.4 GHz or 5 GHz). As it happens, everything else in your house is forced to talk at only those two high pitches, and it's really hard for everyone to hear each other. Kind of like a school cafeteria at lunchtime. Even your microwave, when running, leaks noise at this same shrill pitch.

T-Mobile service used to really suck in comparison to Verizon and AT&T because it was only licensed to talk in a relatively high pitched voice. Luckily, the government let them start humming that low, penetrating frequency like the other carriers, in exchange for a bag of money, of course.

You fucking serious? You don't know how cell phones and Wi-Fi work? Wi-Fi is a limited signal coming from a small router, not very powerful. A cell phone works off a giant tower.

How do you not know how the thing you carry in your pocket every day works?