I’m a little confused about how satellite communication works. My understanding is that there is a line-of-sight wave from the user to the satellite and then a line-of-sight wave from the satellite to the destination. I’ve also heard that whether a radio wave behaves like a ground wave or a line-of-sight wave depends on the frequency of the wave. It’s not intuitive to me why this is. It’s also not intuitive why we don’t communicate with ground waves. Is it because they’re slower, in spite of the fact that going via satellite substantially increases the distance the data has to travel?
Microwave frequencies suffer from something called attenuation. Basically, a microwave signal hates anything with water or metal in it, because it gets absorbed or reflected. Trees and leaves have water in them and absorb the signal. This causes two issues.
First, a dead spot forms in the broadcast signal as the signal propagates around the obstacle. Second, when the signals recombine on the other side, they can create more dead spots as the combining signals destructively interfere with each other.
Imagine looking at the wake from a boat as it hits a post in the water. Watch as the waves go around the post, leaving a calm area behind it until the expanding waves meet each other on the other side. Now notice what happens where the crests and troughs of the waves meet. This is essentially what happens to the radio signal.
Line of Sight ensures this issue does not happen.
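To put rough numbers on that analogy, here’s a minimal Python sketch of two copies of the same signal arriving at a receiver by different paths; the 2.4 GHz frequency and the path-length difference are just assumed example values:

```python
import cmath
import math

c = 3e8                 # speed of light, m/s
f = 2.4e9               # example microwave frequency, Hz (assumed)
wavelength = c / f      # ~0.125 m

def combined_amplitude(path_diff_m):
    """Amplitude of two equal-strength copies arriving with a path difference."""
    phase = 2 * math.pi * path_diff_m / wavelength
    return abs(1 + cmath.exp(1j * phase))

print(combined_amplitude(0.0))             # ~2.0: in phase, signal reinforced
print(combined_amplitude(wavelength / 2))  # ~0.0: out of phase, a dead spot
```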
EDIT: To add more information in relation to your question, sub-microwave frequencies, such as 900 MHz and lower, do not suffer from this issue quite as much because there are fewer waves per second (that’s what the frequency is), so they can get around obstacles. These lower frequencies will instead reflect off of large bodies of water and other things, such as the ionosphere. This allows global terrestrial communication, at the cost of lower throughput, because there is less bandwidth available to carry the data.
Ground-following waves have low frequencies, roughly 3 MHz and below. So you have a total spectrum of only 3 MHz to transmit data, and it has to be shared by all radio users. The lower the frequency, the better the wave follows the Earth.
WiFi uses 20 MHz (or multiples of 20 MHz) of bandwidth to transmit data, so the data rate you can manage within 3 MHz is quite low.
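As a rough illustration of why channel width matters, here’s a small Python sketch using the Shannon capacity formula C = B·log2(1 + SNR); the signal-to-noise ratio is an assumed value, purely so the two bandwidths can be compared:

```python
import math

def capacity_bps(bandwidth_hz, snr_linear):
    """Shannon capacity: the upper bound on data rate for a given bandwidth and SNR."""
    return bandwidth_hz * math.log2(1 + snr_linear)

snr = 100  # assumed signal-to-noise ratio (20 dB), purely illustrative

print(f"3 MHz channel:  {capacity_bps(3e6, snr) / 1e6:.0f} Mbit/s max")
print(f"20 MHz channel: {capacity_bps(20e6, snr) / 1e6:.0f} Mbit/s max")
```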
The signal is also attenuated with distance because the Earth’s surface is not a good conductor, so you need high power. The signal also travels a lot better over the sea than over land.
The required antenna size also depends on the wavelength, and wavelengths are measured in kilometers for low-frequency waves, so the antennas need to be of that size order too.
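A quick sketch of how that scales (the frequencies are example values, and the quarter-wavelength figure is just a common rough antenna-sizing rule of thumb):

```python
c = 3e8  # speed of light, m/s

# Wavelength = c / f; many simple antennas are roughly a quarter-wavelength long.
for f_hz, label in [(100e3, "100 kHz ground wave"),
                    (3e6, "3 MHz ground wave"),
                    (2.4e9, "2.4 GHz WiFi")]:
    wavelength = c / f_hz
    print(f"{label}: wavelength {wavelength:,.2f} m, "
          f"quarter-wave antenna ~{wavelength / 4:,.2f} m")
```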
Ground waves have been used and are still used for long-distance communication. There were transatlantic radio transmissions of Morse code messages starting in 1901. They used high power and huge antennas, and had a low data rate.
Because there are better alternatives, ground waves are seldom used today. There is still limited usage for communication with submarines, which can receive the signal underwater. Sometimes signals are sent that way too.
In no particular order (some of these have been addressed in other answers):
– The ionosphere is called that because it gets ionized by solar UV radiation, which frees electrons from the atoms. Radio transmissions at frequencies between a few MHz and about 30MHz are reflected by those free electrons. You can take advantage of that to bounce waves back down and overcome Earth’s curvature, but it’s a disadvantage if you want to pierce through the ionosphere.
– On the other hand, frequencies above 30GHz are heavily attenuated by lower atmospheric layers, mainly the air in the troposphere.
– In between lies what’s called the radio window.
– At frequencies even lower than 1 MHz, waves can bounce back and forth between the ionosphere and the surface, traveling even further.
– In general, higher frequencies are attenuated more by the atmosphere.
– High frequencies also have more room for wide channels. It’s easy to allocate 20MHz wide channels around a 2.4GHz frequency. At 30MHz that doesn’t work well.
– A higher number of wider channels is what allows high data rates. Counter-example – submarine transmissions at a few dozen kHz can have data rates below 100 bits/s.
– Low frequencies need big antennas.
– A satellite has to be very frugal with the amount of energy it uses to transmit data, so the power received at the antenna on Earth’s surface is mind-blowingly low. The sensitivity of modern GPS receiver chips is on the order of billionths of a billionth of a watt (there’s a rough conversion sketch after this list). Any additional obstruction of the path can disrupt the channel.
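As a rough sketch, here’s how that “billionths of a billionth of a watt” figure looks in the dBm units radio engineers use, next to an assumed 100 mW WiFi transmitter for scale:

```python
import math

def watts_to_dbm(p_watts):
    """Convert watts to dBm (decibels referenced to 1 milliwatt)."""
    return 10 * math.log10(p_watts * 1000)

received = 1e-18   # ~a billionth of a billionth of a watt at the GPS antenna
wifi_tx = 0.1      # assumed 100 mW WiFi transmit power, just for comparison

print(f"GPS signal at the receiver: {watts_to_dbm(received):.0f} dBm")
print(f"Typical WiFi transmitter:   {watts_to_dbm(wifi_tx):.0f} dBm")
```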
A combination of the factors above determines which frequency ranges authorities allocate for which uses. Whoever plans an application using radio waves then chooses the most fitting transmission from whatever choice is left. Ground waves offer a very limited number of narrow, slow channels that can reach very far. Very few users could use them, and if they took advantage of the possible range, they would make those frequencies unusable by anyone else on the same continent.
Ok so you have multiple things happening:
* Radio waves travel like light beams, in a straight line. So if you’re on this side of the Earth and you want to communicate with Australia on the other side of the Earth, you’ll need satellites to receive and re-transmit your radio waves (there’s a rough line-of-sight sketch after this list).
* Radio waves are absorbed by certain materials. Dirt (the ground), water, metal objects such as skyscrapers, etc. Ideally you want to send your radio waves through the air or into space, rather than through the ground, water, or buildings.
* The [ionosphere](https://qph.cf2.quoracdn.net/main-qimg-8967a882f8c568b924096d6817b15eff-lq), an outer layer of the atmosphere, is ionized and reflects radio waves, a little bit like water reflects light. If you send a signal straight up it gets through, but if you send it at an angle it reflects back down to the ground.
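To put rough numbers on the first bullet, here’s a minimal sketch of the geometric line-of-sight horizon, d = sqrt(h·(h + 2R)); the antenna heights are just example values, and atmospheric refraction is ignored:

```python
import math

R_EARTH = 6_371_000  # mean Earth radius, meters

def horizon_km(height_m):
    """Straight-line distance to the horizon from a given height above the surface."""
    return math.sqrt(height_m * (height_m + 2 * R_EARTH)) / 1000

print(f"Rooftop antenna (10 m):    ~{horizon_km(10):.0f} km")
print(f"Tall tower (300 m):        ~{horizon_km(300):.0f} km")
print(f"GEO satellite (35,786 km): ~{horizon_km(35_786_000):.0f} km")
```

A ground antenna can only “see” tens of kilometers before the curve of the Earth gets in the way, while a satellite high above can see a large fraction of the planet at once, which is why it makes a good relay.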