I’m a little confused about how satellite communication works. My understanding is that there is a line-of-sight wave from the user to the satellite and then a line-of-sight wave from the satellite to the destination. I’ve also heard that whether a radio wave behaves like a ground wave or a line-of-sight wave depends on the frequency of the wave. It’s not intuitive to me why this is. It’s also not intuitive why we don’t communicate with ground waves instead. Is it because they are slower, even though the satellite route substantially increases the distance the data has to travel?
Microwave frequencies suffer from something called attenuation: the signal is absorbed or reflected by anything with water or metal in it. Trees and leaves have water in them and absorb the signal. This causes two issues.
First, a dead spot forms in the broadcast signal as the signal propagates around the obstacle. Second, when the signals recombine on the far side, they can create more dead spots as the combining signals destructively interfere with each other.
Imagine the wake from a boat hitting a post in the water. Watch as the waves go around it, leaving a calm area behind the post until the expanding waves meet each other on the other side. Now notice what happens where the crests and troughs of the waves meet. This is essentially what happens to the radio signal.
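To put rough numbers on that picture, here is a minimal Python sketch (the 2.4 GHz frequency is picked for illustration, not taken from the answer above). It shows that when two copies of the same signal arrive by paths that differ by half a wavelength, they cancel almost completely, producing a dead spot:

```python
import numpy as np

c = 3e8             # speed of light, m/s
f = 2.4e9           # an illustrative microwave frequency, Hz
wavelength = c / f  # ~0.125 m at 2.4 GHz

# Two copies of the same signal arrive by paths differing by half a wavelength,
# so the second copy is shifted by pi radians relative to the first.
path_difference = wavelength / 2
phase_shift = 2 * np.pi * path_difference / wavelength  # = pi

t = np.linspace(0, 3 / f, 1000)  # a few cycles of the carrier
direct = np.sin(2 * np.pi * f * t)
detour = np.sin(2 * np.pi * f * t - phase_shift)

combined = direct + detour
print(f"peak of one wave alone:   {direct.max():.3f}")      # ~1.0
print(f"peak of the combined wave: {abs(combined).max():.3f}")  # ~0: a dead spot
```

Anywhere the path difference works out to an odd number of half-wavelengths, the same cancellation happens, which is why dead spots show up in patterns rather than as a single shadow behind the obstacle.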
A clear line of sight ensures this issue does not happen.
EDIT: To add more information in relation to your question: sub-microwave frequencies, such as 900MHz and lower, do not suffer from this issue quite as much, because there are fewer waves per second (a lower frequency, and therefore a longer wavelength), so they can bend around obstacles. These lower frequencies will instead reflect off larger bodies of water and other things, such as the ionosphere. This allows global terrestrial communication, at the cost of lower throughput, because there is less bandwidth available to carry the data.
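To make the frequency trade-off concrete, here is a small comparison sketch (the specific bands are assumed for illustration; 12 GHz is in the range commonly used for satellite downlinks). A wave tends to bend around objects that are small compared to its wavelength, which is why the lower bands behave so differently:

```python
c = 3e8  # speed of light, m/s

# Illustrative bands, chosen for comparison
bands = [
    ("900 MHz (sub-microwave)", 900e6),
    ("2.4 GHz (microwave)", 2.4e9),
    ("12 GHz (satellite downlink)", 12e9),
]

for label, f in bands:
    print(f"{label}: wavelength = {c / f:.3f} m")

# 900 MHz -> ~0.33 m: comparable to branches and posts, so the wave diffracts
#   around them, and at still lower frequencies it can bounce off the ionosphere.
# 12 GHz  -> ~0.025 m: tiny next to a tree, so it is absorbed or blocked instead,
#   which is why a clear line of sight matters so much at microwave frequencies.
```

The same comparison hints at the throughput cost: the sub-microwave region simply contains less spectrum to divide among users than the microwave bands do, so the longer-range waves carry less data.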