Because they differ so much in energy content, a detector that's good at receiving one small slice of the spectrum is far, far better at it than anything that tries to read the whole spectrum at once.
A radio photon has a wavelength between centimeters and *hundreds of kilometers.* For FM radio it's around 3 meters (AM is hundreds of meters). By comparison, visible light has a wavelength of *hundreds of nanometers.* That means a visible-light photon carries roughly six million times more energy than an FM radio photon.
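If you want to sanity-check that "six million" figure yourself, here's a quick back-of-the-envelope Python sketch using E = hc/λ. The wavelengths are illustrative round numbers (3 m for FM, 500 nm for green light), not measurements from any real device:

```python
# Compare the energy of one FM-radio photon with one visible-light photon.
# E = h * c / wavelength; wavelengths are illustrative round numbers.

h = 6.626e-34    # Planck constant, J*s
c = 3.0e8        # speed of light, m/s

fm_wavelength = 3.0          # meters (~100 MHz FM broadcast)
visible_wavelength = 500e-9  # meters (green light, ~500 nm)

fm_energy = h * c / fm_wavelength
visible_energy = h * c / visible_wavelength

print(f"FM photon:      {fm_energy:.2e} J")
print(f"Visible photon: {visible_energy:.2e} J")
print(f"Ratio: {visible_energy / fm_energy:,.0f}x")  # ~6,000,000x
```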
A detector fine enough to give great images doesn't have the sensitivity to register radio waves. A radio detector sensitive enough to pick up those tiny signals would be completely blown out if you tried to use it on visible light.
Good comments here already. Also consider older remote controls that operated on infrared: infrared is a wavelength of light, but human eyes can't see that range, while the sensor on your TV can "see" it. Radio waves and light are the same kind of energy at different wavelengths; it's just that only the shorter wavelengths are visible as light. A TV antenna can't "see" the nanometer-range wavelengths of visible light, but it can pick up the meters-long radio waves used to transmit television signals, and your eyeballs can't.
They’re not identical types of “light.” It’s like asking why liquid hydrochloric acid can’t hydrate you the way water does, when they’re both liquids.
“Light,” broadly speaking, is radiating electromagnetic energy. Visible light, what your camera takes in and uses to make pictures, is a particular range of that broad electromagnetic spectrum. Radio waves are a different range of the same spectrum.
The components of your phone’s camera are designed and built to be sensitive to, and to capture, the visible-light part of the spectrum. They physically cannot capture and depict any other part of the spectrum (although they can be affected by it – for example, an electromagnetic pulse might render the electronic components of your camera useless). They simply haven’t been tuned/designed to capture and depict other types of electromagnetic radiation. The same is true of your phone’s antenna, which picks up and transmits radio waves: its components can’t pick up or transmit other ranges of the electromagnetic spectrum, although they may be affected by them.
Each component is tuned for its purpose.
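To put a rough number on “physically cannot”: a typical camera sensor is made of silicon, and a photon only registers if it has enough energy to kick an electron across silicon’s band gap (about 1.12 eV). The sketch below uses textbook approximations, not the specs of any particular phone:

```python
# Rough check: can a photon of a given wavelength excite an electron in a
# silicon photodiode? Band gap and wavelengths are textbook approximations.

PLANCK_EV = 4.136e-15   # Planck constant, eV*s
C = 3.0e8               # speed of light, m/s
SILICON_BAND_GAP_EV = 1.12

def photon_energy_ev(wavelength_m):
    return PLANCK_EV * C / wavelength_m

for name, wavelength in [("green light", 500e-9),
                         ("near infrared", 900e-9),
                         ("FM radio", 3.0)]:
    e = photon_energy_ev(wavelength)
    print(f"{name:14s} {e:10.3e} eV -> excites silicon? {e > SILICON_BAND_GAP_EV}")
```

Visible and near-infrared photons clear the band gap (which is why phone cameras need infrared-blocking filters), while a radio photon falls short by about seven orders of magnitude, so the sensor simply never notices it.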
A fuel pump is not a fuel injector. A fuel hose into a fuel tank is not the same either. Different rates. Different diameters. Different forces.
Electromagnetic waves have very, very different frequencies and energies, and those interact differently with the sensor materials we make.
Even simple devices like LEDs/solar panels (which are the same mechanism with different tunings) only work weakly in reverse, if at all. And those operate in the same frequency range.
Trying to get a radio antenna to shoot out X-rays is near impossible.
Plus, a device built to send/receive a stream through the air is entirely different from an array made to capture spatially coordinated waves from a particular direction.
They’re the “same thing” in the same way that a bowling ball and a tank shell are the “same thing”. To carry the analogy forward, “why can’t the bowling alley detect how many pins a 120mm HEAT shell knocks over?”
To go a bit beyond the ELI5: your phone’s camera only detects things with a high enough energy (tank shells), while its antenna looks for a very specific amount of energy so that it can avoid miscommunications where it receives the wrong signal.
The difference in energy between radio waves and visible light is so extraordinarily vast that the principles behind each device are totally different.
I read this the other day:
“Scientists have transformed WiFi routers into ‘cameras’ capable of seeing people through walls.
Researchers at Carnegie Mellon University have discovered a method for detecting the 3D shapes and movements of individuals in a room using only WiFi routers – no cameras or LiDAR required.
According to the researchers, WiFi signals ‘could serve as a universal substitute’ for traditional RGB cameras in ‘sensing’ people in a room.
They noted that using WiFi bypasses issues like poor lighting and occlusion that conventional camera lenses encounter. Intriguingly, they claim that this advancement actually enhances privacy rights. ‘In fact,’ they state, ‘most households in developed countries already have WiFi at home, and this technology could potentially be scaled to monitor the wellbeing of the elderly or to identify suspicious activities at home.’”
Two big reasons:
1. Most people are mentioning that these have different energies and thus different wavelengths. An antenna generally has to be physically about the same size as the wavelength it hopes to catch (rough numbers in the sketch after this list). Visible light wavelengths are smaller than a bacterium; 5G wavelengths are about the width of a baseball.
2. The camera and antenna also have vastly different designs for vastly different needs. The camera sensor is a big grid of individual sensors so it can collect simultaneous signals from lots of different directions (each pixel is made up of sensors for three different colors). The antenna doesn’t need to be as precise about direction as the camera, but it needs to handle a signal that changes much more quickly!
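As a rough illustration of point 1: a simple quarter-wave antenna is about a quarter of a wavelength long, so you can estimate how big an antenna each frequency would need. The frequencies below are illustrative, not tied to any specific phone:

```python
# A quarter-wave antenna is roughly wavelength/4 = c / (4 * frequency) long.
# Frequencies are illustrative round numbers.

C = 3.0e8  # speed of light, m/s

def quarter_wave_length_m(frequency_hz):
    return C / frequency_hz / 4

examples = {
    "AM radio (1 MHz)":      1e6,
    "FM radio (100 MHz)":    100e6,
    "5G mid-band (3.5 GHz)": 3.5e9,
    "green light (600 THz)": 600e12,
}

for name, f in examples.items():
    print(f"{name:24s} ~{quarter_wave_length_m(f):.3e} m")
```

An AM antenna would want to be tens of meters, an FM antenna most of a meter, a 5G antenna a couple of centimeters, and a “visible light antenna” would have to be on the order of 100 nanometers – roughly the scale of the structures inside a single camera pixel.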