Eli5: Free power from radio waves?


You know those crystal sets that people played with many years ago? They let you listen to the radio through a high-impedance earphone, without a battery, because the radio waves induce a small voltage in the circuit (forgive me if I don’t understand it fully).

Well, I’ve often wondered why thousands or even millions of these circuits couldn’t be connected in series to give us some meaningful free power. If someone could explain…


5 Answers

Anonymous 0 Comments

Those radio waves did not simply appear out of nowhere. Every bit of power that the receiver obtains is part of the power sent out by the broadcasting antenna. So there is no “free” energy there – it came from something that was designed to send out that energy.

There is also the problem of scale. A small antenna captures perhaps a milliwatt, more likely several hundred microwatts. A million of them working perfectly efficiently (which they won’t, since they would interfere if packed too close together) deliver less than a kilowatt, and an average household uses more than that. So much energy would be lost in transmission that the transmitter would need to put out tens or hundreds of kilowatts to achieve even this FOR ONE HOUSE.
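A rough back-of-the-envelope version of that arithmetic in Python (the 300-microwatt-per-antenna figure is an assumed, optimistic value, not a measurement):

    # Scale sketch: a million ideal crystal-set receivers,
    # assuming an optimistic ~300 microwatts harvested per antenna.
    per_antenna_watts = 300e-6
    num_antennas = 1_000_000

    total_watts = per_antenna_watts * num_antennas
    print(f"Total harvested: {total_watts:.0f} W")  # -> 300 W

    # An average household draws on the order of a kilowatt,
    # so even a million perfect receivers cannot run one home.
    household_watts = 1_000
    print(f"Share of one household: {total_watts / household_watts:.0%}")  # -> 30%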

In other words, very expensive, very inefficient, very large and not very useful. We haven’t even touched on the dangers of scaling it up to handle more than one home.

Anonymous 0 Comments

There is no such thing as a free lunch. The energy conversions in those setups experience a lot of loss; a quick google shows they only get around 30-40% efficiency. This means that for every watt of power you want the receiving device to get, the transmitter has to pump out around 2.5 watts.
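A quick Python sanity check on that figure (taking the optimistic 40% end of the quoted range as the assumption):

    # Transmit power needed per watt delivered, assuming 40% end-to-end
    # efficiency (the optimistic end of the 30-40% range above).
    efficiency = 0.40
    watts_at_device = 1.0

    watts_transmitted = watts_at_device / efficiency
    print(f"Radiate {watts_transmitted:.1f} W for every 1 W delivered")  # -> 2.5 W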

And you still have to generate the radio waves that are being used to power the devices. So instead of getting a free and universal power supply, you are getting an extremely expensive and inefficient power system. Not only does the transmitter have to pump out enough power to supply the devices, it has to pump that power out in all directions regardless of whether a device is there to receive it.

If you instead try to harvest background radio noise, your devices are limited to that tiny ambient power level.

Anonymous 0 Comments

Nikola Tesla had the same idea. Unfortunately, the power in those radio waves ultimately comes from the radio transmitter, and the majority of the energy is lost along the way. It is a very inefficient way of moving power; you’d need a small power plant and a very large receiver to wirelessly power a few LEDs or charge a phone.

Anonymous 0 Comments

There’s a problem that you and Tesla both share.

Not knowing the inverse square law of RF…

Received power falls off with the square of your distance from the transmitting antenna: double the distance and you quarter the power. (That’s simplified, but it’s the heart of the inverse square law.)
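A minimal Python sketch of that falloff, using the free-space Friis transmission equation; the 50 kW transmit power, 100 MHz frequency, and unity antenna gains are illustrative assumptions, not figures from this thread:

    import math

    # Free-space received power from the Friis transmission equation:
    #   P_r = P_t * G_t * G_r * (wavelength / (4 * pi * distance))^2
    def received_watts(tx_watts, distance_m, freq_hz=100e6, g_tx=1.0, g_rx=1.0):
        wavelength = 3e8 / freq_hz  # speed of light / frequency
        return tx_watts * g_tx * g_rx * (wavelength / (4 * math.pi * distance_m)) ** 2

    # Doubling the distance quarters the received power:
    for d in (100, 200, 400):
        print(f"{d:4d} m: {received_watts(50_000, d) * 1e3:6.1f} mW")
    # -> 285.0 mW, 71.2 mW, 17.8 mW

Even a 50 kW broadcast transmitter leaves an ideal receiver with well under a watt a few hundred meters out.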

So, that being said, you would have to be super close to an antenna to draw any real power off of it. We already do this in consumer wireless charging of phones; even inductive cooktops could fall into that category too.

There’s just too much loss in the air to make it practical/affordable over long distances.

Anonymous 0 Comments

Ignoring all the other issues, it isn’t free power. You’re simply stealing power from the radio station. If you got a significant amount of power this way, it would disrupt the radio transmission, the FCC would find you, and you would have to try to explain yourself in Federal court.