Why do plugged in chargers, which aren’t connected to anything, still draw power?

778 views

e.g. If a phone charger is plugged into an outlet yet has no phone connected to be charged, why does it draw power?

In: Technology

6 Answers

Anonymous 0 Comments

The charger itself has circuitry that needs to be powered. It’s very little, though. Mostly just to detect when something has been plugged in so it can turn on. Much like a TV needs to be on to read the “turn on” signal from the remote.

Anonymous 0 Comments

Sometimes they contain capacitors that help to change the voltage coming from the wall to match what you need for your phone.

When you plug it in, the capacitors accept charge from the wall. To keep you safe when you unplug it from the wall, the engineers added small resistors to bleed off the capacitors.

The resistors are connected all the time, so even when your phone is not hooked up, they still bleed off the capacitors and those capacitors get immediately refilled by the wall outlet.
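The bleed-resistor idea above is easy to put numbers on. This is a rough sketch with made-up component values (1 MΩ resistor, 1 µF capacitor, 120 V mains), not figures from any real charger:

```python
# Rough arithmetic for a bleed resistor across a charger's input capacitor.
# Component values are illustrative, not from any specific charger.
V_RMS = 120                 # wall voltage, RMS
R_BLEED = 1_000_000         # hypothetical 1 MOhm bleed resistor
C_INPUT = 1e-6              # hypothetical 1 uF input capacitor

# Continuous power the resistor dissipates while plugged in: P = V^2 / R
standby_watts = V_RMS ** 2 / R_BLEED

# The RC time constant tells us how fast the capacitor bleeds down after
# unplugging; after ~5 time constants the voltage is essentially gone.
tau_seconds = R_BLEED * C_INPUT
discharge_seconds = 5 * tau_seconds

print(f"standby loss: {standby_watts * 1000:.1f} mW")            # 14.4 mW
print(f"safe to touch after roughly {discharge_seconds:.0f} s")  # 5 s
```

So with these (hypothetical) values, the safety resistor costs about 14 mW of continuous draw, which matches the "very little" claims elsewhere in this thread.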

Anonymous 0 Comments

Because it’s often a transformer circuit, it’s still inducing a magnetic field, and there is still resistance and current in the circuit. There is most likely some loss across the rectifier (diode) bridge, too.

There’s basically a “circuit” working even if there are no electronic chips in it, and yet things like USB chargers often have actual electronic circuits in them too.

Because of that, there are always (tiny) losses in that circuit, which produce heat and magnetic fields and cause power to be drawn.

Anonymous 0 Comments

They barely do anymore. If it says “switching power supply,” it effectively shuts down when not in use.

Older ones drew power because they had to drop 120 V or 240 V down to 5 V and convert AC to DC, and that circuitry drew power whether the output was being used or not.

Newer ones have a capacitor (a bit like a small battery) that, once full, triggers the charger to stop switching. As soon as you plug in your phone, it draws power from that capacitor, triggering the charger to start powering again.

Anonymous 0 Comments

Modern chargers draw very little power when just plugged in. They have active power-electronics (a high-frequency switching rectifier) on the input.

Older chargers would have a big 60 Hz transformer to step down the voltage followed by a rectifier to make it DC. The transformer would always see wall voltage, and the core would be magnetized back and forth at 60 Hz regardless of load. This caused hysteresis loss (almost like friction of the magnetic domains flipping back and forth) that was not so tiny. I am pretty sure new products are not allowed to do that any more because of this.

Anonymous 0 Comments

Are you saying that because you have measured it, or because you believe that to be true? The no-load consumption of a modern switch-mode power supply (and all chargers are switch-mode) is very low, and below the ability of the typical domestic power meter to measure.

A test here

[https://www.howtogeek.com/231886/tested-should-you-unplug-chargers-when-youre-not-using-them/](https://www.howtogeek.com/231886/tested-should-you-unplug-chargers-when-youre-not-using-them/)

managed to find 0.3W when connecting six chargers, three laptop, three phone/tablet, so an average of 0.05W each.
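The 0.05 W per charger figure is easy to translate into a yearly cost. The electricity price below is an assumption (a US-ish average of $0.15/kWh), not from the linked test:

```python
# Back-of-envelope: what 0.05 W of no-load draw costs over a year.
# Electricity price is an assumed US-ish average, not a measured value.
WATTS_PER_CHARGER = 0.05
HOURS_PER_YEAR = 24 * 365
PRICE_PER_KWH = 0.15

kwh_per_year = WATTS_PER_CHARGER * HOURS_PER_YEAR / 1000
cost_per_year = kwh_per_year * PRICE_PER_KWH

print(f"{kwh_per_year:.3f} kWh/year")  # 0.438 kWh
print(f"${cost_per_year:.2f}/year")    # about $0.07
```

In other words, leaving a modern charger plugged in all year costs on the order of a few cents.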

This is ELI5, so…

Imagine it’s a hot day, and you want to keep a water bowl for your dog filled up, so there is always something for him to drink. But the only water supply you have is a fire hydrant. You can fill the bowl by flashing the hydrant open for a fraction of a second. If your dog is drinking a lot, then you need to do that quite often to keep the bowl full. But if your dog is off doing something else, you will only need to let water in from the main very occasionally, to make up for what evaporates on a sunny day.

That’s how a switch-mode power supply works. It lets mains electricity in for a short period, to fill a bowl (a “capacitor”). If something is plugged in (“the dog is drinking”), it has to repeat that process frequently. But if nothing is plugged in, then it only has to make up the losses caused by the capacitor leaking.
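The bowl analogy can be sketched as a toy loop: the supply only “flashes the hydrant” (switches on) when the output voltage sags below a threshold. All numbers here are made up for illustration, using integer millivolts:

```python
# Toy model of burst-mode regulation: the supply refills the output
# capacitor only when its voltage drops below a threshold.
# All numbers are made up for illustration (integer millivolts).
V_TARGET_MV = 5000  # desired output voltage after a refill burst
V_REFILL_MV = 4900  # switch back on when voltage sags below this
LEAK_MV = 1         # mV lost per tick with no load (capacitor leakage)
LOAD_MV = 50        # mV lost per tick while a phone is charging

def ticks_between_bursts(drain_mv_per_tick):
    """Count ticks until the voltage sags from target to the refill point."""
    v, ticks = V_TARGET_MV, 0
    while v > V_REFILL_MV:
        v -= drain_mv_per_tick
        ticks += 1
    return ticks

# With no load the supply wakes up rarely; with a load, constantly.
print(ticks_between_bursts(LEAK_MV))  # 100 ticks between bursts, idle
print(ticks_between_bursts(LOAD_MV))  # 2 ticks between bursts, charging
```

The ratio of those two numbers is the whole story: idle, the supply switches on fifty times less often, so its input draw collapses to almost nothing.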

Most of the “vampire power” nonsense stems from people who think that power supplies are transformers. They were, thirty years ago, but today they’re all switch-modes (the “flash the mains, charge a capacitor” process). I don’t know about the US, but in Europe there _is_ a transformer in a power supply, because you have to have galvanic isolation between input and output for safety reasons, but it’s not being driven while the capacitor is full, so its efficiency is almost irrelevant when there is no load.