Let’s say I have a computer, a microwave, or a phone charger plugged in. Isn’t the point of a device being off that it’s not supposed to be using electricity? How many watts of power is it actually consuming while off?
Where does this apply and where does it not? Shouldn’t I try to unplug everything as much as possible to save money?
Some electronics simply have a standby mode. A microwave, for example, needs some power to display the time. Others check for updates in the background, monitor statuses, stay charged to reduce power-on time, or sit ready to respond to inputs.
All computerized electronics have a power supply circuit that converts the higher-voltage AC from the wall into a much lower DC voltage. This is what's inside the power brick of a phone charger. That conversion circuit is itself a complete circuit, so it always consumes some power just by the nature of its design.
The power conversion circuit contains a transformer, which can step voltage up or down. On the input side, a wire comes in from the hot, coils around a magnetic core, and goes out to the neutral. The other side of the core has a similar coil with a different number of turns, which sets the output voltage. Since the input side is just a coil of wire, it forms a complete circuit and lets current flow through it, consuming some power even when nothing is plugged in on the output side.
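The turns-ratio relationship described above can be sketched in a few lines. This is the ideal-transformer formula; the specific numbers (120 V mains, a 240:10 turns ratio) are hypothetical, chosen to illustrate a phone-charger-style step-down.

```python
def secondary_voltage(v_primary, n_primary, n_secondary):
    """Ideal transformer: output voltage scales with the turns ratio,
    V_secondary = V_primary * (N_secondary / N_primary)."""
    return v_primary * n_secondary / n_primary

# Hypothetical example: 120 V AC stepped down with a 240:10 turns ratio.
print(secondary_voltage(120, 240, 10))  # -> 5.0 (roughly USB-level voltage)
```

A real charger then rectifies and regulates this low-voltage AC into DC, but the turns ratio is what does the bulk of the step-down.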
This was a bigger problem with old power bricks, which could draw a couple of watts even when nothing was being powered by them.
Modern, good-quality power bricks are smart enough to cut off this flow when nothing is plugged into them, but they still need a little power to detect when a device is connected so they can start charging again. That monitoring draw is usually a fraction of a watt, barely noticeable compared to old or cheap bricks.