If I leave my charger plugged in and the switch is on without any device charging, does it 'waste' electricity? Why/why not?

If your charger has indicator lights or similar, then it does waste a tiny amount of electricity (roughly one AA battery's worth over several years).

Otherwise, no. Without the battery or device attached for charging, the output circuit is not complete, so no current flows: energy is not being consumed or expended as light and heat.

Generally, yes, there will be a small amount of electricity flowing. However, it is so small that it amounts to pennies per year in cost. If you unplugged everything you weren’t using and only plugged it in to use it, there would be no notable reduction in your electricity bill. Most of your bill is major appliances and heating/air conditioning.

The old AC adapters/chargers that were basically just a transformer + bridge rectifier did waste a bit of electricity: the transformer's primary winding stays energized whenever it's plugged in, so it draws magnetizing current and produces heat even when nothing is charging. Modern switch-mode chargers are a lot more efficient and probably draw a negligible amount of current when not powering anything.

Yes, with most of it coming from LEDs, but also a fair amount coming from the ICs that drive the charger. This happens for a few reasons.

One is that some of these ICs use rudimentary voltage dividers to bias a certain input to a specific voltage. You can look up what a voltage divider is, but the gist is that it brings a high voltage down to a lower voltage for use somewhere else in the circuit. Since the divider sits directly across the supply, it draws a small current the whole time the charger is plugged in.
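As a rough sketch of why that costs power, here's the arithmetic; the resistor and voltage values below are made up for illustration, not taken from any real charger design:

```python
# Hypothetical voltage divider biasing an IC input from a rectified
# DC bus. All component values are illustrative, not from a real design.
V_BUS = 170.0       # volts, roughly the peak of rectified 120 V AC
R_TOP = 1_000_000   # ohms
R_BOTTOM = 20_000   # ohms

v_bias = V_BUS * R_BOTTOM / (R_TOP + R_BOTTOM)  # divider output voltage
p_loss = V_BUS ** 2 / (R_TOP + R_BOTTOM)        # dissipated 24/7, in watts

print(f"bias voltage: {v_bias:.2f} V")          # ~3.33 V
print(f"divider loss: {p_loss * 1000:.1f} mW")  # ~28.3 mW
```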

Another reason is that these controllers have bypass capacitors that filter higher-frequency electrical noise out of the power source. These use power for the same reason as above: a capacitor is a passive device that isn't switched off or on, it just keeps providing clean power to the IC while it's in 'standby' mode.

The final reason is that 'standby' mode itself. It's usually rated at a few μA of current, which comes down to the intrinsic leakage of the MOS transistors in the IC and how they're implemented to build the chip. This is pretty far from ELI5, but it's as close as we can get, given that each company has its own chip that works in its own specific way.
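To put 'a few μA' in perspective, multiply it by the line voltage (the 5 μA below is an illustrative figure, not any particular chip's datasheet rating):

```python
# Standby power of a hypothetical controller IC rated at 5 uA,
# held off a 120 V line: well under a milliwatt.
V_LINE = 120.0      # volts (US mains)
I_STANDBY = 5e-6    # amperes (5 microamps, illustrative)

p_standby = V_LINE * I_STANDBY
print(f"standby power: {p_standby * 1000:.2f} mW")  # 0.60 mW
```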

Remember, the cost of electricity with taxes, fees, etc. amounts to about $0.20/kWh (20 cents per kilowatt-hour) at most for much of the continental US.

A very, very high quiescent (passive) current draw for a cell phone charger would be 10 mA (0.01 A), which at 120 V is 120 V × 0.01 A = 1.2 W. There are 24 × 30 = 720 hours in a month, and 1.2 W = 0.0012 kW, so the total energy consumption in a month would be 0.0012 kW × 720 h = 0.864 kWh.

At 20 cents per kWh, that's a whopping 17 cents for the month to leave a very crappy charger plugged in continuously. Even accounting for high peak-time rates, or a higher-rate state like California, it would still be under a dollar.
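If you want to redo that arithmetic with your own numbers, here it is as a short script, using the same worst-case 10 mA and $0.20/kWh figures from above:

```python
# Monthly cost of a charger's standby draw, following the worked
# example above. Swap in your own rate or current to re-run it.
V_LINE = 120.0        # volts (US mains)
I_QUIESCENT = 0.010   # amperes -- the deliberately pessimistic 10 mA
RATE = 0.20           # dollars per kWh

power_kw = V_LINE * I_QUIESCENT / 1000   # 1.2 W -> 0.0012 kW
hours_per_month = 24 * 30                # 720 h
energy_kwh = power_kw * hours_per_month  # 0.864 kWh
cost = energy_kwh * RATE                 # ~$0.17

print(f"{energy_kwh:.3f} kWh/month -> about ${cost:.2f}/month")
```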

So from a monetary perspective, no it’s not a waste.

If you mean from a “green” perspective, no, still not a waste. Losses and inefficiencies from power generation and transmission dwarf things like this.

I realize this isn't a very ELI5 answer, but it's not something that can really be explained like you're 5.