How do my 5V1A Bluetooth headphones draw 1A from my 5V2A charger while charging?


How do they draw exactly the current they need? Is it simply because my headphones have a resistance of 5 ohms?

Also, my 5V2A charger can charge any 5V device that draws 2A or less, right?


Yes, and yes. The current drawn is determined by the resistance of the device drawing it, regardless of how much current the power supply could produce without a drop in voltage.

Edit – to be clear, this applies to dumb chargers/power supplies, not to things with circuits that can switch over to trickle charging, shut off, or adopt some other strategy depending on the device being supplied.
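For a simple resistive load on a dumb supply, Ohm's law is the whole story. A minimal sketch, using the hypothetical 5-ohm figure from the question:

```python
# Ohm's law: the load's resistance sets the current, not the charger's rating.
supply_voltage = 5.0   # volts, from the 5V2A charger
load_resistance = 5.0  # ohms, hypothetical value for the headphones

current_drawn = supply_voltage / load_resistance  # I = V / R
print(current_drawn)   # 1.0 A, comfortably under the 2 A rating
```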

Ok, I will try to ELI5:
Imagine the charging circuit as a water pipe. The voltage gives you the pressure your pipe can withstand without bursting, and the amperage gives you a rating of how thick (inner diameter) the pipe is. If you have a tap that has a pressure of 5V and a tap size of 2 amps, there is enough water to fill your 1 amp pipe. And since the pressure from the tap matches your pipe's rating, it won't blow up.
If you now have a device that needs 5V 3 amps, you can still try to charge it with your charger, since the pressure (voltage) rating is the same, but depending on the device it will charge slower or not at all.

What we call chargers are actually voltage sources. This means that the little thing you stick in the wall outputs a specific voltage +/- a small amount. The device connected to it will draw whatever current it draws. A well designed power supply will limit the output current so that its electronics are not damaged and will not start a fire.

The amperage rating on a voltage source just means that it will limit the output current to that current. On a poorly designed voltage source that 2A may just be a suggestion, and trying to pull more than 2A can cause the charger to be damaged or catch fire.

Since most consumer devices are designed for use with voltage sources, as long as the voltage matches and the required current is less than the stated amperage you are good to go. There are exceptions to this, though they are quite rare.
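That rule of thumb can be written down directly. A sketch (the function name and this simplified rule are mine, and it ignores the rare exceptions mentioned above):

```python
def is_compatible(supply_volts, supply_amps, device_volts, device_amps):
    """Rule of thumb for voltage-source supplies: voltages must match,
    and the device's draw must not exceed the supply's rating."""
    return supply_volts == device_volts and device_amps <= supply_amps

print(is_compatible(5, 2, 5, 1))  # True: 5V1A headphones on a 5V2A charger
print(is_compatible(5, 2, 5, 3))  # False: a 5V3A device exceeds the 2A rating
print(is_compatible(9, 2, 5, 1))  # False: voltage mismatch
```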

Voltage, resistance, and amperage (current) are directly interlinked with a simple equation (V = IR, where I is current because reasons). For this question, all that really means is that if you lock down two of these to specific values, then the third one must also become locked down since only one possible answer can complete the equation.

Voltage is supplied at a constant(ish) value from your charger block. And your headphones are constructed to have a specific natural resistance. So, generally, if you connect the headphones to a circuit of the correct voltage, the desired current *must* flow through it.
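"Lock down two values and the third is determined" looks like this in code. A small sketch (the helper name is my own):

```python
def solve_ohm(V=None, I=None, R=None):
    """Given any two of voltage, current, resistance, return the third (V = I*R)."""
    if V is None:
        return I * R
    if I is None:
        return V / R
    return V / I  # solving for R

print(solve_ohm(V=5.0, R=5.0))  # I = 1.0 A
print(solve_ohm(I=1.0, R=5.0))  # V = 5.0 V
print(solve_ohm(V=5.0, I=1.0))  # R = 5.0 ohms
```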

In real life this is, as you can probably guess, not so clear-cut. For one, your device’s resistance is almost certainly not a fixed value. A battery has more resistance the fuller it is, so a charging battery can pull more current when it is low than it can when it is mostly filled up. Additionally, charging circuits may contain components that can change the device’s effective resistance on the fly, allowing the device to actively control how much current it draws.
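The effect of a rising effective resistance can be seen with a toy loop (the resistance values here are made up for illustration, not a real charging model):

```python
# Toy illustration only: as the effective resistance climbs while the
# battery fills, current drawn from a fixed 5 V supply tapers off.
V = 5.0
for R in (5, 10, 25, 50):  # hypothetical effective resistance in ohms
    print(f"R = {R:>2} ohm -> I = {V / R:.2f} A")
```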

This leaves one last detail: the amp rating on the charger. If the device pulls whatever current it wants, what does the current rating on the charger mean? It’s simply the max current that the charger is designed to handle. Exceed it, and the charger may heat up and melt, or even catch fire.