I get that, for a house/solar battery, it sort of makes sense as your typical energy usage would be measured in kWh on your bills. For the smaller devices, though, the chargers are usually rated in watts (especially if it’s USB-C), so why are the batteries specified in amp hours by the manufacturers?
Small batteries are often single cells (lithium-ion or LiPo, nominal 3.7V), whereas large batteries never are, so for large batteries you have to use Wh; if they used mAh, the amount of energy would depend entirely on how the cells are arranged (parallel vs. series). Personally, I find the use of mAh annoying even for a single cell, since the energy it represents varies with chemistry (high-voltage LiPo can be 3.8V nominal, and lithium iron phosphate, which is increasingly common for stationary applications and is safer, has a 3.2V nominal voltage). And oftentimes we're abstracting away the cell's mAh anyway, because we're boosting from the 3.7V of the single cell (typical LiPo or non-LFP lithium-ion) up to the 5V of USB or whatever.
Plus, old-school NiMH cells (the typical rechargeable AA or AAA) have a nominal 1.2V, and a fully charged high-voltage LiPo may safely be charged to 4.35V…
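To put numbers on that, here's a small Python sketch (the cell capacity, pack layouts, and chemistries are just illustrative examples, not specs for any real product) showing that a pack's mAh rating changes with how the cells are wired and with chemistry, while watt-hours stay an honest measure of energy:

```python
# Illustrative only: same cells, different wiring -> different mAh, same Wh.
CELL_CAPACITY_MAH = 3000      # hypothetical single cell
CELL_NOMINAL_V = {            # typical nominal voltages per chemistry
    "Li-ion/LiPo": 3.7,
    "High-voltage LiPo": 3.8,
    "LiFePO4": 3.2,
    "NiMH": 1.2,
}

def pack_ratings(series, parallel, nominal_v, cell_mah=CELL_CAPACITY_MAH):
    """Return (voltage, mAh, Wh) for a pack of series x parallel cells."""
    pack_v = series * nominal_v
    pack_mah = parallel * cell_mah          # mAh only grows with parallel cells
    pack_wh = pack_v * pack_mah / 1000      # Wh counts every cell either way
    return pack_v, pack_mah, pack_wh

# Eight Li-ion cells wired two different ways:
for s, p in [(1, 8), (4, 2)]:
    v, mah, wh = pack_ratings(s, p, CELL_NOMINAL_V["Li-ion/LiPo"])
    print(f"{s}S{p}P: {v:.1f} V, {mah} mAh, {wh:.1f} Wh")
# 1S8P: 3.7 V, 24000 mAh, 88.8 Wh
# 4S2P: 14.8 V, 6000 mAh, 88.8 Wh  -> same energy, a quarter of the mAh

# And the same 3000 mAh means a different amount of energy per chemistry:
for chem, v in CELL_NOMINAL_V.items():
    print(f"{chem}: 3000 mAh at {v} V nominal = {3000 * v / 1000:.1f} Wh")
```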
Back in the day, the lazy electrical-engineering solution to giving electronics a consistent voltage from batteries was to slap a linear voltage regulator on the input. All that does is cap the output at a fixed value (which depends on the circuitry you need to power, and should be less than the voltage of an almost-drained battery under load); any input voltage above that is thrown away as heat. So the battery's voltage doesn't matter as long as it stays above the minimum the circuit needs, the only relevant capacity figure is mAh, and using Wh would actually be less helpful.

Nowadays, good solid-state electronics are cheap, so we use DC-DC converters (buck, boost, or buck-boost, depending on whether the voltage needs to go down or up) that adjust the output to whatever the circuit needs. A higher-voltage battery then just draws less current, in which case we should all be using watt-hours (or joules, if you're the pedantic type; that's the same as watt-seconds). This is much more efficient. It's also why you can use a single 1.5V AA alkaline battery for a nice LED flashlight (white LEDs have a forward voltage of around 3V or more) instead of needing 3 AAAs, and it can drain essentially all the energy out of the battery without the brightness changing over time.
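Here's a rough sketch of that trade-off (the 7.4V pack, 3.3V/100mA load, and 90% converter efficiency are just illustrative ballpark numbers, not from any datasheet):

```python
# Rough comparison: powering a 3.3 V / 100 mA circuit from a 2-cell
# Li-ion pack (7.4 V nominal), linear regulator vs. buck converter.

V_IN = 7.4          # battery voltage
V_OUT = 3.3         # what the circuit needs
I_OUT = 0.100       # circuit current draw in amps
LOAD_POWER = V_OUT * I_OUT                 # 0.33 W of useful work

# Linear regulator: battery current equals load current,
# and everything above V_OUT is burned off as heat.
linear_battery_power = V_IN * I_OUT        # 0.74 W drawn from the battery
linear_heat = (V_IN - V_OUT) * I_OUT       # 0.41 W wasted as heat
linear_efficiency = LOAD_POWER / linear_battery_power   # ~45%

# Buck (step-down) DC-DC converter: trades voltage for current,
# so the battery supplies less current; assume ~90% efficiency.
CONVERTER_EFFICIENCY = 0.90                # typical ballpark, not a spec
buck_battery_power = LOAD_POWER / CONVERTER_EFFICIENCY   # ~0.37 W
buck_battery_current = buck_battery_power / V_IN          # ~50 mA

print(f"Linear: {linear_battery_power:.2f} W in, {linear_heat:.2f} W as heat "
      f"({linear_efficiency:.0%} efficient)")
print(f"Buck:   {buck_battery_power:.2f} W in, "
      f"{buck_battery_current*1000:.0f} mA from the battery")
```

Notice the buck converter pulls less current from the battery than the circuit actually uses, which is exactly why watt-hours, not milliamp-hours, is the honest unit for comparing packs of different voltages.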
So I guess tl;dr… our electronics used to be less efficient (they turned more energy into heat instead of useful work), so we used mAh. Our better electronics can trade voltage for current to meet a circuit's needs regardless of battery voltage, so we should use Wh now, as we already do for large batteries.