I get that, for a house/solar battery, it sort of makes sense as your typical energy usage would be measured in kWh on your bills. For the smaller devices, though, the chargers are usually rated in watts (especially if it’s USB-C), so why are the batteries specified in amp hours by the manufacturers?
*A lot of answers dunking on **Ah**.*
**A** stands for ampere, the unit of **electric current (I)**. Electric current is defined as the rate of flow of **charge (Q)**, i.e. the amount of charge flowing per **unit of time (t)**: **I = Q/t**, or rearranged, **Q = It**.
So multiplying a **unit of current** by a **unit of time** gives you an amount of charge, which is why the ampere-hour (**Ah**) is a measure of the amount of charge.
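As a rough illustration (a minimal sketch, assuming a made-up 5,000 mAh phone battery, not any particular device):

```python
# Charge stored in a battery, from Q = I * t
capacity_mah = 5000                    # example phone-battery rating (made-up figure)
capacity_ah = capacity_mah / 1000      # 5 Ah
charge_coulombs = capacity_ah * 3600   # 1 Ah = 3600 coulombs -> 18,000 C

# At a steady 1 A draw, Q = I * t means that charge lasts 5 hours
hours_at_1_amp = capacity_ah / 1.0
print(charge_coulombs, hours_at_1_amp)  # 18000.0 5.0
```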
—
kW is a unit of power, i.e. energy consumed per unit of time. So, similar to the above, multiplying kW by hours gives you a measure of the amount of energy stored. Your home energy meter uses that very unit.
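Same idea, sketched with made-up numbers (a 10 kWh home battery and a 2 kW household load, chosen only for the arithmetic):

```python
# Energy stored, from E = P * t
battery_kwh = 10                          # example home-battery size (made-up figure)
load_kw = 2                               # example household load (made-up figure)
hours_of_backup = battery_kwh / load_kw   # 5 hours
energy_joules = battery_kwh * 3.6e6       # 1 kWh = 3.6 million joules
print(hours_of_backup, energy_joules)     # 5.0 36000000.0
```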
—
The reason to use one or the other mostly comes down to what needs measuring, I suppose. For a handheld device, “how much charge is left” matters more. People reasonably expect a high-end phone to draw more charge, so you would want a battery that holds more charge. I’m guessing here, but I think the mAh unit for batteries leaked into general vernacular from the technical specs used within the industry.
kWh communicates an amount of energy, and since most household electrical devices are rated in kW, it makes it easy to work out how long a battery can provide backup. Again, a guess.
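If you want to compare the two ratings directly, you also need the battery voltage, since energy = charge × voltage. A quick sketch, assuming a made-up 5,000 mAh battery and the typical 3.7 V nominal voltage of a single lithium-ion cell:

```python
# Bridging the two ratings: energy (Wh) = charge (Ah) * voltage (V)
capacity_mah = 5000       # assumed phone-battery rating (made-up figure)
nominal_voltage = 3.7     # typical nominal voltage of a single lithium-ion cell
energy_wh = (capacity_mah / 1000) * nominal_voltage
print(energy_wh)          # 18.5 Wh, i.e. 0.0185 kWh
```

This is also why two power banks with the same mAh rating can store different amounts of energy: the mAh figure alone says nothing about the voltage it was measured at.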