I get that, for a house/solar battery, it sort of makes sense as your typical energy usage would be measured in kWh on your bills. For the smaller devices, though, the chargers are usually rated in watts (especially if it’s USB-C), so why are the batteries specified in amp hours by the manufacturers?
*A lot of answers dunking on **Ah**.*
**A** stands for ampere, a unit of **intensity of electric current (I)**. Electric current is defined as the time rate of flow of **charge (Q)**, i.e. the amount of charge flowing per **unit of time (t)**: **I = Q/t**, or rearranged, **Q = I·t**.
So, multiplying a **unit of current** by a **unit of time** gives you a unit of charge; hence the ampere-hour, or **Ah**, is a measure of the amount of charge.
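For example, a quick sketch (the rating is made up) that turns a mAh figure into plain coulombs of charge:

```python
# Charge = current x time (Q = I*t), so an Ah rating is just an amount of charge.
capacity_mah = 3000                     # made-up phone battery rating
capacity_ah = capacity_mah / 1000       # 3.0 Ah
charge_coulombs = capacity_ah * 3600    # 1 hour = 3600 s; 1 A*s = 1 coulomb
print(charge_coulombs)                  # 10800.0 C, with no voltage involved
```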
—
kW is a unit of power, i.e. energy consumed per unit of time. So, similar to the previous excerpt, multiplying kW by hours gives you a measure of the amount of energy stored. Also, your home energy meter uses that very unit.
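For example, the same idea for energy (made-up numbers):

```python
# Energy = power x time, which is why kW x hours gives an amount of energy.
heater_kw = 2.0                            # made-up 2 kW space heater
hours_running = 3.0
energy_kwh = heater_kw * hours_running     # 6.0 kWh -- what the home meter records
energy_joules = energy_kwh * 1000 * 3600   # 1 kWh = 3.6 million joules, in SI terms
print(energy_kwh, energy_joules)
```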
—
The reason to use one or the other mostly comes down to what needs measuring, I suppose. For a handheld device, “how much charge is left” is more important. People reasonably expect a high-end phone to draw more charge, so you would need a battery with more charge. I’m guessing here, but I think the mAh unit for batteries leaked into general vernacular from the technical specs used within the industry.
kWh communicates the amount of energy, and since most electrical devices are rated in kW, it makes it easy to surmise how much backup a battery can provide. Again, a guess.
Small batteries are often single cells (lithium-ion or LiPo, nominal 3.7V), whereas large batteries never are, so with large batteries you have to use Wh; if they used mAh, the stated amount of energy would depend entirely on how the cells are arranged (parallel vs. series). Personally, I find the use of mAh annoying even for a single cell, since the energy it represents varies with chemistry (high-voltage LiPo can have a 3.8V nominal; lithium iron phosphate, which is increasingly common for stationary applications and is safer, has a 3.2V nominal), and oftentimes we’re abstracting away the mAh of the cell anyway because we’re boosting from the 3.7V of the single cell (typical LiPo or non-LFP lithium-ion) to the 5V of USB or whatever.
Plus, old-school NiMH cells (typical rechargeable AA or AAA) have a nominal 1.2V, and a high-voltage LiPo may safely be charged all the way to 4.35V…
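To make the chemistry point concrete, here’s a rough back-of-the-envelope sketch; the nominal voltages are the ones quoted above and the capacity figure is made up:

```python
# Same made-up 3000 mAh rating, different chemistries -> different stored energy,
# because energy (Wh) = nominal voltage (V) x capacity (Ah).
capacity_ah = 3.0
nominal_volts = {
    "NiMH": 1.2,
    "LiFePO4 (LFP)": 3.2,
    "Li-ion / LiPo": 3.7,
    "high-voltage LiPo": 3.8,
}
for chemistry, volts in nominal_volts.items():
    print(f"{chemistry}: {volts * capacity_ah:.1f} Wh")
# NiMH comes out at 3.6 Wh and high-voltage LiPo at 11.4 Wh -- same mAh, ~3x the energy.
```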
Back in the day, the lazy electrical-engineering solution to making sure electronics got a consistent voltage from batteries was to slap a linear voltage regulator on the input. All that does is hold the output at a fixed value (which depends on the circuitry you need to power… and should be less than the voltage of an almost-drained battery under load); any input voltage above that is thrown away as heat. So the voltage of the input battery doesn’t matter as long as it’s above the minimum the circuit needs, and the only relevant capacity value is mAh (using Wh would actually be less than helpful). Nowadays, nice solid-state electronics are cheap, so we use DC-DC converters (which can buck or boost the voltage down or up depending on the application) that adjust the output to whatever the circuit needs, so a higher-voltage battery needs less current draw; in that case we should all be using watt-hours (or maybe joules, if you’re the pedantic type… that’s the same as watt-seconds). This is much more efficient. It’s also why you can use a single 1.5V AA alkaline battery for a nice LED flashlight (white LEDs have a forward voltage drop of 3.5 to 5V) instead of needing 3 AAAs, and it can drain all the energy out without a change in brightness over time.
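Here’s a rough sketch of the difference, with made-up numbers and treating the DC-DC converter as ideal:

```python
# A circuit that needs 3.3 V at 0.1 A (0.33 W), fed from a made-up 7.4 V two-cell pack.
v_in, v_out, i_out = 7.4, 3.3, 0.1

# Linear regulator: the same current flows in as out; the voltage difference becomes heat.
p_load = v_out * i_out            # 0.33 W of useful work
p_drawn_linear = v_in * i_out     # 0.74 W pulled from the battery
p_heat = p_drawn_linear - p_load  # ~0.41 W thrown away as heat

# Idealized DC-DC converter: power in = power out, so it simply draws less current.
i_drawn_dcdc = p_load / v_in      # ~0.045 A from the battery for the same 0.33 W load
print(p_heat, i_drawn_dcdc)
```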
So I guess, tl;dr: our electronics used to be less efficient (making more heat instead of useful work), so we used mAh. Our better electronics can trade voltage for current to meet the needs of the circuit regardless of battery voltage, so we should use Wh now, as we do for large batteries.
I see a lot of people saying mAh is a scam. It most definitely is not.
To simplify matters I will use Ah (amp-hour) and Wh (watt-hour).
I will start simple, with the very basic law of electricity, Ohm’s law: V = I × R, meaning voltage (measured in volts) = current (amps) × resistance (ohms).
Since we have fairly constant voltage nowadays, this form is more relevant: I = V/R.
And last in the introduction is the watt: P = V × I … W = V × A.
Since pretty much every phone sports a “3.7V” battery, it’s very relevant to compare them by mAh, because batteries have traditionally been rated in Ah. A 3Ah battery basically means a full battery will last 1 hour if you draw a 3A load, or 3 hours if you draw 1A. In this whole paragraph, voltage doesn’t matter: you could run this setup measuring only current, never checking the voltage, as long as you keep the load constant.
Cars and houses, on the other hand, use complex battery packs which are basically made from smaller batteries, and those batteries can be arranged in SERIES or in PARALLEL. If you arrange 2×3.7V batteries in series, the voltage doubles while the Ah stays the same: you can basically drive bigger loads for the same amount of time. But if you arrange 2×3.7V batteries in parallel, you get the same voltage output but double the Ah, so you can drive the same load for twice as long.
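A quick sketch of the series vs. parallel bookkeeping, with made-up cell figures:

```python
# One made-up cell: 3.7 V nominal, 3.0 Ah -> 11.1 Wh
cell_v, cell_ah = 3.7, 3.0

# Two cells in SERIES: voltage doubles, Ah stays the same.
series_v, series_ah = 2 * cell_v, cell_ah
# Two cells in PARALLEL: voltage stays the same, Ah doubles.
parallel_v, parallel_ah = cell_v, 2 * cell_ah

# Either arrangement stores the same energy: 22.2 Wh.
print(series_v * series_ah, parallel_v * parallel_ah)
```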
The battery packs are arranged in such a way as to have a voltage output suited to the load, so a hybrid car might have an 80V battery pack while a full electric car runs its battery pack at 800V. House batteries generally use the same technology as car batteries, but the problem is your house normally runs at 110/230V, so you have smart and fast electronics that step the voltage down or boost it up depending on your needs.
TL;DR: voltage doesn’t count in Ah ratings, but it does in Wh ratings.
Both kWh and mAh express a battery’s capacity. They are used this way because of the units used in typical applications.
Small battery devices typically specify the current they draw.
Appliances typically specify the wattage they use.
If you know the current draw, mAh gives you the approximate runtime in hours:
Hours = mAh/mA
If you know the wattage, kWh gives you the approximate runtime in hours:
Hours = kWh/kW
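Putting both rules of thumb into one rough sketch (the capacity and draw figures are made up, and real runtime will be a bit lower because of conversion losses):

```python
# Small device: capacity in mAh, draw in mA.
powerbank_mah, phone_draw_ma = 10_000, 500
hours_small = powerbank_mah / phone_draw_ma      # ~20 hours

# Large battery: capacity in kWh, draw in kW.
home_battery_kwh, house_draw_kw = 13.5, 1.5
hours_large = home_battery_kwh / house_draw_kw   # ~9 hours

print(hours_small, hours_large)
```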
This is also related to convention. Grid energy is sold in kWh, so large batteries that connect to the grid use the same unit by convention.
Let’s start with some definitions, a logic approach, and then a math approach.
**Definitions/Laws**
Ohm’s Law: V = I R
V = voltage
I = current
R = resistance
Power = I^2 * R = energy (Joules) / second
**Logic approach**
Watts are a unit of power, so kWh (kilowatt-hours, or hours of kilowatts) is saying “I can provide this much power for this many hours.” Makes perfect sense, right? You want to know how long a 10 kWh battery can power your 1 kW device? 10 hours.
Now, how long can a 10 mAh battery power that same device? I have no idea. A (amps) is a unit of electrons per second… but it’s very misleading. Think of current as being the current of a stream and resistance as the resistance provided by a water mill. How big a water mill is it? Telling me that the current moves at 1 m/s is useless information if I don’t know how big a water mill it’s turning at that speed. I can’t tell if it’s a 5th grader’s science project capable of powering a toy robot or the Mississippi River capable of powering NYC, so I cannot compare power to current.
**Math approach**
kWh is telling you hours of X power, but how do you convert mAh into that? For this comparison, we can remove the hours from both of them to simplify things.
Now we have … current (the m is just milli, as in millimeter, and the A is for amps, the unit current is measured in). I² × R = power, so we have no idea how much power that represents without the voltage of the battery or the resistance. In fact, since V = I × R and we don’t know for sure the resistance (R) of the device that will be plugged into the battery, that current figure is absolutely meaningless: with no V and no R, you can arbitrarily assign any value to I, knowing that no matter what V turns out to be, there will be a mathematical solution to V = I × R that makes that I (current) correct. It’s a relic from before conventions had really been set up to standardize things in a logical manner (just like us using gallons of gas and °F in the US). It’s a little bit like asking “How strong are you?” and the guy responding, “I can hold something up for 5 seconds!” Hold what up? A feather? A glass of water? A car? 5 seconds is a unit of time, not strength.
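To make that concrete, here’s a small sketch with made-up ratings showing how the same mAh figure can correspond to very different amounts of energy once voltage enters the picture:

```python
# The same mAh figure can hide very different amounts of energy,
# because energy needs the voltage: Wh = V x Ah.
ratings = [
    ("power bank cell", 3.7, 10_000),      # 10,000 mAh at 3.7 V
    ("cordless tool pack", 18.0, 10_000),  # 10,000 mAh at 18 V
]
for name, volts, mah in ratings:
    wh = volts * (mah / 1000)
    print(f"{name}: {mah} mAh -> {wh:.0f} Wh ({wh / 1000:.3f} kWh)")
# Same 10,000 mAh, but 37 Wh vs 180 Wh -- you can't rank them on mAh alone.
```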
**So why do marketers use it?**
Let’s say my battery sucks. It’s 10 kWh and my competitor’s is 30 kWh. Do you think I’m advertising my kWh? Heck no! But I can truthfully say it’s 45 Ah. A customer who has no idea what either of those means will see one is 30 and the other is 45 and hopefully conclude that my product is better.
**Isn’t that dishonest?**
Phineas says yes, yes it is.