How/why is it that two batteries with the same voltage may perform at different levels of power based on their amp-hours? For instance, a 2 Ah battery powering a weed eater at full charge, and the weed eater not working as powerfully as it would with a fully charged 4 Ah battery?
Forgive me, I’m new to hobbies involving power tools but I’m excited to learn!
The amp-hours rating is a measure of capacity. 40 Ah would mean that the battery is able to deliver 40 amps of current for one hour (or 20 amps for 2 hours, 10 for 4, etc.)
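The capacity arithmetic above can be sketched in a few lines of Python (the 40 Ah figure and the draw currents are just the illustrative numbers from this answer):

```python
# Runtime = capacity / current draw, for a hypothetical 40 Ah battery.
capacity_ah = 40.0  # battery capacity in amp-hours (example value)

for current_a in (40.0, 20.0, 10.0):
    runtime_h = capacity_ah / current_a
    print(f"{current_a:.0f} A draw -> {runtime_h:.0f} h of runtime")
```

Halving the current draw doubles the runtime, which is all the Ah rating promises by itself.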
Now *power* measures the rate of energy use and is equal to volts times amps. The smaller-capacity battery will obviously not last as long, but it also generally cannot deliver as much *instantaneous* current as the larger battery. Say the 40 Ah battery can deliver a peak current of 80 amps (if this were sustained, it would drain the battery in half an hour), whereas the 20 Ah battery might max out at 60 amps. If the tool needs more than 60 amps of instantaneous current, the smaller battery will not be able to deliver it. Its voltage will (temporarily) drop and the tool will not operate at full power.
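To make the peak-current effect concrete, here is a small sketch using the hypothetical numbers from this answer (the pack voltage and the tool's current demand are assumed for illustration, not taken from the question):

```python
# Same voltage, different peak-current limits: power = volts * amps,
# but a pack can only supply amps up to its current limit.
voltage_v = 36.0      # assumed pack voltage (example only)
tool_demand_a = 70.0  # assumed current the tool wants at full throttle

packs = [
    ("20 Ah pack", 60.0),  # assumed peak current of the smaller pack
    ("40 Ah pack", 80.0),  # assumed peak current of the larger pack
]

for name, peak_a in packs:
    delivered_a = min(tool_demand_a, peak_a)  # pack caps the current
    power_w = voltage_v * delivered_a
    note = " (current-limited, voltage sags)" if delivered_a < tool_demand_a else ""
    print(f"{name}: {power_w:.0f} W{note}")
```

The smaller pack tops out below what the tool asks for, so it delivers less power even though both packs are nominally the same voltage.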
Edit: I just realized I multiplied your example batteries' capacities by 10, but it doesn't matter; the principle is the same.