Hey guys! I’m pretty young (15), and whenever I got a new device for my bday or something, my parents always said you needed to charge it for 24 hours before using it, and that you should always charge it to 100% and run it down to 0%, otherwise the battery won’t hold as much charge as it could. This kind of sucked because of course I wanted to play with whatever it was right away instead of waiting until the next day. I recently found out that you don’t have to do this on newer batteries, but you did on older ones, which is how this “common sense” advice originated. But why did old batteries need to be charged for 24 hours before use, and run from 100% to 0%? It doesn’t seem like that should affect how much charge the battery can hold. Please ELI5.
Old NiMH and NiCad batteries suffered from what’s called a “memory effect.” If you didn’t regularly discharge the battery all the way, it would start to treat the level you usually stopped at as its new “zero” and would slowly seem to lose capacity, even though the cells were physically fine, just stupid.
So the full charge and discharge out of the box was to “set” where the battery’s true 100% and true 0% were, so the memory effect would take longer to creep in.
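If it helps to see the idea in code, here’s a toy model of the memory effect. This is pure illustration, not real chemistry: the class, the 0.1 drift rate, and all the numbers are made up.

```python
# Toy model of the "memory effect" -- an illustration only,
# not real battery chemistry. All numbers are made up.

class MemoryBattery:
    def __init__(self, true_capacity=100.0):
        self.true_capacity = true_capacity    # what the cells can physically hold
        self.usable_capacity = true_capacity  # what the battery "remembers" it can hold
        self.charge = true_capacity

    def discharge_to(self, percent):
        """Run the battery down to `percent` of its true capacity."""
        floor = self.true_capacity * percent / 100.0
        self.charge = floor
        if percent > 0:
            # Habitually stopping short of empty drags the remembered "zero"
            # upward: usable capacity drifts toward the shallow stopping point.
            self.usable_capacity -= 0.1 * (self.usable_capacity - floor)
        else:
            # A full discharge re-teaches the battery where zero really is.
            self.usable_capacity = self.true_capacity

    def recharge(self):
        # You can only fill up to whatever capacity the battery "remembers".
        self.charge = self.usable_capacity


batt = MemoryBattery()
for _ in range(20):
    batt.discharge_to(40)  # always stop at 40%...
    batt.recharge()
print(f"after shallow cycles: {batt.usable_capacity:.1f}")  # ~47, well below 100

batt.discharge_to(0)  # one full drain...
batt.recharge()       # ...and a full charge
print(f"after a full cycle:  {batt.usable_capacity:.1f}")  # 100.0 again
```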
Later-generation NiMH chargers would fully discharge the battery before charging it back up, specifically to keep the memory effect from taking hold; that made a charge take longer. There was also a rule of thumb: you could use a fast charger (which just tops the battery up from wherever it is to 100%) X number of times, then do a full discharge-and-charge cycle to reset the memory effect.
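That charger-side rule of thumb could be sketched like this, reusing the MemoryBattery toy class from above. The cycle count X=5 and the function are invented for illustration, not any real charger’s firmware:

```python
FAST_CHARGES_BEFORE_REFRESH = 5  # the "X" in the rule of thumb -- made up here

def smart_charge(batt, fast_charges_so_far):
    """Fast-charge normally; every Xth time, do a slow full refresh cycle."""
    if fast_charges_so_far >= FAST_CHARGES_BEFORE_REFRESH:
        batt.discharge_to(0)  # drain completely first (this is the slow part)
        batt.recharge()       # then charge to 100% -- memory effect is reset
        return 0              # start counting fast charges again
    batt.recharge()           # quick top-up from wherever the charge sits
    return fast_charges_so_far + 1
```

Run `smart_charge` in a loop and the remembered capacity stays pinned near 100 even with shallow everyday use, which is exactly what the periodic refresh cycle was for.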
This is what your parents grew up with regarding rechargeable batteries, and it was the norm until the early 2000s, when Li-ion batteries started showing up in laptops and cell phones. Li-ion changed all of this, but it comes with a different set of tradeoffs.
Edits: Added NiCad and revised a bit for clarity.