How does an electronic device control whether its battery charges when connected to power?

After plugging my phone into a power source, it will not begin charging the battery until I unlock it. My laptop can similarly “ignore” the voltage coming through the power cord.

How do these devices reject/ignore/disregard the voltage coming through the power cord?

3 Answers

Anonymous 0 Comments

All modern devices include something called a “charge controller”. This is a little piece of circuitry that regulates when and how the battery is charged. That’s really important with lithium batteries (i.e. almost all modern batteries) because overcharging them is a Bad Thing: you can’t just connect them straight to a voltage source, you need something actively controlling the charge process.

So when you plug in the power cord, the device isn’t “ignoring” the voltage. The charge controller knows it’s there, but it has decided not to actually start charging the battery until some other condition is met.
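As a rough illustration, that “should I charge right now?” decision boils down to a handful of checks like the ones below. This is a toy sketch in Python, not any vendor’s real firmware; the function name, the 0–45 °C window, and the “policy” input are all stand-ins:

```python
def should_charge(power_present: bool, battery_percent: float,
                  temperature_c: float, policy_allows_now: bool) -> bool:
    """Toy version of a charge controller's gating decision."""
    if not power_present:
        return False                      # nothing to charge from
    if battery_percent >= 100.0:
        return False                      # never push more charge into a full lithium cell
    if not (0.0 <= temperature_c <= 45.0):
        return False                      # charge only within a safe temperature window
    return policy_allows_now              # e.g. "wait until the phone is unlocked"

# Plugged in, cool, 60% full, but the phone is still locked: the controller waits.
print(should_charge(True, 60.0, 25.0, policy_allows_now=False))  # False
print(should_charge(True, 60.0, 25.0, policy_allows_now=True))   # True
```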

Many new phones & laptops are smart enough to recognize their usage patterns and optimize the battery charging cycle (start, stop, rate) to maximize battery lifespan. My phone, for example, “knows” that I’ll plug it in each night when I go to bed and unplug it when my alarm goes off. That’s way more hours than it needs to charge the battery optimally, so when I first plug it in it’ll just say “OK, power’s here… I’m gonna wait a few hours, then start charging”. It times it so that charging finishes half an hour or so before my alarm goes off.
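That delayed start is just a bit of arithmetic on a learned schedule. A rough sketch (the 7:00 alarm, the two-hour charge estimate, and the half-hour margin are invented example figures, and real adaptive-charging features are more involved than this):

```python
from datetime import datetime, timedelta

def charging_start_time(alarm: datetime, estimated_charge_time: timedelta,
                        finish_margin: timedelta = timedelta(minutes=30)) -> datetime:
    """When to begin charging so the battery tops off shortly before the alarm."""
    return alarm - finish_margin - estimated_charge_time

alarm = datetime(2024, 1, 15, 7, 0)                     # learned wake-up time (example)
start = charging_start_time(alarm, timedelta(hours=2))  # assume ~2 h to finish charging
print(start)  # 2024-01-15 04:30:00 -- plug in at 23:00, charging still waits until 04:30
```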

Likewise, my laptop almost never runs off battery power, so I’ve got it set to maintain the battery at a partial charge (which extends battery longevity). When I plug it in, if the charge controller sees that the battery is already full enough, it just won’t charge it further.
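A charge cap like that can be as simple as the sketch below. The 80% cap is just an example setting, and the “resume a few percent lower” hysteresis is a common design choice I’m assuming here (it keeps the charger from flapping on and off around the limit), not something stated above:

```python
CHARGE_CAP = 80.0    # user-configured limit, percent (assumed)
RESUME_BELOW = 75.0  # re-enable charging only once the level drops a bit (assumed)

def keep_charging(battery_percent: float, currently_charging: bool) -> bool:
    """Toy charge-limit policy with a little hysteresis."""
    if currently_charging:
        return battery_percent < CHARGE_CAP
    return battery_percent < RESUME_BELOW

print(keep_charging(78.0, currently_charging=False))  # False: "full enough", stay off
print(keep_charging(74.0, currently_charging=False))  # True: dipped low, top it back up
```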

Anonymous 0 Comments

We’ve become really good at DC-to-DC voltage conversion, typically using buck converters (to reduce voltage) and boost converters (to increase voltage). Think >90% efficiency.
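Roughly speaking, an ideal buck converter trades voltage for current according to its switching duty cycle, so power in ≈ power out. A quick sketch with made-up numbers (19 V in, 26% duty cycle, 3 A load):

```python
def ideal_buck(v_in: float, duty_cycle: float, i_out: float):
    """Idealized buck converter: output voltage scales down, input current scales down."""
    v_out = v_in * duty_cycle   # output voltage is a fraction of the input
    i_in = i_out * duty_cycle   # ...so the input draws correspondingly less current
    return v_out, i_in

v_out, i_in = ideal_buck(v_in=19.0, duty_cycle=0.26, i_out=3.0)
print(f"{v_out:.2f} V out at 3 A, drawing {i_in:.2f} A from the 19 V side")  # ~4.94 V, ~0.78 A
```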

Why does that matter? If you want to charge a battery, apply a voltage higher than the battery’s own, and current (and therefore energy) will flow into it. If you want to discharge it (i.e. use it), present a voltage lower than the battery’s, and current (and therefore energy) will flow back out of it.
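Numerically, the direction of flow just follows the sign of that voltage difference. A crude resistive model (the 0.1 Ω series resistance and the voltages are illustrative; real chargers regulate the current rather than letting Ohm’s law run wild):

```python
def battery_current(applied_volts: float, battery_volts: float,
                    series_resistance: float = 0.1) -> float:
    """Positive = current into the battery (charging), negative = out (discharging)."""
    return (applied_volts - battery_volts) / series_resistance

print(battery_current(4.2, 3.7))  # +5.0 A: applied voltage above the battery -> charging
print(battery_current(3.7, 3.7))  #  0.0 A: equal voltages -> no net current
print(battery_current(3.3, 3.7))  # -4.0 A: applied voltage below the battery -> discharging
```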

How does that voltage conversion work? The simplest scheme doubles or halves the voltage: use transistors to connect two capacitors in series across the input wires (so their voltages add up to the input), then disconnect them and reconnect them to the output, this time in parallel (adding their currents, but halving the voltage). Do this at a high enough frequency and it looks like a lower-voltage power source. It’s also symmetrical, in the sense that if the output is at the higher voltage, the capacitors get charged on that side and discharged into the input side. In practice it’s more efficient to step voltage down, so we tend to have a battery at a higher voltage (19V) than the components of the computer (12V, 5.5V, 3V and 1.1V).
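Here’s a toy charge-sharing simulation of that series-then-parallel trick, just to show the output settling near half the input. The capacitor values, switching frequency, and the very simple load model are all assumptions for illustration:

```python
C = 10e-6         # each "flying" capacitor, 10 µF (assumed)
C_OUT = 100e-6    # output reservoir capacitor (assumed)
R_LOAD = 50.0     # load resistance in ohms (assumed)
V_IN = 10.0       # input voltage
DT = 1.0 / 100e3  # one switching cycle at 100 kHz (assumed)

v_out = 0.0
for _ in range(2000):
    # Phase 1: the two capacitors sit in series across the input,
    # so (starting from equal charge) each ends up holding V_IN / 2.
    v_fly = V_IN / 2
    # Phase 2: both capacitors are reconnected in parallel with the output
    # capacitor, and the charge redistributes (simple charge-sharing model).
    total_charge = 2 * C * v_fly + C_OUT * v_out
    v_out = total_charge / (2 * C + C_OUT)
    # The load drains the output capacitor a little before the next cycle.
    v_out *= 1 - DT / (R_LOAD * C_OUT)

print(f"steady-state output ≈ {v_out:.2f} V, about half of the {V_IN} V input")
```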

Anonymous 0 Comments

At its most basic, a **transistor** is a simple electronic component that acts as a switch: it only closes the circuit and lets electricity flow from an input (the *source*) to an output (the *drain*) when a third terminal (the *gate*) is powered. And that gate can absolutely be controlled in software; that’s fundamentally how a computer works.
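A toy model of that idea in Python (the class and method names are invented for illustration, not any real driver interface): current is only allowed through the charge path while software holds the gate high.

```python
class TransistorSwitch:
    """Toy model: the source-to-drain path conducts only while the gate is driven."""

    def __init__(self) -> None:
        self.gate_high = False  # the software-controlled gate input

    def set_gate(self, high: bool) -> None:
        self.gate_high = high

    def conducts(self) -> bool:
        return self.gate_high

charge_switch = TransistorSwitch()
print(charge_switch.conducts())  # False: plugged in, but no current reaches the battery
charge_switch.set_gate(True)     # firmware decides the conditions for charging are met
print(charge_switch.conducts())  # True: the charge path is closed and current can flow
```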

Modern electronic devices are far more complicated than a single transistor in their charging circuits, but the basic principle is there. It would be bad design to hardwire the battery to the charging port with no controls in between; one big reason is that if you keep blindly charging a full lithium-ion battery, it can overheat and catch fire or explode. So charging circuits have been a necessary thing for pretty much as long as we’ve had rechargeable batteries.