How does a laptop automatically charge a phone/iPad? If I connect the two, why doesn’t the phone charge the laptop instead?


I have an iPhone, an iPad, and a Mac, and when I connect my phone or iPad to the Mac, it automatically starts charging. But why? Why not the other way around?

My thinking is that Macs have a bigger battery? But still, what about that makes the Mac a charging source in the presence of an iPhone/iPad?


5 Answers

Anonymous 0 Comments

The charging circuit of a laptop is not connected to the USB port in most cases, so the port on a laptop is not designed to take in power. In a USB connection, the laptop is almost always configured as the host device. The host device must provide 5V and a certain amount of current on the voltage bus (VBUS) line per the USB specification. Any device connected to the laptop becomes a USB slave device and cannot (by the spec) contend with the host to provide power.

If the phone is connected to a peripheral device (not a laptop), then the phone becomes the host device and the peripheral becomes the slave device.

This works through a signal line in the USB connection that the devices use to negotiate which one is the host and which one is the slave.
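
To make that concrete, here's a minimal sketch in Python (illustrative names, nothing like real firmware) of the rule this answer describes: the connection fixes the role, and the role fixes the power direction.

```python
# A minimal sketch of the legacy USB rule: the host always sources 5V on
# the VBUS line, and the slave may only draw from it.

from dataclasses import dataclass

@dataclass
class UsbDevice:
    name: str
    role: str  # "host" or "slave"; fixed by the connection in legacy USB

def power_direction(a: UsbDevice, b: UsbDevice) -> str:
    """Return which device supplies the 5V bus, given the legacy USB roles."""
    if a.role == b.role:
        raise ValueError("legacy USB needs exactly one host and one slave")
    host, slave = (a, b) if a.role == "host" else (b, a)
    # The host must drive VBUS at 5V; the slave may only draw current from it.
    return f"{host.name} supplies 5V on VBUS; {slave.name} charges from it"

print(power_direction(UsbDevice("Mac", "host"), UsbDevice("iPhone", "slave")))
```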

Anonymous 0 Comments

Software in the power management of the phone forces the charging circuit to be one-way. This means that the phone's battery can only receive a charge through the USB socket, not push power back out of it.
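
As a rough illustration of that policy (hypothetical names, assuming the one-way design described above), the phone's power-management software simply has no code path that drives power out of the battery:

```python
# Hypothetical sketch of a phone's one-way charging policy: the software
# only ever enables the "charge in" path, never a "power out" path.

class PhoneChargingController:
    def on_usb_power_change(self, vbus_volts: float) -> str:
        # Charging is enabled only when valid external power is present.
        if vbus_volts >= 4.5:  # assumed threshold for a healthy 5V bus
            return "charging battery from USB"
        return "idle"

controller = PhoneChargingController()
print(controller.on_usb_power_change(5.0))  # -> charging battery from USB
print(controller.on_usb_power_change(0.0))  # -> idle
```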

Anonymous 0 Comments

Before USB, there was no real standard for what plugs and ports each device had, outside of keyboards and mice. Since there were so many unique devices out there that all needed to connect to computers, a standard called USB was developed: devices that needed to connect to a computer got one kind of port, known as USB-B, and the computer got the other side, known as USB-A. The USB-B port is commonly referred to as a “printer” port, since that's where it was first widely used.

The device that had the USB-A port would be the “host”. It would supply 5 volts of power to “charge” any device that was connected, and it would also be the device in control of the USB connection. The device that had the USB-B port would be the “client” device that wasn't in control; all it was supposed to do was accept power and accept commands from the other device. This worked for a REALLY long time.

But USB-B is kind of a big plug, and devices were getting smaller and smaller, so we needed something else. Then they brought out “Mini USB-B”. This was a slightly smaller plug, but things still worked in much the same way as before: the device with the USB-A port was in charge, and the device with the Mini USB-B port was supposed to accept power and commands.

But devices KEPT getting smaller, so they came up with “Micro USB-B”. And for a while, this worked flawlessly as well; it was just a smaller version of Mini USB-B. The device with the USB-A port was still in charge. But then things started changing, and devices that were connecting to USB started getting smarter.

Eventually someone said “Hey, I have a portable device that’s basically a fully-fledged computer now. I want to connect a keyboard, a mouse, a flash drive, a game controller, etc. to it and still be able to use it” so they invented a new standard called USB On-The-Go or USB-OTG. It used the same Micro USB-B port as before, but with a special cable it could be adapted to a USB-A port and act as if it were a host device that would normally only have a USB-A port on it.
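
A tiny sketch of how the OTG trick works: mini and micro USB-B plugs carry an extra “ID” pin, and the special OTG adapter grounds it, which is how the phone knows to switch roles. The function here is made up for illustration.

```python
# Sketch of the USB OTG role decision: a grounded ID pin means an OTG
# adapter is attached, so the phone must act as the host and supply power.

def otg_role(id_pin_grounded: bool) -> str:
    if id_pin_grounded:
        return "host (supplies power to the keyboard/flash drive/etc.)"
    return "device (accepts power and commands from the computer)"

print(otg_role(True))   # plugged into an OTG adapter
print(otg_role(False))  # plugged into a computer with a normal cable
```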

“Well, that’s really confusing”, said someone, “We need a new port that can do both things if it’s going to be like this”. So they invented a new standard. Because it was both USB-A and USB-B at the same time, they decided to call it “USB-C”. And that just stuck. New devices that could only act as a host could have USB-A ports, but devices that could act as both a host and a client could have a USB-C port. Eventually, the USB-C port became standard on ALL devices, not just ones that could do both, so there needed to be a new solution: Let the devices themselves figure it out. So this is where we are now.

How do devices figure out what to do? Devices that can act as both a host and a client have a chip inside them. When they're connected to another device, these chips communicate with each other over the USB-C cable to work out which device should do what. The chip in your phone knows that it's a phone, so most of the time it will request power instead of sending it; if it's plugged into a device that needs power, the chips will work out the exception. Equally, the chip inside your laptop knows that it's a laptop, so most of the time it will send power instead of requesting it; if the laptop is plugged into a charger, they'll work out the reverse.
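
A toy model of that decision (plain Python; the real chips signal their preference with resistors on the cable's CC wire, none of which is shown here):

```python
# Toy model of the USB-C role chat: each chip states whether it would
# rather source power, and opposite preferences give an instant answer.

def connect(dev_a: str, dev_b: str, prefers_source: dict) -> str:
    a_src, b_src = prefers_source[dev_a], prefers_source[dev_b]
    if a_src and not b_src:
        return f"{dev_a} powers {dev_b}"
    if b_src and not a_src:
        return f"{dev_b} powers {dev_a}"
    return "both want the same role; they have to toggle and retry (see below)"

# Assumed preferences: the laptop's chip would rather source, the phone's sink.
prefs = {"laptop": True, "phone": False}
print(connect("laptop", "phone", prefs))  # -> laptop powers phone
```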

In situations where both devices can send and receive power, like connecting two laptops or two phones together? That's less common, but the chips handle it too: each one keeps flipping between offering power and asking for it until the two ends land on opposite roles. The end result is that one laptop or one phone will charge the other, but which way around it goes depends on timing, so it looks almost entirely random.
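
Roughly how that tie-break plays out (a toy simulation, not the actual state machine):

```python
# Toy simulation of the dual-role tie-break: both ports keep flipping
# between "source" and "sink" until they happen to land on opposite roles,
# so which laptop charges which comes down to timing.

import random

def dual_role_handshake(a, b, seed=None):
    rng = random.Random(seed)
    while True:
        role_a = rng.choice(["source", "sink"])
        role_b = rng.choice(["source", "sink"])
        if role_a != role_b:  # opposite roles: the connection succeeds
            source = a if role_a == "source" else b
            sink = b if source == a else a
            return f"{source} charges {sink}"
        # Same role on both ends: no connection yet; flip and try again.

print(dual_role_handshake("laptop A", "laptop B"))
```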

Anonymous 0 Comments

If you use regular old-school USB on both sides, the USB standard specifies that the primary device with the type A connector (the one on your computer) will always supply power, and the secondary device with a type B connector (what people refer to as mini and micro USB, common on older smartphones) will always* receive power.

(There is one rarely-used exception, USB OTG, which allows a device with a B connector to supply power to small peripherals through a special adapter; that way you could do things like connect a keyboard to your phone.)

If one of the two sides has a type C connector while the other has one of the “classic” connectors, the type C end will figure out that the other device can only send or only receive power (type C cables and adapters carry a resistor or a small chip that tells the devices what the cable can do, including which way power can flow and how much).
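
A sketch of that idea (the fields are simplified stand-ins, not the real cable identification data):

```python
# Simplified sketch: the type C end inspects what the cable reports about
# its far end and its current rating before deciding how to behave.

from dataclasses import dataclass
from typing import Optional

@dataclass
class CableInfo:
    legacy_end: Optional[str]  # e.g. "USB-A" if the far end is a classic plug
    max_current_a: float       # how much current the cable is rated for

def plan_power(cable: CableInfo) -> str:
    if cable.legacy_end == "USB-A":
        # A classic type A far end can only supply power, never accept it.
        return f"type C end acts as a sink, drawing up to {cable.max_current_a}A"
    return "both ends are type C: negotiate roles (next case below)"

print(plan_power(CableInfo(legacy_end="USB-A", max_current_a=3.0)))
```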

If you use type C on both ends, the two devices negotiate which way power flows. By default, computers will preferentially ask to be a power source, while phones will ask to be power sinks (to receive power). This can be reversed in software: some Android phones have a setting to let the phone send power through USB-C instead of receiving it, useful if, say, you're charging your wireless headphones with a type C to type C cable.
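
A sketch of that negotiation, including the software override (names and logic simplified for illustration):

```python
# Simplified sketch: each device has a default preference based on what it
# is, and software settings (like Android's "supply power" toggle) override it.

def negotiate(kind_a: str, kind_b: str, overrides: dict) -> str:
    def preferred_role(kind: str) -> str:
        if kind in overrides:
            return overrides[kind]  # user flipped the setting in software
        return "source" if kind == "computer" else "sink"

    role_a, role_b = preferred_role(kind_a), preferred_role(kind_b)
    if role_a == role_b:
        return "conflict: both ports toggle roles until they disagree"
    source = kind_a if role_a == "source" else kind_b
    sink = kind_b if source == kind_a else kind_a
    return f"{source} sends power to {sink}"

print(negotiate("computer", "phone", overrides={}))
# Phone set to power a pair of USB-C headphones instead of charging itself:
print(negotiate("phone", "headphones", overrides={"phone": "source"}))
```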

Anonymous 0 Comments

The USB ports on a computer by default always put out 5V. When plugged in, the phone will see that 5V and put itself into charging mode, while telling the computer “hey, I'm a high-power device, give me as much juice as you can”.
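
The “give me as much juice as you can” part is a real exchange: in classic USB the device's configuration descriptor carries a bMaxPower field, counted in units of 2mA, and a USB 2.0 host will grant at most 500mA. A simplified sketch:

```python
# Simplified sketch of the classic USB current request: the device declares
# bMaxPower (in 2mA units) and the host grants up to its own port limit.

def granted_current_ma(b_max_power: int, host_limit_ma: int = 500) -> int:
    requested_ma = b_max_power * 2  # descriptor units are 2mA each
    return min(requested_ma, host_limit_ma)

# A "high power" phone asks for the USB 2.0 maximum:
print(granted_current_ma(b_max_power=250))  # 250 * 2mA = 500mA granted
```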

Many phones also support putting power out on their USB port (Android phones in particular support one phone charging another over USB-C), but it's something the phone has to actively choose to do itself.