You don’t supply things with current (amps). You supply a voltage, and the device’s internal resistance determines how much current it tries to draw.
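As a rough sketch of that idea (assuming an ideal, purely resistive device, which real electronics usually aren’t, and with a made-up resistance value), Ohm’s law gives the current the device pulls:

```python
# Rough illustration of Ohm's law: the supply fixes the voltage,
# the device's resistance determines the current it draws.
supply_voltage = 12.0       # volts, set by the power supply
device_resistance = 8.0     # ohms, a property of the device (hypothetical example value)

current_drawn = supply_voltage / device_resistance  # amps
print(f"Device draws {current_drawn:.2f} A")         # -> 1.50 A
```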
When you see a power supply rated for something like 12V and 2A, that means it can *safely* provide up to 2A to a device. It’s happy to provide less if that’s all the device is asking for.
So in general, as long as the power supply can provide the right voltage and *at least* as much current as the device needs, you’re fine.
So in your example, yes, you can use a power supply rated for 12V 2A with a device rated for 12V 1.5A. That device will only draw 1.5A from your power supply, which is less than 2A.
And just to make myself super clear, you ***cannot*** do the opposite. If you try to run a device that is rated for 12V 1.5A on a power supply that is only rated for 12V 1A then you will cause that power supply to overheat and possibly melt and/or start a fire.
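To put that rule of thumb in one place, here is a tiny sketch (the function name and numbers are just illustrative, not from any real standard):

```python
def supply_is_safe(supply_volts, supply_amps_max, device_volts, device_amps_needed):
    """Supply must match the device's voltage and be rated for at least
    as much current as the device will try to draw."""
    return supply_volts == device_volts and supply_amps_max >= device_amps_needed

print(supply_is_safe(12, 2.0, 12, 1.5))   # True  - 2A supply, 1.5A device: fine
print(supply_is_safe(12, 1.0, 12, 1.5))   # False - 1A supply, 1.5A device: overload risk
```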
Kind of.
If you supply your device with less current it probably just won’t work properly.
If you use a higher rated supply then the device should still only take 1.5A, as that’s all it needs. But if it’s poorly designed or has a fault then it may take the full 2A and that could damage it.
So while it will probably still work with a higher-rated supply, it’s not the best idea; the risk ranges from potential damage to your device to a safety/fire hazard.
This gets especially important if you’re talking about charging batteries. Most modern batteries have intelligent charging systems to handle voltage/current, but older ones may not, and a damaged battery can be very dangerous.
Everything below depends on voltage being correct.
Never mix different voltages.
A power supply can provide a set amount of power.
A device determines how much power is drawn.
If you overload a power supply, it will overheat and break, and will very likely be a fire hazard.
You can use the biggest power supply in the world to run a single LED with no issue.
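For a sense of scale (using typical, illustrative numbers): a 12V 2A supply can deliver up to about 24W (12V × 2A), while a single indicator LED with its resistor might draw something like 20mA, roughly 0.24W, so the supply is barely being used at all.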