What’s the difference between an input of 110-240V 1.5A and 2A on a power supply?


I’m looking to buy a power supply with these output specifications: 24 V, 5 A.

I’m looking at two different ones, and one has a different input rating than the other: 1.5 A versus 2 A.

In: Technology

The difference is 0.5 amperes. One of the supplies is less efficient than the other, and therefore draws more current.

Alternatively, they could be nearly the same efficiency, but they specify the current draw in different ways. One could be quoting average and the other one quoting peak, for example.

The input rating is the maximum current the supply is designed to draw. Your supply is a switch-mode supply, so the actual input current varies with load, input voltage, and efficiency. Its maximum output is 120 watts (24 V × 5 A), and the input power at full load will be higher than that due to conversion losses. Many supplies accept a wide range of AC input voltages, typically 85 to 265 VAC, and at a lower input voltage the supply draws more current for the same power. **The input current may be specified at the lowest voltage (highest current) or at a nominal voltage such as 120 or 240 VAC.**
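To put numbers on that, here is a minimal sketch of the input-current estimate. The 85% efficiency figure is an assumption for illustration, not a value from either datasheet:

```python
# Estimated input current for a 24 V, 5 A switch-mode supply.
# Input power = output power / efficiency; input current = input power / input voltage.

def input_current(p_out_w: float, efficiency: float, v_in: float) -> float:
    """Return the estimated input current in amperes at a given AC input voltage."""
    p_in = p_out_w / efficiency  # conversion losses raise input power above output power
    return p_in / v_in

P_OUT = 24 * 5  # 120 W maximum output

# Assumed 85% efficiency; lower input voltage means higher input current.
for v in (85, 120, 240):
    print(f"{v:>3} VAC: {input_current(P_OUT, 0.85, v):.2f} A")
```

At the low end of the input range the estimated draw lands between 1.5 A and 2 A, which is consistent with the two ratings you are comparing: they likely reflect different efficiencies or different voltages at which the maximum current was specified.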