For example, I have an antique portable television made by the Singer Manufacturing Company that plugs into US mains. However, it also has a 12V DC input and can run off that. I also have a retro CCTV monitor that uses the same 120V mains plug and has a placard on the back that says it can use a 12V 12W battery. How is this possible? If an electronic device is designed to take 12V, wouldn't 120V destroy the device? On the other hand, if a device is designed to accept 120V, shouldn't 12V not be enough to operate it?
Edit: added a clarifying question
These devices are typically designed to run on the lower of the two voltages, so in this case 12V DC.
Inside the device you’re going to find a 120V AC to 12V DC power supply. So if you plug it into a 120V outlet, it runs on 12V internally, and if you connect it to a 12V battery, it also runs on 12V internally.
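To make this concrete, here's a tiny sketch (assumed numbers, based on the "12V 12W" placard mentioned in the question) showing that both power paths converge on the same internal 12V rail, and what current draw that placard implies:

```python
# Sketch: either input ends up as the same ~12V DC rail inside the device.
# Values are illustrative, taken from the "12V 12W" placard above.

def internal_rail(source: str) -> float:
    """Return the internal rail voltage for a given power source."""
    if source == "mains":      # 120V AC goes through the internal AC-to-DC supply
        return 12.0            # ...which outputs 12V DC
    elif source == "battery":  # 12V DC input bypasses that supply
        return 12.0
    raise ValueError(f"unknown source: {source}")

power_w = 12.0                       # rated power from the placard
rail_v = internal_rail("battery")
current_a = power_w / rail_v         # I = P / V
print(rail_v, current_a)             # 12.0 1.0
```

So a "12V 12W" rating just means the battery needs to supply about 1 amp.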
It’s also worth noting that for many electronic devices, 12V is likely an intermediate voltage that only a few parts of the circuitry actually run on. Something like the audio amplifier or LCD backlight may use 12V directly, but most of the electronics are going to run on something like 1.2V, 2.5V or 3.3V.
A large chunk of domestic devices actually run on DC internally. If a device can plug into the mains and also has an alternative DC input, chances are it has a built-in full bridge rectifier that converts AC to DC. If you use a laptop, that blocky thing on the charging cable is the power adapter that contains the rectifier (along with the circuitry that steps the voltage down).
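The rectification step can be sketched numerically: a "120V" US outlet is 120V RMS, so the sine wave actually peaks around 170V, and a full bridge rectifier simply flips the negative half-cycle positive (this is an idealized model with no diode drops):

```python
import math

V_RMS = 120.0
V_PEAK = V_RMS * math.sqrt(2)    # ~170V peak for US mains

def mains(t, f=60.0):
    """Instantaneous mains voltage (volts) at time t (seconds)."""
    return V_PEAK * math.sin(2 * math.pi * f * t)

def full_bridge(v):
    """Idealized full bridge rectifier: negative half-cycle flipped positive."""
    return abs(v)

# Sample roughly one 60Hz cycle: the rectified waveform never goes negative.
samples = [mains(t / 1000.0) for t in range(17)]
rectified = [full_bridge(v) for v in samples]
print(round(V_PEAK, 1))          # 169.7
print(min(rectified) >= 0.0)     # True
```

The bumpy rectified output then gets smoothed and regulated down to the DC voltages the electronics actually use.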
I assume the inputs are separate plugs.
Usually the 120V input is stepped down to 12V with an SMPS (switch-mode power supply).
The 12V input then just bypasses the SMPS. (Or runs a completely separate power supply system that converts the 12V input to whatever voltages it wants to work with internally.)
That, plus a couple of diodes or power-path switching, keeps power from flowing backwards between the two supplies.
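The "couple of diodes" arrangement is often called a diode-OR. A rough sketch of the idea (idealized, assuming a ~0.6V silicon diode drop): each source feeds the internal rail through its own diode, the higher source wins, and the other diode is reverse-biased so current can't flow back into the unused supply:

```python
# Idealized diode-OR power path. Assumes a fixed 0.6V drop per diode.
DIODE_DROP = 0.6

def diode_or(v_smps, v_battery):
    """Each source feeds the rail through its own diode; the higher source
    (minus its diode drop) wins, and the other diode is reverse-biased,
    blocking any backwards current into the unused supply."""
    candidates = [v - DIODE_DROP for v in (v_smps, v_battery)]
    return max(candidates + [0.0])   # rail can't go negative here

print(diode_or(12.0, 0.0))    # 11.4 -> running from the mains SMPS
print(diode_or(0.0, 12.6))    # 12.0 -> running from the battery
print(diode_or(12.0, 12.6))   # 12.0 -> battery wins; the SMPS diode blocks
```

Real designs often swap the diodes for MOSFET-based "ideal diode" circuits to avoid wasting that 0.6V, but the selection logic is the same.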