When power systems were first introduced, there was no standard, and eventually a “war of the frequencies” broke out. Many companies were trying to find the best compromise frequency to run both motors (mainly for industrial purposes) and lights, while still being able to transmit the power over long distances. Higher frequencies lose their energy more quickly over long lines, but at too low a frequency, lights visibly flicker.
Most found that somewhere between 50 Hz and 60 Hz was best for running motors and lighting systems while still allowing long transmission runs with the materials available at the time.
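To put a number on the "high frequencies are harder to transmit" trade-off, here is a rough sketch in Python. The line length and inductance per kilometre are made-up illustrative numbers, not real grid figures; the point is just that a line's inductive reactance grows in direct proportion to frequency (X_L = 2πfL), so a higher-frequency system drops more voltage across the same wires and is harder to push power through over long distances.

```python
import math

# Rough sketch: a transmission line's inductive reactance scales linearly
# with frequency (X_L = 2 * pi * f * L). Reactance isn't a resistive loss,
# but it increases voltage drop and limits how far power can be sent.
# The line length and inductance per km below are made-up illustrative numbers.

LINE_LENGTH_KM = 100.0
INDUCTANCE_PER_KM_H = 1e-3   # ~1 mH per km, a plausible order of magnitude

for freq_hz in (25, 50, 60, 133):  # frequencies that saw real use early on
    reactance = 2 * math.pi * freq_hz * INDUCTANCE_PER_KM_H * LINE_LENGTH_KM
    print(f"{freq_hz:>3} Hz -> line reactance of roughly {reactance:5.1f} ohms")
```

With these numbers, going from 25 Hz to 133 Hz more than quintuples the line's reactance, which is part of why the very high frequencies used in some early lighting systems were a poor fit for long-distance transmission.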
From there, the system chosen by the largest company would basically become the standard, because tool and appliance manufacturers would build equipment for whichever system the most people already used.
In the US, Westinghouse was the winner, and its system was 60 Hz. In Europe, it was AEG, which had chosen 50 Hz.
As for the voltages, early Edison light bulbs required 55 volts of DC power to operate, so two bulbs in series needed 110 V. It was for that reason that Westinghouse built his AC systems around that same voltage.
However, higher voltages are easier (and cheaper) to transmit. A lot of testing was done on increasing the voltages of US grid systems, and 220 V turned out to be about ideal. But so many people already had equipment on the 110 V system that it was unfeasible to convert everything to 220 V, so we kept it at 110 (in practice, we send 220 to homes and then split it into two 110 V legs to run our stuff, with some exceptions like ovens, welders, and such). Luckily for Europe, the US had learned these lessons before European grid systems really took off, so they basically just started theirs in the 220 V range.
And lastly, to make up for system losses along the way, the nominal voltages were bumped up to 120 V and 240 V instead of sending out 110 V and 220 V.
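To make the "higher voltage is cheaper to transmit" point concrete, here is a back-of-the-envelope sketch in Python. The 1,500 W load and 0.5 Ω of wire resistance are made-up illustrative numbers, not real grid figures; the point is just that for the same delivered power, doubling the voltage halves the current, and the resistive loss in the wires goes as the square of the current.

```python
# Rough sketch of why higher voltage means cheaper transmission:
# for the same delivered power, doubling the voltage halves the current,
# and resistive line loss scales with current squared (P_loss = I^2 * R).
# The load power and line resistance below are made-up illustrative numbers.

LOAD_POWER_W = 1500.0      # hypothetical appliance load
LINE_RESISTANCE_OHM = 0.5  # hypothetical total wire resistance

for voltage in (110.0, 220.0):
    current = LOAD_POWER_W / voltage           # I = P / V
    loss = current ** 2 * LINE_RESISTANCE_OHM  # P_loss = I^2 * R
    print(f"{voltage:>5.0f} V -> {current:5.2f} A, about {loss:5.1f} W lost in the wires")

# With these numbers: 110 V draws ~13.6 A and wastes ~93 W,
# while 220 V draws ~6.8 A and wastes ~23 W.
# Doubling the voltage cuts the resistive loss to a quarter.
```

The same logic explains the small bump from 110 V to 120 V: sending out a slightly higher nominal voltage leaves margin for the voltage that inevitably drops along the wires before it reaches your outlets.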