Household electricity is delivered at different voltages and frequency in different countries around the world. How did these multiple standards arise? Is one system better than others?

In: Engineering

4 Answers

Anonymous 0 Comments

[Utility frequency](https://en.wikipedia.org/wiki/Utility_frequency) was mostly determined by generator design. Lower frequencies made the power plant cheaper to build; higher frequencies were cheaper for transmission and lighting. So when a power plant was built, the operators would work out who would use most of its power and pick the frequency that optimized cost. Back then 16 2/3 Hz and 25 Hz were the most common, though 40, 50, 60, and 133 1/3 Hz were all in use. Utilities moved toward higher frequencies as smaller loads grew, since things like street lighting worked a lot better at higher frequencies. Eventually they wanted to interconnect their grids, so they had to standardize on something. In the US they picked 60 Hz because Westinghouse Electric thought it was close enough to 50 Hz for machinery while lights worked better on it. 50 Hz won out most everywhere else because a 16 2/3 Hz or 25 Hz power plant can be upgraded to 50 Hz just by changing the number of poles on the generator; it doesn't require a complete redesign or a change in how the plant is operated.
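
To make the pole-changing point concrete, here's a minimal sketch of the standard synchronous-machine relation f = poles × rpm / 120 (the function name and the example pole counts and speed are illustrative, not from the answer above):

```python
# Electrical frequency of a synchronous generator: f = poles * rpm / 120.
def output_frequency_hz(poles: int, rpm: float) -> float:
    return poles * rpm / 120

# A 4-pole machine spinning at 750 rpm produces 25 Hz...
print(output_frequency_hz(4, 750))   # 25.0
# ...and rewound with 8 poles at the same 750 rpm it produces 50 Hz,
# which is why a 25 Hz plant could be upgraded without changing how it ran.
print(output_frequency_hz(8, 750))   # 50.0
```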

The 120V vs 240V split is a different story. 240V needs less copper for the same power, so you'd expect everyone to pick it; places like the US went with 120V to reduce shock risk instead, reasoning that for household wiring power isn't what sets the wire size anyway. Making sure the cable is physically strong enough not to break is what sets it, so in practice 240V and 120V wires end up the same size and you don't really save anything.
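
For what it's worth, the copper argument falls straight out of I = P / V. A quick sketch with a made-up 1500 W appliance as the example load:

```python
# Current drawn for a given power: I = P / V.
def current_amps(power_w: float, voltage_v: float) -> float:
    return power_w / voltage_v

# A hypothetical 1500 W appliance:
print(current_amps(1500, 120))  # 12.5 A at 120 V
print(current_amps(1500, 240))  # 6.25 A at 240 V
# Half the current would allow a thinner conductor for the same power,
# unless, as noted above, mechanical strength already sets the wire size.
```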

Anonymous 0 Comments

The 50 and 60 Hz thing was just something that happened.
Different power plants produced power at different frequencies, but as local grids were merged into larger grids, they all had to be brought to the same standard.

For power transmission, higher voltage is better, since it lets you transmit more power with less current, i.e. fewer electrons flowing through the cable. More current means more power lost to resistance before the electricity even reaches the plug, and that lost power heats the cable and can potentially lead to an electrical fire. That is why you can safely draw twice as much power from a 220 or 230 V outlet as from a 110 V outlet using the same cable diameter.
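
As a worked example of that loss: resistive heating in the cable goes as I²R, so doubling the voltage quarters the loss for the same delivered power. The 2300 W load and 0.5 Ω cable resistance below are made-up illustrative numbers:

```python
# Power lost as heat in the cable: P_loss = I^2 * R, where I = P / V.
def line_loss_w(power_w: float, voltage_v: float, cable_ohm: float) -> float:
    current = power_w / voltage_v
    return current ** 2 * cable_ohm

for v in (115, 230):
    print(f"{v} V: {line_loss_w(2300, v, 0.5):.0f} W lost in the cable")
# 115 V -> 200 W lost; 230 V -> 50 W lost.
# Doubling the voltage halves the current and quarters the resistive loss.
```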

However, higher voltage also means that cables and plugs have to be insulated better, since a higher voltage can arc across a longer gap. Higher-voltage outlets are usually designed so that the contacts sit recessed in the wall, like [this British](https://upload.wikimedia.org/wikipedia/commons/thumb/3/3b/Moulded_and_rewireable_BS_1363_plugs.jpg/220px-Moulded_and_rewireable_BS_1363_plugs.jpg) or [the German model](https://upload.wikimedia.org/wikipedia/commons/e/e4/Schuko_standard.jpg). That way the plug can't make contact while bare metal is still sticking out of the wall.

Edit: It’s worth noting that regardless of what kind of power you get at the wall, the power transmission itself happens at much higher voltage. There’s also a great series on [YouTube](https://www.youtube.com/watch?v=v1BMWczn7JM&list=PLTZM4MrZKfW-ftqKGSbO-DwDiOGqNmq53) explaining how the grid works if you want to learn more.

Anonymous 0 Comments

Somewhat haphazardly is the answer to how. Early on it was really whatever was thought best at the time, and developing countries tended to adopt the system of their colonizers or of whoever was hired to come in and advise on the electrical infrastructure.

As to which is better, that's subjective: better at what, and how? For example, Europe uses a higher voltage than North America, but the original reason was to allow smaller wires and thus less copper. North America has plenty of copper, so those countries saw no problem with thicker wires.

Source: Electrical Trainee

Anonymous 0 Comments

A little bit of technical history:
Before there were modern safety systems such as circuit breakers and RCDs, the voltage level was always a compromise.
A lower voltage means higher current, and therefore a higher risk of burning your house down.
A higher voltage reduces the risk of fire, but increases the risk of electrocution.
Bear in mind that 230 V wiring can still start a fire and 110 V can still give you a deadly shock.
But as I already mentioned, modern technology has reduced all these risks to the point where you just keep using whatever you're used to.
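
A rough sketch of the electrocution side of that compromise, using Ohm's law I = V / R. The ~1000 Ω body resistance is a commonly cited ballpark assumption for a worst-case (wet-skin) scenario, not a safety figure:

```python
# Very rough shock-current estimate: I = V / R_body.
BODY_RESISTANCE_OHM = 1000  # illustrative ballpark, not a safety figure

def shock_current_ma(voltage_v: float) -> float:
    return voltage_v / BODY_RESISTANCE_OHM * 1000  # convert A to mA

for v in (110, 230):
    print(f"{v} V -> roughly {shock_current_ma(v):.0f} mA")
# 110 V -> ~110 mA, 230 V -> ~230 mA; currents of a few tens of mA
# across the chest can already be lethal, so both systems can kill.
```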