Why does high voltage result in less power loss than high current?


Physics 2 student here. I’ve just learned that:

P=I*V

Using Ohm's law, V=I*R, you can get the equations:

P=I^2*R and P=V^2/R

In these equations, power is proportional to I^2 and V^2 respectively. This seems to show that power scales with the square of both the voltage difference and the current. Yet it's stated that high voltage results in less power loss than high current. How can this be true when power scales by the same amount with a change in either?
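For example, plugging arbitrary numbers into those formulas for a single resistor (nothing from a textbook, just a sanity check of my algebra):

```python
# Arbitrary numbers, just to sanity-check the three formulas against each other
R = 4.0        # resistance in ohms
I = 3.0        # current in amps
V = I * R      # Ohm's law: 12 V across the resistor

print(I * V)        # P = I*V     -> 36.0 W
print(I**2 * R)     # P = I^2*R   -> 36.0 W
print(V**2 / R)     # P = V^2/R   -> 36.0 W

# Doubling the current (which doubles the voltage across this resistor)
# quadruples the power either way:
print((2 * I)**2 * R)    # 144.0 W
print((2 * V)**2 / R)    # 144.0 W
```

As far as I can tell, voltage and current look completely symmetric here, which is why the power-line claim confuses me.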

The context of this is power lines and why they use high voltage.

Please ELI5. The explanations in my textbooks are inadequate, and I can't wrap my head around it.

In: Physics

15 Answers

Anonymous 0 Comments

Impedance isn't a fixed value. If you try to pump more and more current through a wire, you start to get eddy currents and increased loss (which means more power consumption/heat, and heat increases the impedance, so you get a very bad feedback loop).

The way you'd counter that is to increase the diameter of the wire, but a wire capable of handling 1,000 times the current is going to be over an order of magnitude larger in diameter. Imagine trying to suspend a set of three "wires" that are basically 20-inch solid aluminum pipes from those pylons. Completely infeasible.
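One very rough way to see that scaling (this assumes the limit is cooling and that a bare conductor sheds heat roughly in proportion to its surface area; real ampacity ratings come from empirical tables, so treat the exponent as a ballpark, not a law):

```python
# Rough back-of-the-envelope: how much fatter must a conductor get to carry
# N times the current, if the limit is how fast it can dump heat?
#
#   heat generated per metre ~ I^2 * R_per_metre, and R_per_metre ~ 1/d^2
#   heat shed per metre      ~ d   (circumference, i.e. surface area per metre)
#
# Balancing the two at the thermal limit: I^2 / d^2 ~ d, so d ~ I^(2/3).

def rough_diameter_scale(current_scale: float) -> float:
    """Approximate diameter multiplier needed for a given current multiplier."""
    return current_scale ** (2 / 3)

print(rough_diameter_scale(1000))   # ~100: 1000x the current wants ~100x the diameter
```

Under those (generous) assumptions, 1,000 times the current wants roughly 100 times the diameter, i.e. about 10,000 times as much metal per span.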

So we just increase the voltage.
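To put made-up numbers on it (chosen just to show the scale, not taken from any real line): say the line has to deliver 100 MW to customers and its conductors have a total resistance of 2 Ω. The current follows from I = P/V, and the loss in the line from P = I^2*R:

```python
# Made-up example: deliver the same power at two transmission voltages and
# compare how much of it gets burned off as heat in the line itself.
P_delivered = 100e6   # 100 MW the customers actually need
R_line = 2.0          # total line resistance in ohms (assumed value)

for V in (20e3, 400e3):              # 20 kV vs 400 kV transmission voltage
    I = P_delivered / V              # current needed to deliver that power
    loss = I**2 * R_line             # power lost as heat in the conductors
    print(f"{V/1e3:.0f} kV: I = {I:,.0f} A, "
          f"line loss = {loss/1e6:.3f} MW ({100 * loss / P_delivered:.3f}%)")

# 20 kV:  I = 5,000 A, line loss = 50.000 MW (50.000%)
# 400 kV: I = 250 A,   line loss = 0.125 MW  (0.125%)
```

Same delivered power, 20 times the voltage, one twentieth the current, and 1/400th of the line loss.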
