Why does high voltage result in less power loss than high current?


Physics 2 student here. I’ve just learned that:

P=I*V

Using Ohm's law, V = I*R, you can get the equations:

P = I^2 * R and P = V^2 / R

In these equations, power is proportional to I^2 and V^2 respectively. This seems to show that power scales with the square of both the voltage difference and the current. Yet it's stated that high voltage results in less power loss than high current. How can this be true when power scales by the same amount with a change in either?

The context of this is power lines and why they use high voltage.

Please ELI5. The explanations in my textbooks are inadequate, and I can't wrap my head around it.

In: Physics

15 Answers

Anonymous 0 Comments

I think the misunderstanding lies in what V means in P = V^2/R: it is the voltage difference between the two ends of the wire (the drop across the wire itself), not the transmission voltage compared to neutral. So when you double the transmission voltage, the drop across the wire is not doubled; because the same power now needs only half the current, the drop I*R across the wire is actually halved, and the loss falls to a quarter.
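To put numbers on this, here is a minimal Python sketch. All values are illustrative assumptions (10 MW delivered, 0.5 ohms of wire resistance, 10 kV vs 100 kV transmission), not real grid data. The V plugged into P = V^2/R is the drop across the wire, which shrinks as the transmission voltage rises, so both formulas give the same, much smaller loss at high voltage:

```python
# Illustrative sketch: line loss for a fixed delivered power at two
# transmission voltages. All numbers are assumptions, not real grid data.

P_delivered = 10e6  # watts the load receives (assumed)
R_wire = 0.5        # ohms of resistance in the transmission wire (assumed)

for V_transmit in (10e3, 100e3):      # 10 kV vs 100 kV (assumed)
    I = P_delivered / V_transmit      # current for the same power: I = P/V
    V_drop = I * R_wire               # voltage lost across the wire itself
    loss_i2r = I**2 * R_wire          # P = I^2 * R
    loss_v2r = V_drop**2 / R_wire     # P = V^2 / R, with V = drop, not 10/100 kV
    assert abs(loss_i2r - loss_v2r) < 1e-6  # both formulas agree
    print(f"{V_transmit/1e3:5.0f} kV: I = {I:7.1f} A, "
          f"drop = {V_drop:7.1f} V, loss = {loss_i2r/1e3:8.1f} kW")
```

Running this shows 500 kW lost at 10 kV but only 5 kW at 100 kV: raising the voltage tenfold cuts the current tenfold, so the I^2*R loss drops a hundredfold. The voltage that got ten times bigger is the transmission voltage; the V in P = V^2/R (the drop across the wire) got ten times smaller.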
