Why does high voltage result in less power loss than high current?


Physics 2 student here. I’ve just learned that:

P = I*V

Using Ohm's law, V = I*R, you can get the equations:

P = I^2 * R and P = V^2 / R

In these equations, power is proportional to I^2 and V^2 respectively. This seems to show that power scales with the square of both the voltage difference and the current. Yet it's stated that high voltage results in less power loss than high current. How can this be true when power scales the same way with a change in either?
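(A quick numeric sanity check, in Python with made-up values, showing that the three formulas do agree as long as V, I, and R all refer to the same resistor; the apparent contradiction only shows up when V means something different, which the answers get into.)

```python
# Sanity check with made-up values for a single resistor.
V = 12.0   # volts across the resistor
R = 4.0    # ohms
I = V / R  # Ohm's law: 3 A

print(I * V)      # P = I*V      -> 36.0 W
print(I**2 * R)   # P = I^2 * R  -> 36.0 W
print(V**2 / R)   # P = V^2 / R  -> 36.0 W
```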

The context of this is power lines and why they use high voltage.

Please ELI5. The explanations in my textbooks are inadequate, and I can't wrap my head around it.

In: Physics

15 Answers

Anonymous 0 Comments

Power loss on a transmission line is given by P = I^2 * R, where R is the resistance of the wire.

So minimizing the current (I) minimizes the power loss. For a fixed amount of power delivered (P = I*V), a higher transmission voltage means less current is needed to carry that power, which is why power lines run at high voltage.

The reason for this is that wires get hot when you run current through them. The more current you run through them, the hotter they get, and that heat is energy lost from the electrical system. (Resistance also tends to increase with temperature, making high currents a double whammy.)
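Here's a rough numeric sketch (Python, made-up numbers for the wire resistance and delivered power) of what that looks like: deliver the same power over the same wire at two different transmission voltages and compare the I^2 * R loss.

```python
# Made-up numbers: same delivered power, same wire, two transmission voltages.
P_delivered = 1_000_000.0  # 1 MW to be delivered to the load
R_line = 5.0               # assumed wire resistance in ohms

for V_line in (10_000.0, 100_000.0):   # 10 kV vs 100 kV transmission
    I = P_delivered / V_line           # current needed: P = I*V
    loss = I**2 * R_line               # heat dissipated in the wire
    print(f"{V_line/1000:.0f} kV: I = {I:.0f} A, line loss = {loss/1000:.1f} kW")

# 10 kV:  I = 100 A, line loss = 50.0 kW
# 100 kV: I = 10 A,  line loss = 0.5 kW  (10x the voltage -> 1/100th the loss)
```

If you wanted to use P = V^2 / R for the line loss instead, the V there would have to be the voltage drop along the wire itself (I*R, which is 500 V in the 10 kV case above), not the 10 kV transmission voltage. Mixing up those two voltages is where the apparent contradiction comes from.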
