Why does high voltage result in less power loss than high current?


Physics 2 student here. I’ve just learned that:

P=I*V

Using Ohm's law, V=I*R, you can derive the equations:

P=I^(2)*R and P=V^(2)/R

In these equations, power is proportional to I^(2) and V^(2) respectively. This seems to show that power is proportional to the square of both voltage difference and current. Yet it's stated that high voltage results in less power loss than high current. How can this be true when power scales by the same amount with a change in either?

The context of this is power lines and why they use high voltage.

Please ELI5. I’ve heard inadequate explanations in my textbooks, I can’t wrap my head around it.

In: Physics

15 Answers

Anonymous 0 Comments

In particular, you have to consider that while the current I is the same everywhere in a series circuit, ***voltage V is relative.*** The P = V^(2)/R formula is based on the voltage drop **across the device.**

In other words, let’s throw out the low-voltage/high-voltage thing entirely. Simple circuit: V = 10 V source. Two 5 Ohm resistors in series. Total resistance is 10 Ohms, so current is 10 V / 10 Ohms = 1 A. Hopefully you agree with this.

The power loss through the first resistor is P = I^(2)R = (1 A)^(2)(5 Ohms) = 5 W. Same for the second. This is a total power of 10 W, which again agrees with what we expect (P = VI = 10 V x 1 A = 10 W).

If we use the voltage form of P = V^(2)/R, we **do not** calculate the first resistor’s power as (10 V)^(2)/(5 Ohms) = 100 / 5 W = 20 W. We have to use the **voltage drop across the resistor**, which is only 5 V, to get (5 V)^(2)/(5 Ohms) = 25 / 5 W = 5 W.
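The series-circuit numbers above can be sketched in a few lines of Python (values taken from the example; nothing here is standard library beyond arithmetic):

```python
# Series circuit from the example: 10 V source, two 5-ohm resistors.
V_source = 10.0    # volts
R1, R2 = 5.0, 5.0  # ohms

# Total resistance in series is the sum, so current is V / R_total.
I = V_source / (R1 + R2)  # 10 V / 10 ohms = 1 A

# Power via I^2 * R, per resistor:
P1 = I**2 * R1  # 5 W
P2 = I**2 * R2  # 5 W

# Power via V^2 / R must use the drop ACROSS each resistor (I * R),
# not the source voltage:
V1 = I * R1          # 5 V drop across the first resistor
P1_alt = V1**2 / R1  # (5 V)^2 / 5 ohms = 5 W, matching I^2 * R

print(I, P1, P2, P1_alt)  # 1.0 5.0 5.0 5.0
```

Note that naively plugging the full 10 V into V^(2)/R would give 20 W for one resistor, exceeding the 10 W the whole circuit dissipates.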

Going back to the low-voltage/high-voltage case: my bet is that you are incorrectly thinking that the V used in the V^(2)/R calculation is the input voltage. As we just saw **it is not.** It is the **voltage drop.** So the answer is that when you go from a lower voltage to a higher voltage with the same power, current necessarily drops (P = VI => V increase, P constant -> I decrease); which means that the voltage drop V = IR decreases, and the voltage on the other side of the line increases.

Voltage drop = voltage this side – voltage that side. Increasing the voltage on that side means that the voltage drop ***decreases***. So yes, V^(2) increases from the fact that the voltages are being increased, but it also **decreases** because of the reduced voltage drop across a resistive load from a lower current, and the net effect is that losses are reduced.
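To see the net effect numerically, here is a small sketch comparing line losses when delivering the same power at two transmission voltages. The 100 kW load and 1-ohm line resistance are made-up illustrative values, not from the original question:

```python
# Same delivered power, same line, two transmission voltages.
P_delivered = 100_000.0  # 100 kW to deliver (assumed)
R_line = 1.0             # line resistance in ohms (assumed)

losses = {}
for V in (1_000.0, 100_000.0):  # 1 kV vs 100 kV
    I = P_delivered / V          # same power -> higher V means lower I
    losses[V] = I**2 * R_line    # loss depends on the current in the line

print(losses)  # {1000.0: 10000.0, 100000.0: 1.0}
```

At 1 kV the line carries 100 A and wastes 10 kW; at 100 kV it carries 1 A and wastes only 1 W. Raising the voltage 100x cut the loss by 100^2, which is exactly the I^(2)R scaling at work.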

Side note: this is a reason to work primarily in terms of current when analyzing these kinds of situations. Voltage analysis gets really tricky quite quickly, because you have to think in terms of the voltage drops.
