Why does high voltage result in less power loss than high current?


Physics 2 student here. I’ve just learned that:

P=I*V

Using Ohm's law, V=I*R, you can get the equations:

P=I^(2)*R and P=V^(2)/R

In these equations, power is proportional to I^2 and V^2 respectively. This seems to show that power is proportional to the square of both the voltage difference and the current. Yet it's stated that high voltage results in less power loss than high current. How can this be true when power scales by the same amount with a change in either?

The context of this is power lines and why they use high voltage.

Please ELI5. I've only found inadequate explanations in my textbooks, and I can't wrap my head around it.




**TLDR:** Think of it as "a distribution system that supplies high voltage and low current to the customer wastes less total power than one that supplies high current and low voltage to the customer." This affects what's leaving your plant, but the customer doesn't care about that; that's your problem. As we'll see in the example below, from your point of view it can even end up being both lower voltage and lower current.

There are two basic reasons why this works. First, your system contains machines that use power to do things we care about, and it contains wires, which are necessary but which consume power because they have non-zero resistance, and which we want to waste as little power in as possible.

Second, voltage is naturally additive/subtractive over the parts of a circuit, not constant throughout. (Real distribution is AC, but we'll stay in DC for simplicity.) If your source supplies 10A with a 1000V difference between its terminals, and there's a wire with resistance 10 ohms between your source and some machine, you have spent V=IR=10*10=100V just getting to the other side of that wire. That voltage is spent: gone, subtracted from what is available to the rest of the circuit.

This leaves 900V to be consumed by the machine before the circuit is completed. The connection might be referred to as a "900V" supply because that's what it can deliver to the machine, but the wire itself only ate 100V (and so only required your power plant to produce 100V more than it would if wires had no resistance).
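If it helps to see that bookkeeping spelled out, here's the same arithmetic as a tiny Python sketch (the 10A / 1000V / 10 ohm numbers are just the made-up values from the paragraph above):

```python
# Made-up DC example from above: one source, one lossy wire, one machine.
source_voltage = 1000.0   # volts between the source's terminals
current = 10.0            # amps flowing around the loop
wire_resistance = 10.0    # ohms for the wire from source to machine

wire_drop = current * wire_resistance          # V = I*R = 100 V spent on the wire
left_for_machine = source_voltage - wire_drop  # 900 V remaining across the machine

print(f"dropped across the wire: {wire_drop:.0f} V")
print(f"left for the machine:    {left_for_machine:.0f} V")
```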

Also note that, in the grand scheme of things, your customer doesn't care whether you're a 1000V or a 10000V source; they care about the state of the circuit at their machine. Again, the rest is your problem.

**Example:** Suppose you’re trying to get power from your power plant to a machine that needs 1000W of power. It’s got it’s own magic nonsense so that it doesn’t care what the voltage or amperage is, it will handle whatever. It just wants to be able to draw 1000W of power. 

We're sticking with DC because it's easier to talk about. Assume the wire to the machine has a resistance of 100 ohms, and the wire back also has a resistance of 100 ohms.
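Both cases below are the same arithmetic with a different choice of current, so here's a small Python helper we can reuse for them (the name `transmission_stats` and its layout are just mine for this answer, a sketch of the bookkeeping, not a standard formula or library):

```python
def transmission_stats(current, load_power, wire_resistance=100.0, n_wires=2):
    """DC bookkeeping for one machine fed through lossy wires.

    current         -- amps you choose to deliver
    load_power      -- watts the machine wants (1000 W in this example)
    wire_resistance -- ohms per wire (100 ohms each here)
    n_wires         -- one wire out, one wire back
    """
    machine_voltage = load_power / current        # P = I*V  ->  V = P/I
    loss_per_wire = current**2 * wire_resistance  # P = I^2 * R
    total_loss = n_wires * loss_per_wire
    drop_per_wire = current * wire_resistance     # V = I*R
    total_voltage = machine_voltage + n_wires * drop_per_wire
    total_power = load_power + total_loss
    return machine_voltage, total_voltage, total_loss, total_power
```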

**Case 1:** Let's say you decide to get this power to the machine by using 10 amps of current. Since P=IV and they need 1000W, you need to supply 100V across their machine.

But you also have to cross the wire to the machine, which has a resistance of 100 ohms. The loss from crossing that wire is P=I^2*R = 10^2 * 100 = 10000W. Then you have to do it again to cross the wire back: another 10000W. So that's a total of 21000W spent to power the 1000W machine.

Your total voltage drop is 100V for their machine, but you also lose voltage as you cross each segment of your wire: V=IR=10*100=1000V per wire. So you need a total voltage drop of 1000+100+1000 = 2100V.

**Case 1 summary:** 10A, 2100V total (100V at machine), 21000W total spent to get 1000W at the machine you care about.
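Plugging Case 1 into the helper sketched above reproduces these numbers:

```python
# Case 1: 10 A chosen, 1000 W machine, two 100-ohm wires.
v_machine, v_total, loss, p_total = transmission_stats(current=10.0, load_power=1000.0)
print(v_machine, v_total, loss, p_total)
# prints: 100.0 2100.0 20000.0 21000.0  (V at machine, V total, W lost, W generated)
```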

**Case 2:** So suppose you decide that you’re actually only gonna give them 1A. That means you need to have 1000V across their machine (P=IV). 

Power lost in the wires: P=I^2*R = 1^2 * 100 = 100W per wire, 200W total because there are two wires.

Total voltage drop: 1000V at their machine, plus V=IR=1*100=100V per wire, for a total of 1200V. Total power spent: 1000W at their machine + 200W lost crossing your wires = 1200W.

**Case 2 summary:** 1A, 1200V total (1000V at machine), 1200W spent to get 1000W at the machine.
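And Case 2 with the same helper:

```python
# Case 2: only 1 A chosen, same 1000 W machine and 100-ohm wires.
v_machine, v_total, loss, p_total = transmission_stats(current=1.0, load_power=1000.0)
print(v_machine, v_total, loss, p_total)
# prints: 1000.0 1200.0 200.0 1200.0  (V at machine, V total, W lost, W generated)
```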

**Conclusion:** Either way, the customer gets his 1000W, so he's good. But in the first case you actually generated 21000W, and in the second only 1200W. Notice that the voltage you spent crossing the wires also dropped, which I think is where your confusion came from: "high voltage" refers to the voltage across the machine, which went up, while the voltage wasted across the wires went down because the current went down.

And of course the 19800W you saved by lowering the current directly translates to less fuel spent at your power plant (or fewer solar panels you needed to build, etc.), so you're saving a lot of money. But it's the customer's voltage that increased as a result of this, not yours.
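If you want the general trend rather than just two points, a quick sweep (same made-up 1000W machine and 100-ohm wires as above) shows the wire loss shrinking with the square of the delivery voltage, because the current you need shrinks linearly:

```python
# Same made-up example: 1000 W machine, two 100-ohm wires, various delivery voltages.
load_power = 1000.0
wire_resistance = 100.0

for machine_voltage in (100.0, 1000.0, 10000.0):
    current = load_power / machine_voltage    # I = P/V
    loss = 2 * current**2 * wire_resistance   # I^2 * R per wire, two wires
    print(f"{machine_voltage:8.0f} V at machine -> {current:6.2f} A -> {loss:10.1f} W lost in wires")
```

Ten times the voltage means one tenth the current and one hundredth the wire loss.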
