Physics 2 student here. I’ve just learned that:
P = I*V
Using Ohm's law, V = I*R, you can get the equations:
P = I^2 * R and P = V^2 / R
In these equations, power is proportional to I^2 and V^2 respectively. This seems to show that power scales with the square of both the voltage difference and the current. Yet it's stated that high voltage results in less power loss than high current. How can this be true when power scales by the same amount with a change in either?
The context of this is power lines and why they use high voltage.
Please ELI5. I’ve heard inadequate explanations in my textbooks, I can’t wrap my head around it.
That "V" in V^2/R is the voltage difference between the ends of the wire, i.e. the drop along it. It is not the rated or nominal supply voltage.
Say you have a 9 ohm load supplied through a 1 ohm wire, with a nominal 10 V supply. The current is 1 A, so 1 V is dropped across the wire and 9 V across the load. The load receives 9^2/9 = 9 W, and 1^2/1 = 1 W is lost as heat in the wire.
Now say you want to deliver the same power at higher efficiency. Increase the load to 899 ohms (the wire stays at 1 ohm) and use a 90 V supply. The current falls to 0.1 A, so only 0.1 V is dropped across the wire and 89.9 V across the load. The load receives 8.99 W and the wire loses just 0.01 W, so essentially the same power is delivered at much higher efficiency.
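Here is a small Python sketch of the same arithmetic (using the 10 V / 90 V numbers above and a hypothetical split_power helper, not any real grid values). It shows that the wire loss follows I^2 * R_wire, so cutting the current by 10x cuts the loss by 100x:

```python
def split_power(v_supply, r_wire, r_load):
    """Return (power delivered to load, power lost in wire) for a simple
    series circuit: supply -> wire resistance -> load resistance."""
    current = v_supply / (r_wire + r_load)  # Ohm's law on the whole loop
    p_load = current**2 * r_load            # P = I^2 * R for the load
    p_wire = current**2 * r_wire            # P = I^2 * R for the wire
    return p_load, p_wire

# Scenario 1: 10 V supply, 1 ohm wire, 9 ohm load   -> I = 1 A
print(split_power(10, 1, 9))    # (9.0, 1.0):   9 W delivered, 1 W lost

# Scenario 2: 90 V supply, 1 ohm wire, 899 ohm load -> I = 0.1 A
print(split_power(90, 1, 899))  # (8.99, 0.01): ~9 W delivered, 0.01 W lost
```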