Physics 2 student here. I’ve just learned that:
P=I*V
Using Ohm's law, V=I*R, you can get the equations:
P=I^2*R and P=V^2/R
In these equations, power is proportional to I^2 and V^2 respectively. This seems to show that power scales with the square of both the voltage difference and the current. Yet it's stated that high voltage results in less power loss than high current. How can this be true when power scales by the same amount with a change in either?
The context of this is power lines and why they use high voltage.
Please ELI5. I’ve heard inadequate explanations in my textbooks, I can’t wrap my head around it.
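For what it's worth, here's a quick numeric check (Python, made-up example values) showing the three formulas really do agree, as long as V, I, and R all refer to the same element:

```python
# Made-up example: a single 2-ohm resistor carrying 3 A.
I = 3.0        # current in amperes (example value)
R = 2.0        # resistance in ohms (example value)
V = I * R      # Ohm's law: voltage across that same resistor (6 V)

print(I * V)       # P = I*V      -> 18.0 W
print(I**2 * R)    # P = I^2 * R  -> 18.0 W
print(V**2 / R)    # P = V^2 / R  -> 18.0 W
```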
The power loss along a conductor is due to resistance. This manifests as a voltage drop along the conductor. So which equation to use depends on what you’re able to measure or control.
V = IR tells us how much voltage is lost along the conducting path. That turns into a power loss because P = I*V, and with V = I*R that gives P = I^2*R.
P = V^2/R gets us the same answer if we've measured the voltage drop and the path resistance.
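Here's a minimal sketch of that equivalence (Python, example numbers only). The important detail is that the V in P=V^2/R is the drop along the conductor itself, not the supply voltage:

```python
# Example values, not from any real line.
R_line = 0.1    # resistance of the conductor, in ohms
I = 100.0       # current flowing through it, in amperes

V_drop = I * R_line          # V = I*R -> 10 V lost along the conductor
print(I**2 * R_line)         # P = I^2 * R -> 1000.0 W lost in the line
print(V_drop**2 / R_line)    # P = V^2 / R -> 1000.0 W, the same answer
```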
Where this gets interesting for your question is comparing the power lost when delivering a fixed amount of power (e.g. 10000 W) over a fixed line resistance (e.g. 0.1 ohm) at different combinations of current and voltage.
0.1 ohm with 100 V and 100 A at the load: the line drops 100 A * 0.1 ohm = 10 V, so you need 110 V at the input, and the line itself dissipates 100^2 * 0.1 = 1000 W.
Bump that up to 1000 V and 10 A at the load: the drop is 10 A * 0.1 ohm = 1 V, so 1001 V at the input, and the line only dissipates 10^2 * 0.1 = 10 W. Same power delivered, one hundredth of the loss.
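Here's that comparison as a short Python sketch (all numbers are just the example values above):

```python
R_line = 0.1          # line resistance in ohms (example value)
delivered = 10_000.0  # power delivered to the load, in watts (example value)

for v_load in (100.0, 1000.0):
    i = delivered / v_load     # current needed to deliver that power
    v_drop = i * R_line        # voltage lost along the line
    p_loss = i**2 * R_line     # power lost in the line
    print(f"{v_load:.0f} V / {i:.0f} A at the load: "
          f"drop {v_drop:.0f} V, input {v_load + v_drop:.0f} V, loss {p_loss:.0f} W")

# 100 V / 100 A at the load: drop 10 V, input 110 V, loss 1000 W
# 1000 V / 10 A at the load: drop 1 V, input 1001 V, loss 10 W
```

The line's resistance only ever acts on the current flowing through it, so delivering the same power at ten times the voltage means one tenth the current and, because the loss goes as I^2*R, one hundredth the loss.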