Physics 2 student here. I’ve just learned that:
P = I*V
Using Ohm's law, V = I*R, you can get the equations:
P = I^2 * R and P = V^2 / R
In these equations, power is proportional to I^2 and V^2 respectively. This seems to show that power is proportional to the square of both the voltage difference and the current. Yet it's stated that high voltage results in less power loss than high current. How can this be true when power scales by the same amount with a change in either?
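Spelling out the substitutions, just so it's clear what I mean:

\[
P = IV,\ V = IR \;\Rightarrow\; P = I(IR) = I^2 R
\qquad\text{and}\qquad
P = \frac{V}{R}\,V = \frac{V^2}{R}
\]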
The context of this is power lines and why they use high voltage.
Please ELI5. The explanations I've found in my textbooks are inadequate, and I can't wrap my head around it.
In: Physics

Latest Answers
Because the power lost in the wires as heat is I^2 * R, higher current means hotter wires and more wasted power. For the same delivered power P = I*V, raising the transmission voltage lets you deliver that power with a smaller current, so the I^2 * R loss in the lines drops. Resistance also increases with temperature, which makes high current even worse.
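As a rough numerical sketch (the delivered power and line resistance below are made-up illustration values, not real grid numbers), delivering the same power at ten times the voltage cuts the current to a tenth and the line loss to a hundredth:

```python
# Rough illustration: deliver the same power over the same line at two
# different transmission voltages and compare the I^2 * R loss.
# All numbers are made-up example values, not real grid figures.

P_delivered = 1_000_000.0   # 1 MW to deliver (assumed)
R_line = 5.0                # total line resistance in ohms (assumed)

for V in (10_000.0, 100_000.0):   # low vs. high transmission voltage
    I = P_delivered / V           # current needed to carry the same power
    loss = I**2 * R_line          # power dissipated in the line itself
    print(f"V = {V:>9,.0f} V -> I = {I:6.1f} A, "
          f"line loss = {loss:>8,.0f} W "
          f"({100 * loss / P_delivered:.2f}% of delivered power)")
```

With these example numbers, 10 kV needs 100 A and wastes 50 kW in the line, while 100 kV needs only 10 A and wastes just 500 W.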
One way to think of it: V = I*R gives you the voltage drop across a resistor R carrying current I. A smaller I means a smaller voltage drop across that resistor, which lowers V^2 / R, the power dissipated in that resistor. Remember that in these equations V is not the voltage the resistor in the circuit sits at, but rather the voltage change across it.
For the overall circuit, P = I*V, where I is the current and V is the source voltage relative to ground. But that is the total power being delivered, which is different from the power consumed just by the resistance of the wires.
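A small sketch of that last point, again with made-up numbers: the V that belongs in the V^2/R loss formula is the drop across the wire resistance (I * R_line), which is tiny compared to the source voltage, and it gives exactly the same loss as I^2 * R:

```python
# Assumed illustration values: the V^2/R that matters for line loss uses the
# voltage DROP across the wire resistance, not the source voltage.

V_source = 100_000.0   # transmission voltage relative to ground (assumed)
R_line = 5.0           # wire resistance in ohms (assumed)
I = 10.0               # current drawn by the load (assumed)

V_drop = I * R_line                    # Ohm's law: drop across the wire only
loss_from_drop = V_drop**2 / R_line    # V^2/R using the drop across the wire
loss_from_current = I**2 * R_line      # same loss computed as I^2 * R

print(f"drop across the wire: {V_drop} V (vs. {V_source} V source)")
print(f"loss via V_drop^2/R:  {loss_from_drop} W")
print(f"loss via I^2*R:       {loss_from_current} W")   # identical value
```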