Physics 2 student here. I’ve just learned that:
P=I*V
Using Ohm's law, V = I*R, you can get the equations:
P = I^2 * R and P = V^2 / R
In these equations, power is proportional to I^2 and V^2 respectively. This seems to show that power scales with the square of both the voltage difference and the current. Yet it's stated that high voltage results in less power loss than high current. How can this be true when power scales the same way with a change in either?
The context of this is power lines and why they use high voltage.
Please ELI5. I've heard inadequate explanations in my textbooks, and I can't wrap my head around it.
The I^2 * R "power" typically shows up as heat loss in a conductor. A big part of why the circuit breakers in your home are sized the way they are (15 A, 20 A) is the heat generated by the current flowing through the resistance of your home wiring.
So! If you can knock the current down and ramp the voltage up (think power grid), you can transmit the same power more efficiently, with less heat loss. The key is that for a fixed power delivered, P = I*V, raising the voltage lowers the current, and the heat wasted in the line itself is I^2 times the line's own resistance, so the loss drops sharply.
Perhaps that is what the book is trying to say.
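To make that concrete, here's a rough back-of-the-envelope sketch in Python. All the numbers (10 kW delivered, a 0.5-ohm line, 240 V vs. 24 kV) are made up for illustration. It fixes the power being delivered and the line's resistance, then compares the I^2 * R heat lost in the line at a low and a high transmission voltage.

```python
# Compare line losses when delivering the same power at two transmission voltages.
# All numbers are illustrative, not real grid values.

P_delivered = 10_000.0   # watts we want to push down the line
R_line = 0.5             # ohms of resistance in the line itself

for V_transmit in (240.0, 24_000.0):
    I = P_delivered / V_transmit      # fixed power: higher voltage -> lower current
    P_lost = I ** 2 * R_line          # heat dissipated in the line's own resistance
    percent = 100 * P_lost / P_delivered
    print(f"{V_transmit:>8.0f} V: current = {I:7.2f} A, "
          f"line loss = {P_lost:8.2f} W ({percent:.4f}% of delivered power)")
```

Running it, the low-voltage case loses roughly 870 W to heat while the high-voltage case loses under a tenth of a watt, even though both deliver the same 10 kW. The V^2 / R form doesn't contradict this: in that formula V is the voltage drop across the resistance you're computing the loss in (here, the small drop along the line), not the voltage the line is held at.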