Physics 2 student here. I’ve just learned that:
P = I*V
Using Ohm's law, V = I*R, you can get the equations:
P = I^2 * R and P = V^2 / R
In these equations, power is proportional to I^2 and V^2 respectively. This seems to show that power scales with the square of both the voltage difference and the current. Yet it's stated that high voltage results in less power loss than high current. How can this be true when power scales by the same amount with a change in either?
The context of this is power lines and why they use high voltage.
Please ELI5. The explanations I've found in my textbooks are inadequate, and I can't wrap my head around it.
In simple terms, conductors are only perfect in theory. Every real conductor has some amount of resistance, so we can treat every conductor as a resistor.
The resistance is resisting the flow of current, not the pressure of voltage.
Since every conductor is actually a resistor, powering a load resistor through a conductor accidentally creates a voltage divider.
Every resistor along the path will dissipate power as heat, and that includes the accidental resistors that we call conductors.
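If it helps to see that accidental voltage divider with numbers, here's a minimal Python sketch. The supply voltage and both resistance values are made-up examples, not real power line figures:

```python
# A supply feeds an intended load through a conductor that has its own
# small "accidental" resistance, forming a series voltage divider.
# All numbers below are invented for illustration.

supply_voltage = 120.0   # volts
line_resistance = 1.0    # ohms: the accidental resistor (the conductor)
load_resistance = 11.0   # ohms: the intended load

# Series circuit: the same current flows through both resistors.
current = supply_voltage / (line_resistance + load_resistance)

# Each resistor drops part of the supply voltage and turns
# P = I^2 * R of it into heat.
line_drop = current * line_resistance
load_drop = current * load_resistance
line_loss = current**2 * line_resistance
load_power = current**2 * load_resistance

print(f"current: {current:.1f} A")
print(f"conductor: drops {line_drop:.1f} V, wastes {line_loss:.1f} W as heat")
print(f"load:      drops {load_drop:.1f} V, receives {load_power:.1f} W")
```

With these numbers, 10 of the 120 volts (and 100 of the 1200 watts) are lost in the conductor before the load ever sees them.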
When the system is changed from a lower-voltage system to a higher-voltage system, the intentional load resistance must be increased to transfer the same amount of power. This change also lowers the current.
Meanwhile, assuming the conductors are the same as before, the accidental resistance stays the same while the intended resistance increases.
This changes the voltage divider ratio, which causes a higher voltage drop across the intended load and a lower voltage drop across the accidental resistance of the conductors.
Now we have a lower voltage drop _and_ a lower current across the conductors, the accidental part of the voltage divider circuit. Using P = I*V, we see that the power lost in the conductors is reduced.
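To put that in rough numbers, here is a quick Python sketch comparing the same delivered power at a low and a high transmission voltage through the same conductor. All values are invented, and the current is approximated simply as I = P/V:

```python
# Deliver the same power through the same conductor at two different
# voltages and compare how much is lost in the conductor itself.
# All numbers are invented for illustration.

power_to_deliver = 10_000.0   # watts the load should receive
line_resistance = 0.5         # ohms of accidental resistance in the conductor

for voltage in (200.0, 20_000.0):
    current = power_to_deliver / voltage        # P = I*V  ->  I = P/V
    line_loss = current**2 * line_resistance    # P = I^2 * R in the conductor
    print(f"{voltage:>7.0f} V -> {current:>5.1f} A, "
          f"lost in the conductor: {line_loss:.3f} W")
```

Raising the voltage by a factor of 100 cuts the current by a factor of 100, and because the conductor loss goes as I^2 * R, that loss drops by a factor of 10,000 (from 1250 W to 0.125 W here).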