Physics 2 student here. I’ve just learned that:
P = I·V
Using Ohm's law, V = I·R, you can derive the equations:
P = I²·R and P = V²/R
In these equations, power is proportional to I² and V² respectively. This seems to show that power scales with the square of both voltage difference and current. Yet it's stated that high voltage results in less power loss than high current. How can this be true when power scales by the same amount with a change in either?
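As a quick numeric sanity check (all values here are made-up examples), the three formulas do agree for a single resistor:

```python
# For one resistor, P = I*V, P = I^2*R, and P = V^2/R all give the same answer.
R = 10.0          # resistance in ohms (assumed example value)
I = 2.0           # current in amps (assumed example value)
V = I * R         # Ohm's law: 20 V across the resistor

p_iv = I * V       # P = I*V
p_i2r = I**2 * R   # P = I^2 * R
p_v2r = V**2 / R   # P = V^2 / R

print(p_iv, p_i2r, p_v2r)  # all three print 40.0 (watts)
```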
The context of this is power lines and why they use high voltage.
Please ELI5. I've heard inadequate explanations in my textbooks, and I can't wrap my head around it.
Two things that I haven't seen anyone mention yet are that 1) we use I²R for power loss because current is what causes things to heat up, not voltage; and 2) across a piece of wire with virtually 0 Ω of resistance, you're not going to find much of a voltage drop to plug into V²/R — the V in that formula is the drop across the wire itself, not the line voltage — but you will find current with which to calculate I²R.
For 1), think about it. Say you have a capacitor in series in a simple DC circuit. If you let it charge over time, eventually the capacitor acts like an open circuit (this is its steady-state response), since it's basically just two metal plates separated by a non-conducting medium. Once it acts like an open circuit, current no longer flows through the circuit, but the voltage across the capacitor is the same as the power source. You'll have, say, 12 V across it, but no charges will be moving anywhere. Is it consuming power if there is no movement of charges? Will anything heat up if there is no current releasing energy? No: it isn't consuming power, and thus it won't be heating up, even though its voltage is 12 V. Only once you turn off the power supply and the capacitor starts discharging, so that current flows through the circuit again, will there be power consumption again. This is also why holding an AAA battery in your hand doesn't cause it to heat up: it's not doing any work, which means it's not using any energy, so there is no energy dissipation and no power is being used.
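A minimal sketch of that steady state (the 12 V is the example value from above): zero current means zero power, no matter the voltage.

```python
# Steady-state capacitor in a DC circuit: full supply voltage across it,
# but no current through it, so no power is dissipated.
V_cap = 12.0   # volts across the charged capacitor (example value)
I = 0.0        # amps; no current once the capacitor is fully charged

P = I * V_cap  # P = I*V
print(P)       # 0.0 watts: voltage alone, without current, does no heating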
For 2), you know that power is measured in watts, which is the same as joules per second. Moving charges carry energy, and the unit of energy is the joule. As charges move through a wire, they interact with the material and transfer some of their energy into it, which is then released as heat. The more charges flowing through, the more heat is released. Current is just the movement of electric charge, so a higher current means more charges in motion, and the more charges in motion, the more energy they transfer to the wire as heat. Resistance is a component's capacity to resist that movement of charge. For something very resistive, like an insulator, almost no current flows at all until the voltage across it is large enough to break the material down, at which point current "punches through" and dumps a lot of energy at once — enough to burn the component. Either way, the heating tracks the current, so a good measure of power lost in a wire is I²R.
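To connect this back to the power-line question, here's a sketch with assumed example numbers: deliver the same power through the same wire at two different line voltages, and compare the I²R loss.

```python
# Why power lines use high voltage (all numbers here are assumed examples).
# Deliver the same power through the same wire; only the line voltage changes.
P_delivered = 1_000_000.0  # watts we want to deliver (1 MW, assumed)
R_wire = 5.0               # wire resistance in ohms (assumed)

def line_loss(V_line):
    """Heat lost in the wire when delivering P_delivered at V_line volts."""
    I = P_delivered / V_line  # current needed: from P = I*V
    return I**2 * R_wire      # power burned in the wire: I^2 * R

low = line_loss(10_000.0)    # 100 A through the wire -> 50,000 W lost
high = line_loss(100_000.0)  #  10 A through the wire ->    500 W lost
print(low, high)             # prints 50000.0 500.0
```

Raising the voltage 10× cuts the current 10×, and since loss goes as I², the wire loss drops 100×. The V² in V²/R never enters, because the voltage drop across the wire itself stays tiny.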