[ELI5] How does voltage divide according to resistance in a circuit?

I’m in electrical school and we’re learning about series circuits. I was randomly thinking about voltage drop across devices and was wondering: how does electricity “know” not to drop the entire voltage to zero at the first device in a series circuit? How does it drop the voltage across each device in proportion to that device’s resistance relative to the total resistance of the circuit, dividing it across every device until it reaches zero?

Anonymous

It’s not that it “knows”; it’s just how the physics works out. Lemme explain.

So let’s say we have a power supply with a dial to select the voltage it supplies, and assume it can provide unlimited current. Let’s set it to 10 volts. If you connect your circuit across the power supply, the voltage drop across your entire circuit will be 10 volts, and the supply will output whatever current is necessary (or rather, the circuit will restrict the current to a certain level: by Ohm’s law, I = V/R). With our theoretical super power supply, this is always true, assuming the circuit has no other power supplies connected. If you halve the voltage to 5 volts, the current through the circuit also drops by half, since the resistance of the circuit is unchanged. And because the same current flows through every device in a series circuit, each device drops I × R volts, so the supply voltage automatically divides in proportion to each device’s share of the total resistance.
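To make that concrete, here’s a minimal Python sketch of the idea. The 10 V / 5 V settings and the resistor values are just made-up numbers for illustration:

```python
# Ohm's law in a simple series loop: the supply fixes the total
# voltage, and the circuit's total resistance fixes the current.
# Resistor values are arbitrary, picked just for illustration.

resistors = [2.0, 3.0, 5.0]       # ohms, wired in series
r_total = sum(resistors)          # series resistances simply add: 10 ohms

for v_supply in (10.0, 5.0):      # halving V halves I, since R is unchanged
    current = v_supply / r_total  # Ohm's law: I = V / R
    print(f"V = {v_supply:.0f} V  ->  I = {current:.1f} A")
```

Running this prints 1.0 A at 10 V and 0.5 A at 5 V: same circuit, half the voltage, half the current.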

Since one end of your circuit is at +10 V (or +5 V) and the other end is at 0 V, there’s no point in the middle of the circuit that can possibly be at 0 V; the potential falls step by step, device by device (again assuming the circuit does not contain any additional power supplies like I mentioned before, and that its only connection to ground is the one it shares with the power supply).
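And here’s the same toy circuit walked node by node, showing the potential falling gradually from +10 V to 0 V, with each resistor taking a share proportional to its resistance:

```python
# Walk the same toy circuit from the + terminal to the - terminal.
# Each resistor drops I * R volts, so the potential falls in
# proportion to resistance and only reaches 0 V at the far end.

resistors = [2.0, 3.0, 5.0]          # ohms (same illustrative values)
v_supply = 10.0
current = v_supply / sum(resistors)  # 1.0 A, the same everywhere in the loop

potential = v_supply
for r in resistors:
    drop = current * r               # equivalently: v_supply * r / sum(resistors)
    potential -= drop
    print(f"{r:.0f} ohm drops {drop:.0f} V  ->  node is now at {potential:.0f} V")
```

Notice the last node lands exactly at 0 V: the individual drops have to add up to the full supply voltage (that’s Kirchhoff’s voltage law), which is why the voltage divides itself across every device instead of vanishing at the first one.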
