Why are volts used as bits in a computer and not charges? How do we read volts, write volts, etc.? Wouldn’t using charges be much simpler?


In: Technology

5 Answers

Anonymous

Measuring voltage is effectively measuring charge.

You can count excess electrons in one of two ways. You can run them through an ammeter and integrate the current, which tells you how much total charge flowed, or you can use a voltmeter to measure the voltage that the charge creates.

Measuring the charge by measuring the current moves it away from where it was stored and destroys the original data, so that approach isn't great.

Measuring voltage does not disturb the original charge, so you can read it multiple times and still get the same value. That is exactly what you want.
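
Here's a tiny sketch of the difference (Python, made-up numbers, purely illustrative):

    # Toy numbers only -- not real hardware values.
    capacitance = 30e-15       # a ~30 femtofarad storage capacitor
    stored_charge = 45e-15     # 45 femtocoulombs parked on it

    # Option 1: dump the charge through an ammeter and integrate the current.
    # You learn the charge, but it has now left the capacitor -- data gone.
    charge_seen_by_ammeter = stored_charge
    charge_left_on_capacitor = 0.0

    # Option 2: measure the voltage the charge creates (V = Q / C).
    # The charge stays put, so you can read it again and get the same answer.
    voltage = stored_charge / capacitance
    print(f"{voltage:.2f} V")   # prints 1.50 V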

When you’re dealing with something like RAM cells, measuring voltage pretty directly measures charge; the two differ only by a scale factor, the capacitance (Q = C·V). We measure the voltage to see if it’s above a certain threshold, and if it is, we feed the full high-level voltage to the rest of the circuitry.
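
A minimal sketch of that threshold-and-restore idea, with made-up voltage levels:

    V_HIGH = 1.0        # the full "logic 1" level handed downstream
    V_THRESHOLD = 0.5   # sense threshold (illustrative value)

    def read_bit(cell_voltage):
        # Compare the (possibly slightly leaked) cell voltage against the
        # threshold and output a clean full-swing level either way.
        return V_HIGH if cell_voltage >= V_THRESHOLD else 0.0

    print(read_bit(0.8))   # 1.0 -- a partly decayed '1' is restored to full level
    print(read_bit(0.1))   # 0.0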

For MOSFET-based logic circuitry, you don’t actually care about charge at all; it’s completely voltage based. Did you apply enough voltage to turn on the transistor? Yes? Great, it’s going to turn on the downstream ones. You could look at gate charge, but gate charge varies with the size of the transistor, while the threshold voltage is fixed and is what really determines the transistor’s behavior.
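
To make that concrete, you can model a CMOS gate as nothing but voltage comparisons (toy values and hypothetical helper functions, not a real transistor model):

    VDD = 1.0   # supply voltage
    VTH = 0.5   # assumed switching threshold, roughly VDD / 2

    def inverter(v_in):
        # The output only depends on whether the input voltage clears the
        # threshold. How much gate charge it took to get there depends on
        # the transistor's size, but the logical result does not.
        return 0.0 if v_in >= VTH else VDD

    def nand(v_a, v_b):
        # Same idea for a two-input NAND: pure voltage decisions.
        return 0.0 if (v_a >= VTH and v_b >= VTH) else VDD

    print(inverter(0.9))    # 0.0
    print(nand(0.9, 0.9))   # 0.0
    print(nand(0.9, 0.2))   # 1.0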

All of our logic circuitry is built around applying and measuring voltage because it’s easy and doesn’t corrupt the original data. You could build a system that worked by measuring charge and current using BJT-based logic, but it would likely have high power consumption. Modern ICs use a CMOS layout precisely because circuit blocks that aren’t switching draw effectively zero power.
