Eli5: How does electricity = data?


We have electronics that can do a multitude of incredible things. I understand that data has to do with the binary system, but I can’t get my head around how electricity can be turned into data… and how that data can then generate images and render graphics, etc. I mean, wut!? Plz hlp, brain cannot compute.


5 Answers

Anonymous

The most important devices in modern computer chips are called MOSFETs, which are a class of transistor (although several other devices are also used). There are many different designs of MOSFET, and depending on how they’re designed they can have different operating characteristics, which is why they’re so popular. In the context of computer chips, the MOSFETs used are designed to operate as electronically controlled switches with very fast on/off switching times.

This means that a small voltage applied to a MOSFET’s gate can be used, via the MOSFET, to turn a much larger current on or off.

The simplest analogy for this might be the lever used to control the hydraulic cylinder on a log splitter. The control lever attaches to an oil valve that sends high-pressure hydraulic oil to one end of the cylinder or the other. This means a small force on the valve lever can control a much larger force at the splitting wedge.

Therefore one switch can be used to turn a dozen, or perhaps even a hundred, other switches on and off, and it can do so extremely quickly: in less than a nanosecond.
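To make that concrete, here’s a minimal sketch in Python (my own idealized model, not a real circuit simulator): treat each MOSFET as a switch whose output depends only on its gate input, and let one signal drive the gates of many others.

```python
# Idealized model: a MOSFET inverter pulls its output to 0 when the
# gate is on, and to 1 when it is off.
def inverter(gate: int) -> int:
    return 0 if gate else 1

# Fan-out: one signal drives the gates of a hundred other switches.
control = 1
outputs = [inverter(control) for _ in range(100)]
print(outputs[:5])  # [0, 0, 0, 0, 0] -- one input flipped all of them
```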

So, using one switch to turn another on and off seems a bit silly and redundant. But it turns out that if you take several hundred switches and wire them together in a particular way, you can start to do some pretty interesting things, like add, subtract, multiply, and divide binary numbers, or shuttle information around through a single wire by turning the current on and off again in a specific order, like Morse code. (A sketch of the adding part follows below.)
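Here’s what “wiring switches together” can look like: a half adder, which adds two one-bit numbers. This is my own illustration, assuming the standard textbook NAND construction (each NAND gate being, in real hardware, a handful of MOSFETs):

```python
def nand(a: int, b: int) -> int:
    # Two switches in series to ground: output is low only if both are on.
    return 0 if (a and b) else 1

def xor(a: int, b: int) -> int:
    # XOR built from four NAND gates, the classic construction.
    n = nand(a, b)
    return nand(nand(a, n), nand(b, n))

def half_adder(a: int, b: int):
    # Adds two one-bit numbers: returns (sum bit, carry bit).
    return xor(a, b), 1 - nand(a, b)

for a in (0, 1):
    for b in (0, 1):
        s, c = half_adder(a, b)
        print(f"{a} + {b} -> carry {c}, sum {s}")
```

Chain enough of these together and you can add numbers of any width, which is essentially what the arithmetic unit in a CPU does.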

What does any of this have to do with binary, logic, or ones and zeroes?

Well, using ones and zeroes is just a convenient writing notation. A computer program doesn’t really care what a zero “is.” Computer programs only care about whether the current in a particular wire, pin, or circuit is ON or OFF. This is similar to Morse code, but not quite the same: in Morse code, brief pauses are understood to mean spaces between letters, while a computer would interpret a pause simply as OFF.
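As a sketch of that idea (assuming, for illustration, one bit per clock tick on a single wire):

```python
# Each character becomes 8 bits; the wire is driven ON for a 1 and
# OFF for a 0, one bit per clock tick. An OFF tick is a real 0, not
# a pause as it would be in Morse code.
message = "Hi"
stream = [int(b) for ch in message for b in format(ord(ch), "08b")]
print(stream[:8])  # [0, 1, 0, 0, 1, 0, 0, 0] -- the bits of 'H'
```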

In binary, by convention, we use a 1 to represent “on” and a 0 to represent “off.”

Now, one of the neat properties of binary is that you can use a series of 1s and 0s to represent the more familiar decimal numbers you’re used to, such as 14 (= 00001110 in 8-bit format). Binary-to-decimal conversion is a topic I won’t go into here.
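(If you do want to poke at the conversion yourself, a couple of lines of Python will do it; this is just an aside, not part of the hardware story:)

```python
n = 14
bits = format(n, "08b")   # decimal -> 8-bit binary string: '00001110'
back = int(bits, 2)       # binary string -> decimal: 14
print(bits, back)
```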

So let’s say you have one of those flat parallel-signal ribbon cables with 8 wires; let’s call them A through H.

If you wanted to input the number “14” (the 00001110 from above) into your electronic device, you could do something like:

Wire A = 0V
Wire B = 0V
Wire C = 0V
Wire D = 0V
Wire E = -3V
Wire F = -3V
Wire G = -3V
Wire H = 0V

Although, technically speaking, you’d also need a ninth wire as a common return or ground, so you have:

Wire J = +0.2V.

This sits at a slightly higher voltage so that current flows out through the common line rather than back through wires A-D, for example, which might otherwise be interpreted as those wires also being “ON.”
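Putting the whole wire example into code, as a sketch using the made-up voltage levels above (0V for a 0, -3V for a 1, and wire J as the common return):

```python
V_OFF, V_ON = 0.0, -3.0          # made-up levels from the example above
WIRES = "ABCDEFGH"               # wire A carries the most significant bit

def drive(n: int) -> dict:
    """Map an 8-bit number onto the voltage each wire should carry."""
    bits = format(n, "08b")
    levels = {w: (V_ON if b == "1" else V_OFF) for w, b in zip(WIRES, bits)}
    levels["J"] = 0.2            # common return sits slightly higher
    return levels

print(drive(14))
# {'A': 0.0, 'B': 0.0, 'C': 0.0, 'D': 0.0,
#  'E': -3.0, 'F': -3.0, 'G': -3.0, 'H': 0.0, 'J': 0.2}
```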
