How do transistors translate into computations?

I do not understand how transistors are used to compute, especially when immense numbers of them are involved. How does anything keep track of the states of all of them?

5 Answers

Anonymous 0 Comments

Transistors in various configurations can mimic the behavior of what we call logical “gates.”

Logical gates take logical values (True or False) as input and output a logical value (True or False). Examples include the NOT gate (which takes True or False and outputs the opposite value; True → False, False → True), the AND gate (which outputs True only if both of its two inputs are True, and False otherwise), and the OR gate (which outputs False only if both of its two inputs are False, and True otherwise). There are others, but these are the basic ones.

Transistors can mimic this behavior by using High and Low voltages to represent True and False, respectively. You can use a transistor to act as a NOT gate by outputting a High voltage when the input is Low and vice versa. You can wire transistors in series or in parallel to mimic AND and OR gates, and all the other gates not mentioned.
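A toy software model of these gates (Python booleans stand in for High and Low voltages; the function names are mine, and this is a sketch of the idea, not real electronics):

```python
# Toy model: Python booleans stand in for High (True) and Low (False) voltages.

def not_gate(a: bool) -> bool:
    # A single transistor can invert: output High when the input is Low.
    return not a

def and_gate(a: bool, b: bool) -> bool:
    # Two transistors in series: the output is High only if both conduct.
    return a and b

def or_gate(a: bool, b: bool) -> bool:
    # Two transistors in parallel: the output is High if either conducts.
    return a or b

# Truth table for AND:
for a in (False, True):
    for b in (False, True):
        print(a, b, and_gate(a, b))
```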

So we can implement logical gates electronically by interpreting High and Low voltages as True and False values. We can also extend this to math.

If you look at single bit binary addition we have the following possibilities:

0 + 0 = 00

1 + 0 = 01

0 + 1 = 01

1 + 1 = 10

If we interpret 1s and 0s as True and False, respectively, we can interpret this as a logical operation. The left-most digit of the output behaves as an AND gate (it outputs 1/True only if both inputs are 1/True), and the right-most digit of the output behaves as what is known as an XOR (exclusive or) gate (it outputs 1/True if either, but *not* both, of the inputs is 1/True).

Since we can represent binary addition via logical gates, and since we can represent logical gates using transistors we can now perform binary addition with transistors.
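The single-bit addition table above can be sketched in Python as what is usually called a "half adder" (bitwise `^` plays the role of the XOR gate and `&` the AND gate; the function name is mine):

```python
def half_adder(a: int, b: int):
    """Add two single bits; return (carry, sum)."""
    s = a ^ b        # XOR gives the right-most digit of the result
    carry = a & b    # AND gives the left-most digit of the result
    return carry, s

for a in (0, 1):
    for b in (0, 1):
        carry, s = half_adder(a, b)
        print(f"{a} + {b} = {carry}{s}")
```

Running this reproduces the four-line addition table exactly.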

And this extends to all sorts of mathematical operations (inverting, adding, subtracting, multiplication, division, comparison), logical operations (NOT, AND, OR, etc. on multiple arrays of bits), and other kinds of manipulations (shifting bits to the left or right).

Anonymous 0 Comments

This is a very large, vague, question. The short, somewhat vague, version of the part I know is:

Transistors can act as switches. These transistors can be combined to make gates, such as AND, OR, XOR, as well as flip-flops (registers, to remember things). The gates may use anywhere from 2 transistors to dozens of transistors for a register.

We can combine these gates and registers to do things like add 2 binary numbers, subtract them, multiply them, compare them, and so on. By combining blocks like these, we can build a CPU (central processing unit, a microprocessor) that can perform computations. (BEGIN: fetch an instruction; decode the instruction; it’s an add, so perform the add; increment the instruction counter; goto BEGIN)
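The fetch/decode/execute loop in parentheses can be sketched as a toy machine in Python. The instruction names, the (name, argument) format, and the single accumulator register are all invented for illustration:

```python
# A hypothetical toy machine; instruction set and format are made up.
# Each instruction is a (name, argument) pair; "acc" is an accumulator.

def run(program):
    acc = 0   # accumulator register
    pc = 0    # program counter (the "instruction counter")
    while pc < len(program):            # BEGIN:
        op, arg = program[pc]           # fetch an instruction
        if op == "ADD":                 # decode it...
            acc += arg                  # ...it's an add, so perform the add
        elif op == "SUB":
            acc -= arg
        elif op == "HALT":
            break
        pc += 1                         # increment the instruction counter
    return acc                          # (goto BEGIN happens via the loop)

print(run([("ADD", 5), ("ADD", 3), ("SUB", 2), ("HALT", 0)]))  # prints 6
```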

There are levels past this: multiple copies of the CPU (multi-core) that operate somewhat independently; multi-threading, where each core may be running (kind of) 2 different programs at once; cache memory (more transistors) added to speed up memory access time; and so on. Each of these adds more transistors to make the machine compute faster.

Anonymous 0 Comments

Circuits built from transistors implement something called Boolean algebra to make decisions: they monitor input signals and generate different output signals in response. Check out [this link that explains how transistors work](https://www.explainthatstuff.com/howtransistorswork.html#calculate).

Anonymous 0 Comments

So, all binary computation relies on each wire being in one of two states: basically “on” and “off”. The transistor itself is a very simple unit. It has two inputs and one output. If input B (the “gate”) is off, the transistor conducts: the state of input A proceeds to the output. If input B is on, the transistor blocks: the output is always off.

Turns out, using this we can start building blocks for binary logic. For example, we can make the “NOT” block that gives us “off” state when the input is on and “on” when the input is off. Just take one transistor, connect your input to the gate and make input A always on.

Using the “NOT” element we can make the “AND” element with two input signals (output is on when both inputs are on): one of the inputs is the “input A” of a transistor and the gate is controlled by “NOT second input”. In a similar way we can build the “OR” element (output is on when either of the inputs is on).

So, using only a few transistors each, we can make all operations of a binary logic. Now we need to turn it into computations. Well, it turns out that if we write numbers in binary form, we can calculate their sum using these operations on their digits in a certain way. So, with a handful of transistors, we can add two numbers together: build an electric circuit where numbers in their binary forms are sent to input wires (one wire per digit) and their sum appears on output wires. With some more effort we can subtract, multiply or divide.
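A sketch of that adding circuit in Python: each digit position is handled by a "full adder" built from the XOR, AND, and OR operations described above, with the carry rippling from one digit to the next. The helper names are invented:

```python
# Invented helper names; bitwise ^, &, | play the roles of XOR, AND, OR gates.

def full_adder(a: int, b: int, carry_in: int):
    s = a ^ b ^ carry_in
    carry_out = (a & b) | (carry_in & (a ^ b))
    return s, carry_out

def add_bits(x, y):
    """Add two equal-length lists of bits, least significant bit first."""
    carry, result = 0, []
    for a, b in zip(x, y):
        s, carry = full_adder(a, b, carry)
        result.append(s)
    result.append(carry)
    return result

# 3 (binary 011) + 5 (binary 101), written least significant bit first:
print(add_bits([1, 1, 0], [1, 0, 1]))  # [0, 0, 0, 1], i.e. binary 1000 = 8
```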

Now we need some control over what we’re doing, so that we can program our processor. We build a circuit for each operation and then, together with the input numbers, we input a special number called the operation code. It is passed through a special circuit that does “if the code is 001, send the input numbers to circuit 1; if the code is 010, send them to circuit 2; etc.” Some of these circuits add numbers together; some activate operations such as “store this number in memory at this address” or “access the memory cell with this address and output its contents”.
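That decoding circuit can be sketched in Python as a lookup from operation code to circuit. The codes match the ones in the example above, but the specific circuits and names are made up for illustration:

```python
# The operation codes and circuits here are made up for illustration.

CIRCUITS = {
    "001": lambda a, b: a + b,   # the adder circuit
    "010": lambda a, b: a - b,   # the subtractor circuit
}

def decode(code, a, b):
    # The decoder routes the input numbers to the selected circuit.
    return CIRCUITS[code](a, b)

print(decode("001", 4, 3))  # prints 7
```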

Done, we have a processor. A program is basically just a sequence of codes and arguments (that can be stored in the memory along with the data).

Anonymous 0 Comments

Basically we use a technique called abstraction that lets us use layers of complexity. Just like you don’t need to know the detailed thermodynamics of how an engine works to drive a car, you don’t need to keep all the details of a computer system in your head when designing such a system. The best book to read on this topic is *Code: The Hidden Language of Computer Hardware and Software* by Charles Petzold.