ELI5: Computers can calculate based on instructions. But how do you teach computers what it means to add something, multiply, divide, or perform any other operation?


Edit: Most of the answers here are wonderful and spot on.

For those who interpreted it differently due to my incorrect and brief phrasing: by ‘teaching’ I meant how does the computer know what it has to do when we want it to perform arithmetic operations (upon seeing the operators)?

And how does it do it? Like how does it ‘add’ stuff the same way humans do and give results which make sense to us mathematically? What exactly is going on inside?

Thanks for all the helpful explanations on programming, switches, circuits, logic gates, and the links!


43 Answers

Anonymous 0 Comments

Computers can only add.

Subtraction is adding a positive to a negative.

Multiplication is adding the same number multiple times.

Division is adding a positive to a negative multiple times (repeated subtraction) and counting how many times you can do it.
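To make that concrete, here is a minimal Python sketch of the same idea, on a made-up 8-bit machine (the function names and the 8-bit width are just illustrative assumptions): subtraction, multiplication, and division are all built out of nothing but addition.

```python
BITS = 8                      # toy 8-bit machine for illustration
MASK = (1 << BITS) - 1        # 0xFF: keeps every result within 8 bits

def neg(x):
    """Two's-complement negation: flip the bits, add 1."""
    return (~x + 1) & MASK

def sub(a, b):
    """Subtraction = adding a positive to a negative."""
    return (a + neg(b)) & MASK

def mul(a, b):
    """Multiplication = adding a to a running total, b times."""
    total = 0
    for _ in range(b):
        total = (total + a) & MASK
    return total

def div(a, b):
    """Division = repeatedly adding a negative (subtracting) and counting."""
    count = 0
    while a >= b:
        a = sub(a, b)
        count += 1
    return count              # quotient; whatever is left in a is the remainder

print(sub(9, 4))   # 5
print(mul(6, 7))   # 42
print(div(42, 5))  # 8
```

Real CPUs use faster circuits than these loops (dedicated multipliers, shift-and-add, and so on), but the principle is the same: everything reduces to addition on bit patterns.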

Computers use bit patterns called “words” to represent operations. When the computer fetches a “word”, that word contains all the info needed to tell it how to manipulate the pattern.

So for example a simple 32-bit instruction set with fixed-length operation codes has words that are 32 bits long. The first 4 bits might tell it to add, sub, etc. Then the next bits specify which registers and values to use. The bit patterns are just switches, and the computer blindly does what the pattern tells it to do.
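As a rough sketch in Python, here is how decoding such a fixed-length word might look. The field layout and opcode table below are invented for illustration and don't match any real CPU's instruction set:

```python
# Made-up 32-bit instruction layout (not a real ISA):
#   bits 31-28: opcode          bits 27-23: destination register
#   bits 22-18: source reg 1    bits 17-13: source reg 2
OPCODES = {0b0000: "ADD", 0b0001: "SUB", 0b0010: "MUL", 0b0011: "DIV"}

def decode(word):
    """Pull the opcode and register fields out of a 32-bit instruction word."""
    opcode = (word >> 28) & 0xF
    rd     = (word >> 23) & 0x1F
    rs1    = (word >> 18) & 0x1F
    rs2    = (word >> 13) & 0x1F
    return OPCODES[opcode], rd, rs1, rs2

# Build an ADD instruction: opcode 0, destination r3, sources r1 and r2
word = (0b0000 << 28) | (3 << 23) | (1 << 18) | (2 << 13)
print(decode(word))   # ('ADD', 3, 1, 2)
```

In hardware there is no lookup table, of course: the opcode bits directly switch on the adder (or other circuit) that does the work, which is what “the bit patterns are just switches” means.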
